Meta's Fact-Checking Shift Explored

Aaron Wolpoff [0:13 - 1:21]: All right, here's how this works. In each episode, we pick a company we all know that has something going on right now. Then we put ourselves in charge and see if we can fix it. You'll be hearing from Melissa on operations, Peter on finance, Chino on people and culture, and me on marketing. My name's Aaron. As always, a quick disclaimer: we are going into this somewhat cold, and nothing we say should be construed as legal advice, financial advice, or anything that would get us in trouble. These are our views and opinions. We're here to ask the kinds of questions everyone's thinking, have an engaging conversation, and maybe come to some conclusions that we feel are worth exploring by the end. If we fixed it, you're welcome. All trademarks, IP and brand elements discussed are property of their respective owners. Welcome back to We Fixed It. You're Welcome. We are your fixers. You have Melissa, Chino and myself. My name's Aaron. We're without our fourth today, Peter, who's our finance expert, but we're working to get him back in here soon. Today we've got a really big topic, and we're all chomping at the bit to get going. I know that we're all fearless fixers, but this might be our most daunting one so far. But hey, it's not like us to shy away. So Chino, what are we talking about today?

Chino Nnadi [1:22 - 1:44]: Yeah, in today's episode, we'll be discussing how Meta recently announced a series of sweeping policy changes on its platforms, essentially getting rid of their independent fact-checking organizations. And so our question really centers around: what is Meta's corporate responsibility in this? It's heavy. It's a heavy one.

Aaron Wolpoff [1:44 - 2:08]: Let that hang there for a second. And it's not just Meta. I mean, let's focus, you know, that's breaking news. Every moment of every day is changing. Right. But it's not just Meta; there are others in similar situations that have either taken similar approaches or are maybe at the point of questioning their role in, you know, the social discourse. Right.

Chino Nnadi [2:08 - 2:18]: Yeah. And I guess to set it up too, just to give our listeners some context, which I'm sure you have: people know Meta. I'm sure people have experienced X, formerly known as Twitter.

Melissa Eaton [2:18 - 2:19]: Twitter.

Chino Nnadi [2:19 - 3:21]: And so what they are doing is removing the fact-checking organizations that they had. So they had a global third-party vendor that would fact-check things. You know, many people who worked there said it's a lot like whack-a-mole: you can stop some bad communication, but that doesn't mean you stop all of it. Right. And so what Meta has decided to do is take a similar approach to X, which is introducing community notes. This heavily relies on user-sourced fact verification versus that third-party global vendor. And so, you know, again, going back to the question of what the corporate responsibility is for Meta, where, you know, something like this can become very highly politicized. You know, it's tricky, but I'm excited to dive in. But that was just the background there.

Melissa Eaton [3:21 - 6:06]: Yeah. And I think that one of the things we want our listeners to be aware of is that we know this is very triggering for a lot of people, because it definitely has to do with the political climate here in the United States. There's content and policy shifting, there are connections to the incoming president, the outgoing president too. And as we see how policies are changing and shifting as the times change, I think we need to be cognizant of that. But I do think what we want to primarily focus on is what Chino has so graciously and eloquently communicated, which is the idea of what fact checking is all about, and how you account for it when it's gone. I did see online in some forums that the ten different fact-checking third-party vendors that Meta uses and outsources to are, to Chino's point, pretty global; they're from all over the world. So I think that's one of the interesting things: the fact checking is coming not just internally, from one United States perspective, but from all different types of places: scientific organizations, political organizations, technology organizations, those kinds of things. And they did not know they were going to be let go until this was announced. One of them, and I don't want to get their name wrong, but I think it was factcheck.org, said they even signed a contract just two weeks ago. So they were a little shocked by that. I know Mark has communicated internally at Meta, and this has also been made public, that the internal fact-checking team is moving to Texas.
And that has some interesting implications: moving out of California, where Meta's headquarters are, into Texas. I used to live in Texas, so I know this: they don't have state income taxes, so there are a lot of tax advantages for corporations. So from an operational perspective, you could say, oh, that's a great business decision. But it is definitely something for us to think about.

Chino Nnadi [6:06 - 7:21]: Yeah. And I think, you know, kind of piggybacking on what you're sharing too, Melissa. Again, the fact that they're moving from something global to, you know, a really small team in Texas, we have to acknowledge that that is cutting down the perspective that you have. Right. Someone in Texas will not have the same opinion as someone who is in Indonesia or in Europe. So, you know, when we look at that content moderation and this now very small team, we need to remember that there's going to be an introduction of bias by the pure nature of the fact that they're no longer global and they don't have that perspective. And so from a culture perspective, and from a talent training perspective, I do think there needs to be some training globally, just in terms of sensitivities and being aware of other global perspectives. Knowing that this small team is the one that's going to be managing this, they need to broaden their horizons so that, in doing their content moderation, they can hopefully do it in a way that is less biased.

Melissa Eaton [7:21 - 9:52]: Well, and I feel like the reduction of content moderation by third parties is going to put, like you said, so much pressure on this internal team, and it will degrade the user experience and platform safety significantly. They're going to be just trying to put out the biggest fires. Right. And to your point, are they escalation people? What kind of training have they had? Can they actually shut down an account? Because then what are you doing? You're actually doing the same thing you're saying you don't want to do. Right. So I do think, to your point, there's this whole question of organization design. Meta is a huge, huge corporation, so I'm assuming they've got people on it from a change management perspective, trying to figure that out. I know it was communicated internally that there were not going to be job losses besides the third parties. So there are savings from the third parties. But are they going to ramp up that team in Texas? Hopefully they will. But with those community-driven reporting tools, somebody's got to monitor those. Right. Those are the forums where you get some crazy things, and then those go viral, and then all of a sudden everybody believes that you should be doing these things, or that this is happening in the world when it's not really happening. We have AI, we have machine learning systems, but as we found out with Microsoft, they're only as good as the information they're getting. And so if they're getting misinformation and they treat it as truth and fact, what are they going to flag? And again, I don't want to be political. I don't want to say that my truth is the only truth. Right. And I think that's the thing that is interesting about these community-driven forums to keep people honest: everybody has their own perspective, so it'll be interesting.
And Aaron, I'd love to hear what you think Meta should do. I mean, I think Meta is coming across as a little bit like a puppet of the incoming government. What do you think they should do to help stand their brand up?

Aaron Wolpoff [9:53 - 11:55]: Well, you know, Melissa, it's interesting, because I wonder if the reason for bringing on an international third-party consortium was to shift the responsibility. At the time, I believe the impetus was to say: we're neutral, we're taking a neutrality stance, we care about what goes out there. Right. And so to make sure that there's some accountability, but maybe not our accountability, we're going to bring on these other entities to take that role. And I think of it like this: if you have a potluck at your home, right, and you invite others to bring dishes, it's your house. What responsibility do you have? Well, you might want to make sure there aren't too many side dishes. If something looks a little off, you might hide it on a back table or put it back in the kitchen. But that's contained to your home. If you're running a buffet restaurant, and your business model is everyone come in and bring a dish and we're going to serve it, what's your responsibility there? Right? That's stickier, and I think that's what they found themselves caught in: we're monetizing your content, we're putting it back out there, and from a company brand perspective, we've got to express how we feel about what goes out there and what we're doing. But maybe, by appointing the community-driven initiatives, they're keeping up with their stance of neutrality from the past, saying: look, apart from that small independent or internal putting-out-fires team, when it really, really gets bad, we're still not going to be the moderators, we're not going to take control of this. We're going to stay neutral. Community, you deal with it. And I wonder how that's going to play out.

Chino Nnadi [11:55 - 14:11]: Yeah, it's interesting, because I wonder: are there any cases where community notes have been helpful? And I think of X, where I've seen certain posts where it's been great when it says, you know what, this was actually not factual; here's the link to the article on this. Which is helpful. And again, if you go back to the thinking that monitoring content coming from everyone and anywhere is hard to do, it's whack-a-mole. Even though you have a large team, there are still going to be so many more users than this small subset of a team, whether that's a global third party or an in-house content management team. And so, you know, looking at it from another side: would community notes be helpful in terms of gauging and safeguarding some of that information at a faster pace than having to wait for this one internal team to look at it? And a great example of this would be Wikipedia. Right? It's not perfect. You know, you really shouldn't be citing Wikipedia every time, and it's best to just use it as a baseline, but it does hold a lot of relevant, great historical information and research that you can take and look into further. And there are times where pages get changed, but then they have a really quick moderator who will correct a fact if that's needed. And so will Meta become more attuned to a Wikipedia model, where their internal in-house team is catching those changes quickly, or using AI to fact-check some of the things that are changed, potentially? So I know that a lot of people are scared or worried about this. There is potential for misinformation to be spread. But at the same time, as people have been trying to fight and combat that, will those community notes actually be helpful in putting out more of those fires in a faster way, similar to what Wikipedia does?

Aaron Wolpoff [14:12 - 15:00]: Well, the way I'm hearing it, and as I understand it, the internal team is not intended as a one-to-one replacement for a global third-party consortium of independent entities. It's kind of a last-ditch effort, like, you've taken it way, way, way too far, okay, maybe we're going to step in, right? It's that last kick out the door. So the responsibility really falls to the community. But is the community meant to be the source of truth, or point-counterpoint? Is that the right thing to do when it comes to this? This is not Reddit, where you're asking which wine to pair with salmon. This is where people go to get their news and shape their worldviews. Is it fair for it to fall to the community?

Melissa Eaton [15:01 - 18:20]: Well, I think that's a really good point, Aaron, because one of the things I was going to ask you all to think about, and what your opinions were, is that in today's world there is such a large group of individuals for whom this is their source of news, really, to be honest. They're not watching the BBC, they're not watching CNN, they're not watching MSNBC or any of these outlets. What they're doing is scrolling on their phones: they're scrolling TikTok, they're scrolling Facebook, they're scrolling X, all those things. And when they see something on Threads or X or wherever, that's what's telling them that this is happening. So, you know, we're entering this stage where the social media platforms, which used to be about drawing your community, your connections, your friends, all those kinds of things, are now where you're getting sourced information about the LA wildfires or whatever it might be that's going on. And is that the right place? It may not be the right place, but that's the place. So what kinds of safeguards are really going to help that situation? And I do think, Chino, your point about a community being able to quickly address something is really a great thing, because I've seen it oftentimes, let's say on TikTok, or I'll see it on Instagram or any of those places, where people will say: this is an old video. This did not happen yesterday. So stop saying that this happened yesterday and this person did this, or whatever it was, right? But again, it's interesting to think about these tech companies or social media companies, like Meta, like X: are they news organizations, or are they really tech apps, right?
So when you think about a telephone company, right, the telephone company says, well, we just transmit information; you're using your phone as a means of transmitting information. Right. So are they that, or are they really like a news organization, where we need to fact-check, where we need to be careful about what we're putting out there, because our name is behind it and we want to make sure it's as neutral and as objective as possible? You know, what are we going to do? So I do feel like we're on the verge of this really interesting intersection, where news is shifting from news outlets to this whole social media thing. And even our news outlets today, we all know, are politicized, right? You listen to one news outlet versus another, and it's night and day, and they're talking about the same thing. So how do we monitor that? And I don't want to say control, but it feels like what we're trying to do is somewhat control the narrative too.

Chino Nnadi [18:20 - 20:01]: Yeah. What's interesting, though, going back to the question around corporate responsibility. So first, saying: listen, Meta is a social platform to connect, say what you want, find your community of people who agree with it. Yes, you use it as a news source, but are we really a news platform? Not really. So they kind of have that gray area. But when we go back to the question of what their corporate responsibility is, in a world where people don't have the critical-thinking skills to fact-check themselves, maybe it's Meta's corporate responsibility to ensure that people understand that not everything you read is real, and to provide certain tools for how to fact-check. Because I think that is more powerful than just saying, you know, we have a small team who's going to try to play whack-a-mole when it's impossible to catch everything. I think it would be really smart for them, and selfishly for that team, to take away some of that onus and again bring it back to the community and say: here are resources on how to spot misinformation, how to fact-check yourself, how to think critically about the media and the social media that you are consuming. I think that would be an important tool for them to incorporate moving forward, alongside X and Wikipedia and everything else. Because if you do that, you're kind of checking the whole corporate responsibility box there and then leaving it to the community. If I were Mark Zuckerberg, that's what I would do.

Aaron Wolpoff [20:01 - 21:11]: And I like that you keep bringing up Wikipedia, because a lot of the time that comes down to facts being facts, I guess. If there's a battle listed with the wrong year, you can bet that within a few seconds someone will be on there to correct that kind of information. And that's documented, and you can call facts what you want, but for the purposes of this conversation, I'll say that's a proven historical fact. And then you go to something else, like Google Maps or Waze, where you have the ability to community-source and report traffic accidents, and that's beneficial to users of the app: real-time updates they wouldn't get otherwise. So communities prove themselves capable in other situations, maybe smaller or more specific use cases, but they prove themselves capable of stepping up and taking on that moderation role. Is Meta the right environment for that?

Chino Nnadi [21:11 - 23:23]: I think it could be, if it's set up right. It needs to become the right environment for that if they want to hold any corporate responsibility. Right? There is a corporate responsibility, whether they want to admit it or not, because they have become the news source. So whether or not they want to call themselves a news source, that is what they've become, and there is a responsibility just by that nature, which they have to realize. And I think, again, what is it? You can lead a horse to water, but you can't make it drink. Right. If you can, at the bare minimum, lead your community to recognize what misinformation looks like and how to fact-check on their own, great. What they do with that is what they do with that. And I think in those extreme cases, that's where your internal team comes in and can navigate things. But I want to go back to the idea of free speech, right? It really bleeds into that. You know, I can say whatever I want. I could go off right now and say as many slurs as I want, and with what Meta is doing, they're actually removing some of those policies that safeguarded against that. Would I ever do that? Absolutely not. Because my name is associated with this podcast, and someone can very easily go on my LinkedIn and tarnish my brand. There are real consequences to what you put out there. And I think people sometimes hide behind the screen and forget: yes, free speech is there, you have the right to say what you want to say, but that does not mean there are no consequences. And especially in a world of cancel culture, what you say on these social media posts is live, it's archived, someone can screenshot it. And so just because you can say it doesn't mean that you necessarily should.
And I think that is the other part of what people forget when we talk about, you know, social media and sharing your opinions on a platform that's very public.

Melissa Eaton [23:24 - 25:59]: And I think, you know, this is kind of in your realm, so I'll softball this to both of us to talk about: the impact on marginalized communities and online bullying, and what safeguards exist to protect them. I mean, really, a lack of robust safeguards like this fact checking could widen disparities and erode trust on the platform, with those communities being disproportionately affected by misinformation and hate speech, which leads to online harassment, stigmatization, real-world harm, and stereotypes becoming more and more prevalent. So those kinds of safeguards also need to be prioritized. One of the concerns I had: I love to see communities jump into action, right? Really trying to help someone, whether it be a DEIB type of situation, whether it be immigrants, whatever the issue might be, it's great to see people jump in. But I also know that some of the strongest voices, some of the people I know who have very strong opinions in different ways and really fight for the rights of all, have been so put off by X, Meta, etc., that they've decided to walk away from it. So that leads me to be scared for the folks who don't have people standing up for them. Right. And that doesn't feel like what Zuckerberg is trying to set up in Texas. I don't feel like that's what he's trying to do, definitely not when he's already putting in place policies against the LGBTQ community within Meta. So I don't think that's the point there, but it's just a thought: how do we protect those communities, and how do we uplift and allow that freedom of speech?
Because again, I know people all have their own opinions about all these types of topics, but how can we do it in just a human way and, you know, be thoughtful?

Chino Nnadi [25:59 - 29:01]: I would agree. And I think what's really important is remembering that social media is not a psychologically safe place. It's not. There are trolls everywhere. I can say the sky is blue, and someone will fight me adamantly, drag my name in the mud, threaten to kill or harass me, my family and everyone else, because they don't agree that the sky is blue. And that is an honest fact of social media. It's a sad truth. And as you were mentioning, Melissa, for a lot of people who are advocates like myself for DEIB, there are many DMs that you receive that are not kind, and it's very hard, because you have to almost safeguard yourself and ignore those messages. And you're right, people pull away from that. I've found myself in those situations, not posting as much or not sharing my thoughts. And again, it then skews toward only certain voices being shared. But with as many trolls as there are, there are trolls trolling the trolls. And there's a community that believes very strongly, and you can see it in the comments on a lot of things, right? For every piece of hate speech, or someone saying something really horrible, there are four or five other people pushing back on them. And unfortunately, I think the name of the game for social media is engagement, whether it's good or not. And maybe, as part of this training that this internal group shares, people need reminding that it's not a psychologically safe space. You are putting yourself in a position where people can attack, bully and harass you. Again, there are laws against that. So screenshot them, bring them to the police, do what you need to do within your state or country. Because, as you've seen many times, many quote-unquote Karens who spew a bunch of things in the name of free speech end up losing their jobs or getting arrested for hate crimes because of it. And so, sure, Meta's not doing that, but there's another system behind that.
And again, once you post something, it's there; someone can take that and share that. But I do think it's important for people on both sides to remember: it's not safe. It's not that community; it was never going to be that community. In a system that encourages engagement, you need to remember that and move accordingly. And if that means you need to separate yourself from it, then please do. But if you feel like you have the gusto to fight the trolls, go ahead. I'm sure people have won many an online battle, and in my younger days, I would get on there and fight a little bit more. I don't anymore, because I've realized that no matter what you say, someone's always going to disagree. And so where do you want to spend your time and energy?

Melissa Eaton [29:01 - 31:40]: Well, it's having the moral courage as well. And I love what you said there, because I do believe you're right: there's always another side to everything. You brought something up that, from a business perspective and operationally, when we talk about what Meta could do, what X could do, points to a way of enhancing their algorithmic moderation. And what I mean by that is, you brought up an example, so let's say somebody starts posting that the world is flat, right? And you start to read it. And if you're reading it and you actually engage with that video, or you engage with that post, it is now going to be bumped up to the top of your algorithm, right? So you're now going to get a bunch of posts from people who all believe the world is flat. Okay? Now the question to me would be: could you set up an automated moderator for that algorithm that would also showcase, okay, here are a couple of posts of the opposite opinion? Not trying to change anybody's opinion, but acknowledging that there is opinion on both sides. And seriously, for me, I wouldn't be concerned about being given that content as well. Right. I think you have to make the decision for yourself. But if you can see both sides and leave that decision, Chino, like you said, up to yourself, and know what you know and what you believe in, then that's a helpful situation. And that may be one way for some of these large corporations to really help without adding the scale of actual humans touching every single post, which we know they're not going to be able to do. Because we know these algorithms are set to hit that dopamine center in your head, and that's why it's addictive; that's why you can't get off of it. They know that. Like, I'm on a freaking food TikTok that won't stop.
And it's not been great, because I've been sick, so I'm just watching all these food things and thinking, oh my God, I'm going to try that tomorrow. Right. But they know, and as soon as you start engaging with it, you just get flooded with all of that. So there's got to be a way for them to help by providing, you know, similar yet not exactly the same views.

Chino Nnadi [31:40 - 31:41]: Right.

Melissa Eaton [31:41 - 31:53]: So I think that might actually be one of the ways that, operationally, you could look at it from a business perspective, to show that you're trying to stay semi-neutral.

Chino Nnadi [31:53 - 32:09]: So what I would say on that: in a situation where we're talking about a date for something historic, or people who believe in flat earth, it's easy to show the contrary, because there are facts and science around that. The problem becomes when it's an opinion.

Melissa Eaton [32:10 - 32:10]: Yeah.

Chino Nnadi [32:10 - 33:38]: Right. And so I don't think it's necessarily safe to always show the other side of the coin, because if I'm someone who believes really strongly in LGBTQ rights, but I'm also getting fed things that are saying, oh, actually, this is not it, I don't think that would be great. So what I would say, taking that nugget you were mentioning, Melissa, is that I do think there is some onus to refresh that algorithm when you're super deep in something. You know, I was on hoof talk for a long time. Hoof-cleaning talk. Weird. But, you know, I learned a lot. But it got to a point where I had to literally type in a search to get myself out of it. If the algorithm can automatically bring you back to a more neutral For You page, something not politicized or extreme in one view or the other, that could be really helpful. And again, maybe even when you are in a very deep, dark hole on one theme or another, there's that reminder of how to fact-check. Maybe there's a video that comes up, maybe there's a community note saying: hey, you've been really in this hole. Right. It's important to fact-check what we're reading here. Right. We need to remember this is social media and not the news. Maybe that's as far as it goes. Right?

Melissa Eaton [33:38 - 35:19]: Yeah. And you know what's interesting, too: I have two friends who work at Meta, and recently one of them told me that even though she works there, and obviously has to go on Instagram and all those things all the time for work, she said: I set a limit on my screen time, because it is addictive, and they do it on purpose. And if you don't, you will be, like you said, on CleanTok, thinking you've got to clean your stove with the limes and the baking soda and vinegar. I've seen that. Yeah. And so you end up in a place where, all of a sudden, four hours of your life have gone away. It is amazing, because those algorithms get you addicted. And so again, it's not that the fact checking isn't important. It's so important. But to your point, it's fact checking on things you can fact-check versus fact checking on opinions. And for a lot of people, being part of a community just feels great. They may live a very isolated life, they may be working from home, maybe Covid really isolated them, or whatever it might be, and this might be their only way to be out in community. And it's just an interesting thought that we're in this age where so much is out there that you can find anyone to agree with you on anything.

Aaron Wolpoff [35:19 - 36:12]: Yeah. And is this move designed to appeal to those who say: okay, good, now I'll get a counterbalance, I'll get other perspectives without the interference, that's a good thing? Or is it designed to appeal to those who want confirmation bias: oh, good, now there's more of us, we're louder, we have a platform, and it's somehow being tacitly endorsed; now we can say what we really want to say? We'll have to wait and see what plays out. But is there also going to be a mass exodus from these platforms? I don't know if it's ever been a safe space, like you said, but it's for everyone. Are we going to see more and more specialty platforms and communities of people who just think and act the same way and try to keep the trolls out the best they can?

Chino Nnadi [36:12 - 37:43]: Yeah, you're seeing that already with X, right? People are moving from X, formerly known as Twitter. I never refreshed it, you know, I never redownloaded it. It's still Twitter on my phone. But you've seen Gabrielle Union and a number of other celebrities who've moved to Bluesky with that in mind, Aaron, where they're saying, I don't really want to be a part of this anymore. This has become a really negative space. It was never great. It wasn't like I can share my address and my phone number. No one's doing that. We're all aware and Internet savvy enough to know not to put our personal information out there. We understand the possibility of getting doxxed. So there's that. But I think the next level, the teaching and training around screen time, critical thinking, doing your own fact checking, will be the corporate responsibility that Meta will need to play. Same thing as when you were scrolling on TikTok. After a while, I think after the first year, it would say, hey, you've been on a little too long. And sure, you can ignore it and keep swiping, but it was there as a reminder, and it helped me a little bit during COVID to stop. When I hit that, I was like, okay, this was a little much. So maybe that's all the corporate responsibility that it needs to be. And as users, knowing that it's not going to be as safe and it can be quite toxic, and that's kind of what you're signing up for.

Aaron Wolpoff [37:44 - 37:54]: So if we're the ones running Meta, did we make the right call? And in a year, are we going to be walking it back? Are we going to be changing our policies? What's the best case outcome of this?

Melissa Eaton [37:54 - 39:05]: Well, since I'm not Mark, I would not have chosen this route, but I understand he was under a lot of pressure to do this. And I think that we did not fix it, in my opinion, because I do still feel like there needs to be some fact checking and moderation. I would like to see, though, what he has in store and how Meta evolves over time. I think that's the one thing, the evolution of Meta. It's just so different from when he dropped out of Harvard and started it, right? It's just not the same thing as connecting with friends, that kind of thing. So as this has become Meta, this huge thing, I would say that there's a lot to be considered. I think we raised some great topics, and actually we could have gone into every single one of those in a lot more detail. And I would have loved to have Peter's opinion from a financial perspective, because I do think this was a business decision on Mark's part as the new Trump presidency begins in a couple weeks.

Chino Nnadi [39:06 - 40:10]: Do I agree with the decision? No. I just think it's a completely unsafe space for people. You've just opened the floodgates for people to be more racist, more prejudiced, and I think people need to remember that going into this. Do I think he needed to kiss the ring? Absolutely. There's a reason why this was announced when it was, and I think people need to remember that. It's going to be an interesting few years. In a few years, if there's another flip to the Democratic Party, will these be rolled back? Will DEI programs come back into place? Who knows? I can't tell the future, but this is very clearly a sign of the times. And so from a business perspective, I get why he did it. Personally, well, you should have a little bit more integrity than that if you're going to stand for something and make all those statements around DEI like you did a few years ago, and now just say, you know, to hell with all that. No, but you needed to kiss the ring.

Aaron Wolpoff [40:10 - 41:00]: So, yeah, I think it's opening the floodgates, and I think it's just going to turn into a dumping ground where people get outshouted and pushed off. And I do think we're going to see a rise in these smaller, community-driven initiatives where it's maybe more homogenous, not as much pushback or interesting discourse, but it does have that feeling of a community where I belong. I think we're going to see more and more of the specialized platforms spring up as a result, but we'll see what happens. As always, thank you for joining us. We are at wefixeditpod.com and on socials at @wefixeditpod, even the bad ones. And we'll see you out there. This podcast is produced by Straightforward Media Group.

Melissa Eaton [41:00 - 41:04]: All rights reserved. If you'd like to learn more about how a podcast can help your company.

Aaron Wolpoff [41:04 - 41:07]: establish authority and generate leads, please email.

Melissa Eaton [41:07 - 41:17]: us at eric@straightforwardmg.com or go to straightforwardmg.com for more information.
