Behind the Bastards

There’s a reason the History Channel has produced hundreds of documentaries about Hitler but only a few about Dwight D. Eisenhower. Bad guys (and gals) are eternally fascinating. Behind the Bastards dives in past the Cliffs Notes of the worst humans in history and exposes the bizarre realities of their lives. Listeners will learn about the young adult novels that helped Hitler form his monstrous ideology, the founder of Blackwater’s insane quest to build his own Air Force, the bizarre lives of the sons and daughters of dictators and Saddam Hussein’s side career as a trashy romance novelist.

Part Two: Let's Look at the Facebook Papers

Fri, 19 Nov 2021 20:59

Robert is joined again by Jamie Loftus to continue to discuss the Facebook Papers.


Copyright © 2022 iHeartPodcasts

Let's do it. Bad guys. Yes. All right. Well, let's have that be what starts the podcast, what we just said. Let's start the podcast. Start the podcast. Let's start the podcast. Well, I'm Robert Evans. Yep. I'm Sophie Lichterman. I never introduced myself. I'm Jamie. Who are you? It's Jamie Loftus. Yes. Anything more we need to say? Are we done with the episode? Anderson's here. No, I think that. Yeah. Yeah. Anderson is here. Sure. Well, you know what's happening in the world. No. Facebook, yeah, is happening to the world, and it's unfortunate. It's not great, Jamie. It's not great, Sophie. Not a fan of the Facebook. We left off having gone through some of the Facebook Papers, particularly employees attacking their bosses after Jan 6th, when it became clear that the company they were working for was completely morally indefensible. They already knew. I wouldn't have called it attacking either. I would call it, you know, I mean, there's the quote there, the guy who's like, history won't judge us kindly. The guy who's like, when we didn't ban Trump in 2015, that's what caused the Capitol riot. I mean, facts are facts. Is that really attacking if you're just, like, stating facts? Well, I think, yeah, I think stating facts can be an attack. Yeah. Whoa. Yeah. OK. Put it on a T-shirt. I mean, for people like this, you know? Yeah, I think stating facts can be an attack, and we ended Part 1 by sharing some of the blistering criticisms Facebook employees leveled against, you know, management and the service itself. So as we start Part 2, it's only proper that we cover how Facebook responded to all of this internal criticism. As I stated last episode, Facebook is in the midst of a years-long drought of capable engineers and other technical employees. They are having a lot of trouble hiring all of the people they need for all of the things they're trying to do.
So for a lot of these employees, when they say things that are deeply critical, Facebook can't just dismiss their concerns outright, because, like, if they were to do that, these people would get angry, and Facebook needs them, right? Facebook's not in the strongest position when it comes to people who are good engineers. They have to walk a little bit of a tightrope. However, if they were to actually do anything about the actual meat of the concerns, it would reduce profitability and in some cases destroy Facebook as it currently exists. So they're not going to do anything, which has meant that they've had to get kind of creative with how they respond. So Mark and his fellow bosses pivoted. We're calling him Mark now? Yeah, old sucky suck. So when this all comes out and people are like, boy, it sure seems like all of your employees know that they're working for the ******* Death Star, Zuckerberg and his, like, mouthpieces made a statement that all of these damning critiques from people inside the company were actually evidence of the very open culture inside Facebook, which encouraged workers to share their opinions with management. That's exactly what a company spokesperson told The Atlantic when they asked about the "history will not judge us kindly" comments. The fact that they're saying we'll be damned by historians means that we really have a healthy office culture. Hashtag Death Star Proud. Yeah. Death Star Proud, everybody. Yeah, yeah. It's like they're trying to remove the stigma of working for the devil, right? I mean, come on, the devil I would be proud to work for, because he's done some cool stuff. Like, have you ever been to Vegas? Nice town. I've been to Vegas. I saw the Backstreet Boys in Vegas right before two of them were revealed to be in QAnon, so I really caught the end of that locomotive. Oh, wow.
I did not realize that a sizable percentage of the Backstreet Boys had gotten into QAnon. That makes total sense for the Backstreet Boys. They're from Florida. They're ultimately five men from Florida. So what can you do? As the author of that Atlantic article noted, this stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints: that many of Facebook's employees believe their company operates without a moral compass. All over America, people used Facebook to organize convoys to DC and to fill the buses they rented for their trips. And this was indeed done in groups like the Lebanon Maine Truth Seekers, where Kyle Fitzsimmons posted the following, quote: This election was stolen and we were being slow-walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington DC for the Electoral College vote count on January 6th, 2021? Yeah, and Kyle recently pled not guilty to eight federal charges, including assault on a police officer. Mark Zuckerberg would argue that Facebook didn't play a significant role in organizing January 6th and couldn't have played a significant role in radicalizing this guy and many other people. But the reality is that part of what led Kyle Fitzsimmons to go assault people on January 6th was the fact that he had been radicalized by a social network that for years made the conscious choice to amplify angry content and encourage anger, because it kept people on the site more, right? All of the anger that boiled up on January 6th came from a number of places, but one of those places was social media, because social media profited, and specifically Facebook knowingly profited, from making people angry.
That was the business, and of course it blew up in the real world. I have a question, just out of your own experience and observation, which is, if you're doing a side-by-side case study of how Facebook responded to events like this versus how, like, YouTube slash Google responded to radicalization, are there significant differences? Did anyone do better or different? Twitter has probably done better than most of them, YouTube too. And again, I'm not saying Twitter's done well or that YouTube has done well, but they've both done a bit better than Facebook, particularly with coronavirus disinformation. And, not really YouTube as much, but Twitter has definitely been the most responsible of the social networks around this stuff. It did seem like for a while there the various networks were kind of duking it out to see who could do the absolute worst and damage the most lives, and it seems like Facebook won that. I would say Facebook. And again, Twitter chose to do a lot of the same toxic things Facebook did. So did YouTube, and they did it all for profit. A number of the things we've criticized Facebook for, you can critique YouTube and Twitter for. I would argue Twitter certainly has done more, and more effectively, than Facebook. Not enough that they're not being irresponsible, because I would argue that Twitter has actually been extremely irresponsible, and knowingly so. But I think, in my analysis, Facebook has been the worst. Although I haven't gotten to study TikTok as much yet, so we'll see. You've got to pivot out of podcasting and into TikTok dances. Yeah, I mean, it's not the dances that concern me on TikTok. It's the minute-long conspiracy theory
videos that have convinced a number of people that the Kardashians are Armenian witches and had something to do with the deaths at the Astroworld thing. My concern there is the dances that go over those conspiracy videos, which really marry the worst of both worlds. Yeah, I'm sure, because I have seen dancing on TikTok. I have seen conspiracy videos that involve dancing. Incredible. And skin care routines. Have you ever seen a conspiracy video where someone's also doing their skin care routine? Because that is a thriving genre. Yeah, I'm sure it is. Well, none of this is unique to one platform; that is just a thing that goes on many platforms, kind of all of them. The companies are willfully bad at stopping radicalization, because making people angry and frightened is good for all of their bottom lines. So they all knowingly participate in this. I think Facebook has been the least responsible about it, but that shouldn't be taken as praise of anybody. Saying Twitter has done the best is saying, like, well, we were all drunk driving, but John could actually walk most of a straight line before vomiting, so he was the least irresponsible of us who drunk drove that night. Just to put it in terms that I understand: it sounds like Twitter is the Backstreet Boy that's like, look, I don't believe in QAnon, but I see their points. That's kind of the vibe I'm getting. Fair enough. So when deciding which posts should show up more often in the feeds of other users, Facebook's algorithm weighs a number of factors. The end goal is always the same: to get the most people to spend the most time interacting with the site. For years, this was done by calculating the different reactions a post got and weighing it based on what responses people had to it.
For years, the reaction that carried the most weight was anger, the little snarling angry-face icon you can click under a post. It was at one point weighted five times more than just a like, a regular like. And again, when I'm saying this was all intentional: they were like, people who respond angrily to posts, that keeps them on the site more. People spend the most time engaging with things that make them angry. So when it comes to determining by which method the algorithm presents people with posts, the posts that are making people angriest are the posts the algorithm will send to the most people. That's a conscious choice. That's a conscious choice. Yeah. It's so funny, I mean, not funny, it's tragic and upsetting, but just how specific the Facebook audience is, that it's like, you would have to be the kind of person who would be like, I'd better react angry to that, I'd better be as specific as possible in my feedback to this post. Which is FarmVille moms, and yeah, it's boomers. It's boomers. And yeah, they just kind of knowingly set off a bomb in a lot of people's ******* brains. They're addicted to telling on themselves for no reason. Why? Why? Anyways, Facebook has something called the Integrity Department, and these are the people with the unenviable task of trying to fight misinformation and radicalization on the platform. That would be so embarrassing. Imagine going on a first date and being like, I work for the Facebook Integrity Department. Like, yeah. Good ******* luck. Yeah, I work for the Air Force. My job is to go door to door and apologize to people after we bomb them. We have gift baskets for the survivors, you know? Like, that's the gig, really. Send edible arrangements to people who have been drone struck. Ohh, Jesus. Awful.
One of my favorite follows on Twitter is Brooke Binkowski, who used to work for Facebook and was one of the people early on who was trying to warn them about disinformation and radicalization on the platform years ago, and who left because it was clear they didn't actually give a ****. And a lot of the Integrity Department people are actually really good people, who are a little bit optimistic and kind of young, and come in like, OK, it's my job to make this huge and important thing a lot safer. And these people get chewed up and spit out very, very quickly. The members of the Integrity team were analyzing the impact of weighing angry content so much, and some of them noted in July 2020 that the extra weight given to the anger reaction was a huge problem. They recommended the company stop weighing it extra in order to stop the spread of harmful content. Their own tests showed that dialing the weight of anger back to zero, so it was no more influential than any other reaction, would stop rage-inducing content from being shared and spread nearly as widely. This led to a 5% reduction in hate speech, misinformation, bullying, and posts with violent threats. And when you consider how many billions of Facebook posts there are, that's a lot less nasty ****, some of which is going to translate into real-world violence. And again, this was kind of a limited study, so who knows how it would have actually affected things in the long run. So Facebook made this change, even though it made them, well, less money? This actually was kind of a win for them. Facebook did make this change. They pushed it out in September of 2020, and the employees responsible deserve real credit. Again, there's people within Facebook who did things that really actually were good. Changing this probably made the world a bit healthier.
That said, the fact that it had been weighted this way for years, you don't undo that just by dialing it back now. For one thing, anger has become such an aspect of the culture of Facebook that even without weighting the anger emoji, most of the content that goes viral is still stuff that makes people ****** off, because that's just become what Facebook is, because that's what they selected for for years. Also, who knows: if they'd done this years ago, if they'd never weighted anger more, it might be a very different platform, with a very different impact on the brains of, for example, our aunts and uncles. I think that's really interesting too, because that timeline lines up pretty exactly with where it feels like a lot of younger people were leaving that platform and the platform became associated with older people. Because I don't think I was using Facebook consistently after 2017. I want to say that was maybe my last Facebook year. Yeah, I stopped visiting it super regularly a while back, maybe around 2017. So in April of 2020, Facebook employees came up with another recommendation, and this one wouldn't be as successful as changing, you know, the algorithm's weighting of the angry reaction. Spurred by the lockdown and the sudden surge of QAnon, boogaloo, and anti-lockdown groups urging real-world violence, it was suggested by internal employees that the news feed algorithm deprioritize content promoted based on the behavior of people's Facebook friends. So the basic idea is this. Normally, the way you'd think it would work, right, is that your friend posts something and you see that in your newsfeed. The posts of the people that you've chosen to follow and say are your friends. That's how you would want it to work.
At one point they made a change, a few years back, where they started sending you things not because someone you followed had said something, but because they'd liked a thing. Not even commented, just liked a thing. If they'd reacted to a thing, you would get that sent to your news feed. And members of the Integrity team started to recognize this has some problems in it. For one thing, it results in a lot of people getting exposed to dangerous ********. So they start looking into the impact of this, how just seeing the kind of things your friends are reacting to influences what you see and what that does to you on Facebook. The Integrity team experimented with how changing this might work, and their early experiments found that fixing this would reduce the spread of violence-inciting content. For one thing, what they found is that, normally, if you hadn't seen someone you knew react to a post about something that was maybe violent or aggressive or conspiratorial, like a Flat Earth post or a post urging the execution of an elected leader, then even if you saw it, you wouldn't comment on it or share it. But they found that if you just saw that a friend had liked it, you were more likely to share it, which increases exponentially the spread of this kind of violent content. And it's the same idea as how, at a certain point, people stopped being afraid to be racist as much as they had been earlier, and it led to this surge in real-world violence. It's kind of the same thing: by seeing their friends react to this, people felt permission to react to it too, in a way maybe they wouldn't have otherwise. Like, well, maybe I'm interested in Flat Earth, but I'm just going to ignore this because I don't want to seem like a kook.
That is so ******* upsetting, and fascinating in the way that it affects your mind. Yeah, there was a time where, if you were, you know, racist, misogynist, homophobic, whatever you were, you just didn't talk about it. But then all of a sudden there's this confirmation that, hey, this person you know and see all the time feels the same ******* way you do. So why be quiet about it? Let's discuss. It's just, that's so dark. It's really dark. And so the Integrity team sees this and they're like, we should change this. We shouldn't be showing people just the reactions their friends have had to content, because it seems to be bad for everybody. And they do find, because when they experiment they're like, we'll take this country or this city and we'll roll this change out in this limited geographical location to try and see how it might work at scale. And they do this and they see that, oh, changing this significantly reduces the spread of specifically violence-inciting content. So they're like, hey, we should roll this out service-wide. Zuckerberg himself steps in, and according to Frances Haugen, the whistleblower, quote, rejected this intervention that could have reduced the risk of violence in the 2020 election. From The Atlantic, quote: An internal message characterizing Zuckerberg's reasoning says he wanted to avoid new features that would get in the way of meaningful social interactions. But according to Facebook's definition, its employees say, engagement is considered meaningful even if it entails bullying, hate speech, and reshares of harmful content. The episode, like Facebook's response to the incitement that proliferated between the election and January 6th, reveals a fundamental problem with the platform. Facebook's mega-scale allows the company to influence the speech and thought patterns of billions of people.
What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people, and then it keeps harming people anyway. See, that's always so interesting to hear. And by interesting I mean, you know, psychologically harmful. Yeah. Because it's like, yes, that is a fundamental flaw of the platform, but that's also very entrenched in what the DNA of the platform always was, which was based on harshly judging other people. That's why Mark Zuckerberg created Facebook: to harshly judge women in his community. So it's like, I know that it is, you know, on a bajillion scale at this point, but I'm always kind of stunned at how people are like, oh, it's so weird that this went the way that it did. It's like, well, to an extent it was always like that, and maybe it was cosplaying as not being like that. And for certain people, there were eras in Facebook where your user experience wouldn't be like that. But this goes back almost 20 years at this point, of this being in the DNA of this **** show. Yeah, and it's really bleak. It's just really bleak. And it also goes to show, like, one of the things Zuckerberg will say repeatedly when he does admit, yes, there are problems and there have been negatives associated with the site and we're aware of that, that's humbling. But, like, you know, you also have to include all the good that we're doing, all of the meaning. And the way he always phrases this is, like, all of the meaningful social interactions that wouldn't have happened otherwise.
And then you realize, every time he says that, the meaningful social interactions that have taken place on Facebook, as these internal documents show, include bullying, and people making death threats, and talking about their desire to murder people. That's a meaningful interaction. People getting angry and trying to incite violence together is a meaningful social interaction, which, I guess, yes, is not meaningless; it has meaning. Klan meetings were meaningful social interactions. You've got to give the KKK that. The Nuremberg rally was a meaningful interaction. The last meaningful interaction I had on Twitter led to, like, a rebound I was dating coming to my grandma's funeral blackout drunk. So, you know, it's all just. Oh, man. God, it's been too long since I've shown up at a funeral just too drunk to stand. It is still one of my favorite memories with my family to this day. They're like, who is this guy? And I'm like, I don't really know. He's drunk as **** though. He came on the Megabus. Hell yeah, he did. Hell yeah, on the Megabus. And drunk from a CamelBak on a Megabus? Yeah, that would be, when I used to do a lot of bus trips, like when I was traveling and stuff, that would be one of the tactics. You fill, like, a thermos or a CamelBak with, like, 40% cranberry juice, 60% liquor, and just, oh, I mean, I'm not above that. Anyway. Awesome. I'm not above getting ****** ** on a Megabus, but, you know, on your way to my grandma's funeral, that was a move. Me and my friends got, like, wasted in San Francisco one day, just going shopping in broad daylight with a CamelBak, where we would get a bottle of orange-flavored Trader Joe's Patrón tequila, and we would get a half dozen lime popsicles, and you just throw the popsicles in with the Patrón in the CamelBak, and throughout the day it melts and you just have a constant cold margarita.
It's actually ******* amazing. We were rocked. I wish I knew that when I was 22. Yeah, I recommend it heavily. You will get trashed, and people don't notice. Dude, walking around with a ******* CamelBak in San Francisco, nobody gives a ****. Oh my God, you're basically camouflaged. Yeah. You know who else is camouflaged? The products and services that support this podcast, camouflaged to be more likable to you by being wrapped in a package of the three of us. That's how ads work. I thought you were saying that you were taking ads from the US Army recruitment center again. I mean, it's entirely possible. But at the moment, we're just camouflaging. Whoever comes on next, you'll feel more positively about because of our presence here. Wow. That's how ads work. It's good stuff. Ohh, we're back. My goodness, what a good time we're all having today. How are you doing, Jamie? You made it sound sarcastic. I am having a good time. Well, I'm glad. I'm happy that you're having a good time. That's my only goal for this show and for you: that you have a good time. See, now you're doubling down on it and I'm getting insecure. I'm doubling down, and I'm also talking more and more like an NPR talking head as I get quieter by the beat. Now I'm going to start having a panic attack. I've never heard you talk like this. I know, this is how I talk to my cats when I'm angry at them. There, Robert. Honestly, I feel like we do have that dynamic. I feel like I'm a cat that you get angry at sometimes. Yeah, because you jump on my desk and knock over my Zevia onto the floor just for attention. I know. I know, for attention. But I've got to work to keep you in expensive cat food. I only feed my cats the nice wet food. I would rather have your attention than really nice food. OK, that's not what my cats say.
So there's just a shitload to say about how Facebook negatively impacts the increasingly violent political discourse in the United States and how they helped to make January 6th happen. But I think the way I'd like to illustrate the harm of Facebook next is a bit less political. It also occurs in a different Facebook product. I'm talking about Facebook the company; generally we've been talking about Facebook the site, but now we're going to talk about Instagram. In Part 1, I mentioned that young people felt that removing likes from Instagram temporarily corresponded with a decrease in social anxiety. The impact of Instagram specifically on the mental health of kids and teens can be incredibly significant. One of the other Facebook internal studies that was released as part of the Facebook Papers was conducted by researchers on Instagram. The study, which again almost certainly would never have seen the light of day if a whistleblower hadn't released it, found that 32% of teen girls reported Instagram made them feel worse about their body. 22 million teenagers in the United States log on to Instagram on, like, a daily basis. So that's millions of teen girls feeling worse about their body because of Instagram. I've never been less surprised at learning a thing. A revelation. Well, good news: it gets worse. Like, no ******* kidding. So good. These researchers released their findings internally in March of 2020, noting that comparisons on Instagram can change how young women view and describe themselves. Again, not surprising. Company researchers have been investigating the way that Instagram works for quite a while, though; about three years that they've been doing this seriously. And their previous findings all back up the same central issues. Photo sharing in particular is harmful to teen girls. One 2019 report concluded: We make body image issues worse for one in three teen girls. Its findings included this damning line.
Teens blame Instagram for increases in anxiety and depression. This reaction was unprompted and consistent across all groups. So they almost always mentioned that this app specifically makes them feel worse about their body, and we don't have to prompt them at all. Like, this just comes up when they talk about Instagram. I mean, truly, Sophie, I don't know how you feel. I've been on Instagram since, what, like 2014? Maybe earlier. I think I got on earlier. It was around when we were in high school. I truly think my life and my relationship to my body would be very different. 100% would be different. If I had not been on that app for the better part of a decade. Yeah. I mean, especially when they introduced filters. Yeah, we're about to talk about that. So, here's the kicker. And by kicker, I mean the bleakest part. In teens who reported suicidal thoughts, 13% of teens in the UK and 6% of teens in the United States claimed their desire to kill themselves started on Instagram. That's ******* disgusting and terrible. That's pretty bleak. I wish I were more surprised. Yeah, I know. But it's good to have this data. The data shows that more than 40% of Instagram users are less than 22 years old, which means you've got 22 million teens logging onto the service in the US every day. 6% of those people becoming suicidal as the result of Instagram is 1.32 million children who started wanting to kill themselves while using Instagram. Hey everybody, Robert Evans here, and I actually screwed up the math that I just cited, which is often the case when I do math. So anytime I do math of my own in an episode, you're right to question me. I was calculating 6% of 22 million, basically. But as the study noted, it's 6% of kids who are suicidal who say that their suicidal
feelings started on Instagram. So I wanted to recalculate. About 72 to 76 percent, kind of depending on the source, of American teens use Instagram. There are about 42 million teenagers in the United States, so I calculated from that. And about 18 to 19 percent of high school students, of, like, teenagers, seriously considered attempting suicide. So if we're just counting people who seriously considered attempting suicide, that's 5,745,600 teens who seriously considered suicide. If 6% of those kids had their suicidal feelings start on Instagram, that's 344,736 teens in the United States whose suicidal feelings started on Instagram. And I furthermore found that about 9% of kids who seriously consider suicide attempt it. So of those 344,736 American teens whose suicidal feelings started on Instagram, about 31,026 kids attempt suicide. So about 31,000 kids in the United States on an annual basis attempt suicide because of suicidal feelings that started on Instagram. That is the more accurate look at the data, and I apologize, as always, for the error. But what's interesting is that these studies document that Facebook is as physically harmful at scale as, like, a wide variety of narcotics. Most narcotics probably are less harmful at scale, physically, than Instagram. I think weed certainly is. My God, if every teenager was smoking weed instead of doomscrolling on Instagram, the world would just be so much better off. If they were chain-smoking cigars instead of being on Instagram, we might be better off. It's so weird, because, I don't know, whatever, I'm in my late 20s, so I feel like I have a little bit of memory of what life was like before you were constantly being encouraged to compare yourself to every single person you've ever met in your life, regardless of whether you know who they are, how they are, or whatever.
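For anyone who wants to check Robert's corrected arithmetic above, here's a quick sketch reproducing it step by step. The inputs are the figures he cites on air (42 million US teens, roughly 76% of them on Instagram, about 18% seriously considering suicide, 6% tracing those feelings to Instagram, 9% going on to attempt); the variable names are just for illustration.

```python
# Reproducing the corrected on-air arithmetic, using the figures Robert cites.
US_TEENS = 42_000_000      # approximate number of US teenagers
INSTAGRAM_SHARE = 0.76     # share of US teens on Instagram (he cites 72-76%)
CONSIDER_RATE = 0.18       # share who seriously consider attempting suicide
STARTED_ON_IG = 0.06       # share whose suicidal feelings started on Instagram
ATTEMPT_RATE = 0.09        # share of those who go on to attempt it

teens_on_ig = US_TEENS * INSTAGRAM_SHARE      # ~31,920,000 teens on Instagram
considered = teens_on_ig * CONSIDER_RATE      # ~5,745,600 seriously considered
started_on_ig = considered * STARTED_ON_IG    # ~344,736 whose feelings began there
attempts = started_on_ig * ATTEMPT_RATE       # ~31,026 attempts per year

print(round(considered), round(started_on_ig), round(attempts))
```

Working through the chain of percentages this way matches the numbers he gives in the correction: 5,745,600, then 344,736, then about 31,026.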
Call me nostalgic, but I liked how I felt. Better? Yeah. Like, it's so absurd how much I know about people I don't give a **** about, and how bad it makes me feel to know about the curated lives of people that I don't give a **** about, and how I let that actively affect my daily life. And it's just ******* miserable. It is. It's horrible. It's horrible. That said, I like flirting on the application, so, you know, it's curated. Now, here's why, despite the documented harm that Instagram does, nothing's ever going to change. As I stated, 22 million US teens use the 'Gram daily. Only 5 million log on to Facebook. So Instagram is almost five times as popular among teenagers as Facebook, where kids are leaving in droves. So Facebook, Mark Zuckerberg's invention, is now definitively just the terrain of the olds. And Facebook knows that kids are never going to come back, because that's not how being a kid works. You don't get them back; they're going to continue to do new ****. Eventually they'll leave Instagram for something else. That's just the way it ******* goes. Unless the 30-year nostalgia cycle is like, Facebook is actually back now, it's actually cool. I don't think it gave anybody a good enough experience to have that. It's not the ******* Teenage Mutant Ninja Turtles. Yeah, no one's getting dopamine hits. That's a good point. Yeah, it's not like Flamin' Hot Cheetos. Nobody's thinking fondly back to scrolling Facebook when they were seven. They're thinking back to, I don't know, SpongeBob SquarePants. As well they should. But at the moment, Instagram is very popular with teens, and Facebook knows that if they're going to continue to grow and maintain their cultural dominance, they have to keep bringing in the teens. They have to keep Instagram as profitable and as addictive as it currently is. And that's why they bought Instagram in the first place.
They only paid like a billion dollars for it. It was an incredible investment. And they spend 50% more time on it a day. Yeah, that's cheap as hell for something as influential and huge as Instagram. That is not real. Yeah. I wonder, do you know what it's worth now? I would guess significantly more than a billion dollars, but I don't entirely know how to value it. But Facebook's like a trillion dollar company now. That's right. Yeah, they're very... Facebook ******* sucks. And it's... well, but Facebook? That includes Instagram. Yeah, OK. So, yeah. And among, you know, teens are one of the most valuable demographics to have for advertisers. And Instagram is where the ******* teens go. You want the number? Its estimated value is 102 billion. So yeah, that's a good investment. Good investment, if money... yeah, you gotta, yeah. So the fact that so much is at stake with Instagram, the fact that it's such a central part of the company having any kind of future, is part of why Mark and Company have been so compelled to lie about it. None of this stuff that we've been talking about was released when Facebook researchers got it. Of course not. They wouldn't want anyone to know this ****. In March of 2021, Mark took to Congress, where he was criticized for his plans to create a new Instagram service for children under 13. He was asked if he'd studied how Instagram affects children, and he said, I believe the answer is yes. So not "yes, I think we've studied that," he told them. Then: the research we've seen is that using social apps to connect with other people can have positive mental health benefits. And I'm sure there's something that he's gotten paid researchers to come up with that he can make that case off of. I'm sure in certain situations it may even be true. There are ways you can use social media that are good. I mean, I've legitimately smiled or had my heart warmed by things that happened on social media. It doesn't not happen.
And I do think that there is a case for, like, I mean, and you can't credit Mark Zuckerberg with it, but just, I mean, going back to ******* like LiveJournal days, of just, like, friendships that have deepened as a result of social media. That's definitely a thing. But the costs outweigh the benefits there by quite a bit. It's great. So Mark goes on to say, you know, I think we've got research that shows it can have positive mental health effects. You know, I think we've studied whether or not, how it affects children. But he leaves out all the statistics, like, about all the kids whose suicidal ideation starts on Instagram. They had that data when he went before Congress. He just didn't mention it. They hadn't told anyone that ****. Like, he didn't say a *** **** word about it. Yeah. He was like, yeah, I think we've looked into it and, you know, there's some ways in which it can be healthy. Not: and also 1.3 million American kids became suicidal because of our app. Like, he did not throw that info out. Did he throw that... I mean, truly, I'm, like, up in the air of, like, did he not say that because he didn't want people to know? Or did he just say that because he heard it and he didn't care and he forgot? Like, you just don't know with that guy. That is so ******* evil. Wow. It's pretty great. And we'll talk more about that later. In May of 2021, Instagram boss Adam Mosseri told reporters that he thought the impact on teen well-being by Instagram was likely, quote, quite small, based on the internal research he'd seen. Again, they haven't released this research. He's saying, oh, we have research, and it says that any kind of impact on well-being is pretty small. And again, the actual research by this point showed 13% of kids in the UK and 6% of kids in the United States were moved to thoughts of suicide by Instagram, which I would not call small.
I wouldn't necessarily say it's huge, but that is not a small impact. It's like thousands and thousands and thousands and possibly millions of children. Yeah, that's significant. The Wall Street Journal caught up with Mosseri after the Facebook Papers leaked, so they were able to, like, drill him on this a bit, and he said a bit more. Quote: in no way do I mean to diminish these issues. Some of the issues mentioned in this story aren't necessarily widespread, but their impact on people may be huge. Which is, like, again, a perfect non-statement. That's right. They're like, but what about the thing we couldn't possibly gauge at all, versus the thing we did and are actively distancing ourselves from? I mean, those statistics, that's like at least one kid in every classroom. Like, yeah, that is gigantic. And when you read the responses of guys like Mosseri and compare them to the responses of, God, like, people like Mark Zuckerberg and official corporate spokespeople, it's very clear that they're working from the same playbook, that they're very disciplined in their responses. Because Mosseri does try to tell the Journal that he thinks Facebook was late to realizing there were drawbacks in connecting people in such large numbers. But then he says, I've been pushing very hard for us to embrace our responsibilities more broadly. Which, again, says nothing. He then pivots from that to stating that he's actually really proud of the research they've done on the mental health effects on teens, which, again, they didn't share with anybody, and I would argue lied about by omission in front of Congress. He's proud of this because he says it shows Facebook employees are asking tough questions about the platform. Quote: for me, this isn't dirty laundry. I'm actually very proud of this research. Which is the same thing Zuckerberg said about his own employees damning the service after Jan 6. That's right. I was going to say, that's the same exact thing as the...
As the, as the, like, actually bad work. You know, talking about how working for the Death Star is bad as, like, evidence of, ohh, the Death Star actually has a really open work culture. Like, no. I don't know, I feel like there are not many CEOs that are good at flipping a narrative, but Mark Zuckerberg is particularly bad at it. And, I mean, part of why they can be bad at it is it doesn't really matter, or at least it hasn't ******* so far, sure. But the pattern... I mean, not enough to get a better figurehead, no. Like, yeah, yeah, the pattern's pretty clear here. When a scandal comes out, deny it until the information that can't be denied leaks out, and then claim that whatever information you had about how harmful it is, is a positive, because it means that you were trying to do stuff about it, even if you actually rejected taking action based on the data you had and refused to share it with anybody else. Mosseri and Zuckerberg were also careful to reiterate that any harms from Instagram had to be weighed against its benefits, which I haven't found a ton of documentation on. In fact, as the Wall Street Journal writes: in five presentations over 18 months to this spring, Facebook researchers conducted what they called a teen mental health deep dive and follow-up studies. They came to the conclusion that some of the problems were specific to Instagram and not social media more broadly. This is especially true concerning so-called social comparison, which is when people assess their own value in relation to the attractiveness, wealth, and success of others. Social comparison is worse on Instagram, states Facebook's deep dive into teen girl body image issues in 2020, noting that TikTok, a short video app, is grounded in performance, while users on Snapchat, a rival photo and video sharing app, are sheltered by jokey features that keep the focus on the face.
In contrast, Instagram focuses more heavily on the body and lifestyle, March 2020 internal research states. It warns that the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful. Aspects of Instagram exacerbate each other to create a perfect storm, the research states. Yeah, I mean, again, not a shocking revelation over here. There it is. I mean, and I do think that lets TikTok and Snapchat off, yeah, easy there. Like, there is absolutely toxic body image culture on there, and I feel like thinspo will thrive on any platform it ******* gloms itself onto. But Instagram is particularly bad because it's, like, where so many lifestyle people have launched, and there's so many headless women on Instagram. It is shocking. There's so many, like... not like you machete'd my head off, but, like, you're not encouraged to show your head by the algorithm, which sounds weird, but it is true. It is just very focused on how you physically look. And then there's also this tendency to, like, tear people apart if they have edited their body to look a certain way, when it's like, well, the algorithm rewards editing your body to look a certain way. And you do bring up a good point, where it's like, it's frustrating that it's important to critique Facebook in relation to its competitors like TikTok and Snapchat. That can lead to the uncomfortable situation of, like, seeming to praise them when they haven't done a good job. They just haven't been as irresponsible. It's kind of like attacking, like, Chevron. If you look at all of the overall harms, including, like, their impact in, like, covering up climate change, they're maybe the worst of the big oil and gas companies. I don't know, it's debatable. But it's like, if you're criticizing Chevron specifically, you're not saying BP is great.
You're just being like, well, these are the guys specifically that did this bad thing, and they were the leaders in this specific terrible thing. Other bad things are going on, but the episode can't be about how bad everyone is. We're talking about Facebook right now. We have these documents from inside Facebook. I'm sure versions of this are happening everywhere else. Listeners, in your everyday life, just don't use Facebook as a yardstick for morality. You know what? You'll just end up letting a lot of people off for a lot of ****** ** stuff. I would say "in your regular life, don't use Facebook" was all the sentence we needed there. Wow. So you were talking earlier about, like, because Mark went up in front of Congress and was like, yeah, I think we've got research on this and I've definitely seen research that says it's good for kids. Everything I just stated, that quote I just read, everything like that is in those internal studies. We know that Mark saw this. We know that it was viewed by top Facebook leaders because it was mentioned in a 2020 presentation that was given to Mark Zuckerberg himself. We know that in August of 2021, Senators Richard Blumenthal and Marsha Blackburn sent a letter to Mark Zuckerberg asking him to release his internal research on how his platforms impact child mental health. We know that he sent back a six-page letter that included none of the studies we've just mentioned. Instead, the letter said that it was hard to conduct research on Instagram and that there was no consensus about how much screen time is too much. Meanwhile, their own data showed that 40% of Instagram users who reported feeling unattractive said that the feeling began while they were on Instagram. Facebook's own internal reports showed that their users reported wanting to spend less time on Instagram but couldn't make themselves.
And here's a quote that makes it sound like heroin: teens told us they don't like the amount of time they spend on the app, but feel like they have to be present. They often feel addicted and know that what they're seeing is bad for their mental health but feel unable to stop themselves. That's Facebook writing about Instagram. Like, that's their own people saying this. Like, this is not some activist getting in here, you know? So, I mean, I guess good on them, regardless of the level of self-awareness going on there. I mean, and what I was thinking about earlier, when it comes to any time Zuckerberg is in front of Congress or in front of political officials: I feel like for a lot of people, the takeaway and the thing that gets trending is how little political officials and members of Congress understand about how the Internet works. And that's the, like, the funny story, is like, oh, Mark Zuckerberg talked about an algorithm, and, you know, like, this comes up all the time. It comes up on Veep, it came up on Succession, of just, like, how not Internet literate the majority of people who decide how the Internet works are. And it's like, it almost becomes, like, a hee hee haha, old guy doesn't know how an algorithm works. But it's like, well, the consequence of that is that it ends up making Mark Zuckerberg look way cooler than he is. And it also doesn't address the problem at all of, like, no, Mark Zuckerberg is omitting something gigantic here, and the majority of our, you know, lawmakers in Congress don't have the ******* you know, cultural vocabulary to even understand that. And I guess it, like, makes for a couple of gags, but it's just like, no, this is bad. Jamie, can you, can you commit to cancel finsta? Do you remember that? Oh, that was sad. Oh, God, that made me sad, right? Cancel finsta. I mean, I think that was the most recent one where it's like, OK, yeah, that is, you know, objectively funny.
But, like, the consequence of that is, I mean, that's ultimately a win for Instagram and that's a win for Facebook, because it makes them look like they're operating on a level that the ******* government doesn't understand. And meanwhile, you know, one kid in every classroom is suicidal as a result of the inability of, like, lawmaking officials to understand the effect that this has. Yes, it's just... it makes me real mad, Robert. And one of the things about this is that while these lawmakers don't understand and sound like idiots talking to Mark Zuckerberg, his own employees, these researchers who are part of the integrity team, these researchers studying the impact of Instagram on teens, know exactly how harmful it is. And they are grappling in real time with, like, the damage their product is doing to children. Members of these teams reported frustration at the fact that their colleagues often refuse to take their findings seriously. One former researcher told the Wall Street Journal that we're standing directly between people and their bonuses when they try to reduce the harmful aspects of Instagram. Because, like, anything that reduces the harm is going to reduce its popularity. It's going to cut down on time on site, it's going to cut down on users. And everyone else at Facebook but the integrity team gets paid by how much they increase engagement on the site. That's where the bonuses come from. So if you're trying to make Facebook less dangerous, a huge number of the most influential employees have a vested financial interest in you failing. Yeah. And what that boils down to is, like, there is a financial incentive for you convincing a 13 year old girl that she should get ******* fillers in her face. Like, what the ****? It's just... now I'm going to try to sell the teens something, because it's ad break time, Jamie, and I would like to... OK, OK. If you're a teenager, if you're a teenager... Yeah. You don't need Instagram.
That ****'s bad for you. You certainly don't need Facebook. That's where the olds are. But you know what's come back around to being cool, and what I think your generation needs to get on board with? Smoking huge comical cigars. I'm talking the ones that are, like, 9 inches long and, like, 3 inches wide. Massive. Teens listening right now: go out to your local cigar shop. Do not drive. Find the saddest adult, the worst adult outside the cigar shop. Pay him for huge, cheap cigars. To me, that's the new Instagram: chain smoking horrible cigars. This is what's going to bring back teenagers. Look, zoomers, zoomers. You're all gonna... we're all gonna choke to death on wildfire smoke. You might as well burn down a big, fat Macanudo. Can I tell you? I bought a teenager a White Claw the other day. I'm so proud of you. Good for you. Thank you. I felt good. I felt like I did a public service. Yeah. Teens, go buy those big, fat, ridiculous lunatic cigars. Bribe for it. You're just preparing yourself for climate change. All right, here's the other end... OK. I mean, White Claw goes great with a huge ****** cigar, Jamie. No, it doesn't. It absolutely... smoking is bad for you. You puff it, so it's healthy. Alright, here's some ads. Alright, we're back. We all just enjoyed a couple of really comically large cigars. We did not. I had one of those ridiculously long cigars. It was great. Why? Why are you fixated on this? What is happening? Because I find that sketch from I Think You Should Leave, where the little girls are talking about smoking 5 Macanudos to unwind at the end of the day, actually quite funny. I mean, yeah, but, like... this is where you reveal yourself to be a basic *****. I am a basic ***** watching Netflix. That's why I'm thinking about cigars. I love that.
I love that we're in the middle of a podcast and you can't get off that. Well, I also think making children do things that's bad for them is funny. But not this way. Not the way Facebook does it. Send us to Dan Flashes. Send them to Dan Flashes. I mean, they've already... I think the teens are rejecting NFTs pretty widely, Jamie. So, when Facebook does try to make the case that their products are benign, they like to bring up studies from the Oxford Internet Institute, which is a project of Oxford University, which show minimal or no correlation between social media use and depression. The Wall Street Journal actually reached out to the Oxford researcher responsible for some of these studies, who right away wasn't like, oh yes, they're right, everything's fine. He was like, actually, Facebook needs to be much more open with the research that they're doing, because they have better data than researchers can get. And so our actual information that they're citing is hampered by the fact that they're not sharing what they're finding, and who knows how things could change and our conclusions could change if we had access to all of that data. He even told the Wall Street Journal, people talk about Instagram like it's a drug, but we can't study the active ingredient. Which you'll notice is not him saying it's fine. It's him being like, yeah, I really wish we could actually study this better. It's difficult right now. So he's referring to it like drugs, which is the comparable scale of how it manifests. And... OK, yeah. He's certainly not being like, everything's fine. I think that's clear. He's truly, like, constantly, Mr. Policeman, I gave you all the clues in this situation, and just no one gives a ****. It is very funny, and, like, that movie... Right. And that's what I was trying to say, is that it's hilarious. Yeah. So we've focused a lot in these episodes on how Facebook has harmed people and institutions in the United States.
But as we've covered in past episodes, the social network has been responsible for helping to incite ethnic cleansing and mass racial violence in places like Myanmar and India. Mob violence against Muslims in India, incited by viral Facebook misinformation, led one researcher in February of 2019 to create yet another fake account, to try and experience social media as a person in Kerala, India might. From the New York Times, quote: For the next three weeks, the account operated by a simple rule: follow all the recommendations generated by Facebook's algorithm to join groups, watch videos and explore new pages on the site. The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month. And this is from the Facebook researcher following this test user's newsfeed: I've seen more images of dead people in the past three weeks than I've seen in my entire life total. What a great site Mark built. Facebook's new tagline: the place for corpses. Oh yeah. My goodness. I mean, and it's like, I know that we have discussed Facebook's role in supercharging ethnic cleansings, but that is just... that is so good. Yeah, it's not great, Jamie. Someone wrote that down. Robert and I just wrote that down and hit publish. It's not great either, because India is Facebook's biggest customer. 340 million Indians use one or more Facebook products. That's... who? The people? Yeah, 340 million. That is something that I think is important to remember and something that I lose sight of sometimes: like, Facebook is not a super popular platform for people of all ages in North America, but that's not the case everywhere. It is just, it is the Internet for a lot of these people. Like, that is the whole of how they consume the Internet.
In a lot of cases, I mean, maybe with, like, YouTube or something mixed in, but probably getting a lot of their YouTube links from their Facebook feed. Now, the fact that India is the number one customer in terms of, like, number of people for Facebook... I'm sure the United States is still more profitable, just because of, like, differences in income and whatnot. But this is a huge part of their business. But despite that fact, they have failed to invest very much in terms of meaningful resources into having employees who speak the language, or, as is more the problem, the languages of India. See, India is a super mixed country, right? In terms of different, like, ethnic groups and religious groups, they have 22 officially recognized languages in the country, and there's way more languages than that in India that significant numbers of people speak. There's 22 officially recognized languages. Anyone who has traveled there, and I've spent a lot of time in India, can tell you that being able to effectively say hello and ask basic questions of people can require a lot of research if you're traveling a decent amount. But Facebook aren't 20-something tourists on the prowl for good tandoori and bhang lassis. They have effectively taken control of the primary method of communication and information distribution for hundreds of millions of people, and they failed to hire folks who might know if some of those people are deliberately inciting genocide against other people in the country. 87% of Facebook's global budget for identifying misinformation is spent on the United States. The rest of the planet shares 13% of their misinformation budget. You want to guess what percentage of Facebook users North Americans make up? Ohh... 10%? Eighty-seven percent of their budget goes on 10% of their users, of, like, dealing with disinformation. A metaphor for something else. Dealing with disinformation, specifically.
Yeah. Now, OK, when this leaked out, Facebook's response was that the information cited was incomplete and did not include third party fact checkers. They're like, well, this doesn't include all of the people at the third party companies we hire. Except the data they do show suggests that the majority of the effort and money spent on third party fact checkers is for fact checking stuff in the United States, and of course they did not elaborate on how including this information might have changed the overall numbers. So my guess is: not by much, if at all. Internal documents do show that Facebook attempted to create changes to their platform to stop the spread of disinformation during the November election in Myanmar. Those changes also halted the spread of disinformation put out by the military, which was a big... like, it was the military inciting ethnic cleansings and, like, trying to incite violence in order to, like, lock down political power ahead of this election. So they cut this significantly prior to the election. They see it as a problem. They institute changes similar to the changes they'd talked about putting up in the US if things went badly with the election, and these worked. It dropped dramatically. Yeah, and again, that is good. I'm glad that was done. But they only... give me a second, Jamie... because prior to the election, they institute these changes, which are significant. It reduces the number of inflammatory posts by 25.1% and reduces the spread of photo posts containing disinformation by 48.5%. This is huge. That's really significant. As soon as the election was done, Facebook reversed those changes, presumably because they were bad for money. Three months after the election, the Myanmar military launched a vicious coup. Violence there continues to this moment.
In response, Facebook created a special policy to stop people from praising violence in the country, one which presumably reduces the spread of content by freedom fighters resisting the military as much as it reduces content spread by the military. It's obviously too much to say that Facebook caused the coup in Myanmar. I mean, there's a lot going on there. I'm not pretending that this is, like, all just Facebook, but a major contributing factor... it wasn't insignificant, for sure. And the fact that they knew how much their policies were helping, and reversed them after the election, reversing this effect and leading to an increase in inflammatory content because it profited them more, is damning. Right? That's the thing that's damning. Around the world, Facebook's contribution to violence may be greatest in places where the company has huge reach but pays little attention. In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups that spread violent content. In Ethiopia, nationalist militias coordinated calls for violence openly on the app. The company claims that it has reduced the amount of hate speech people see globally by half this year. But even if that is true, how much hate was spread during the years where they ignored the rest of the world? How many killings? How many militant groups seeded with new recruits? How many pieces of exterminationist propaganda spread while Facebook just wasn't paying attention? The actual answer is likely incalculable. But here's the New York Times again, reporting on that test account in Kerala, India. Yeah, yeah. Ten days after the researcher opened the fake account to study misinformation, a suicide bombing in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation and conspiracies between Indian and Pakistani nationals.
After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of followers. A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the company's median group size at 140,000 members. In a separate report produced after the elections, Facebook found that over 40% of top views, or impressions, in the Indian state of West Bengal were fake or inauthentic. One inauthentic account had amassed more than 30 million impressions. A report in March 2021 showed that many of the problems cited during the 2019 elections persisted. In the internal document called Adversarial Harmful Networks: India Case Study, a Facebook researcher wrote that there were groups and pages replete with inflammatory and misleading anti-Muslim content on Facebook. The report said that there were a number of dehumanizing posts comparing Muslims to pigs and dogs, and misinformation claiming that the Koran, the holy book of Islam, calls for men to rape their female family members. So that's significant. Like, the scale at which this **** spreads is huge. And I mean, I feel like I know the answer: if the hate is existing on that scale, unmitigated... who is working to, like... how many people does Facebook have working on... is there an integrity team for this region? Like, technically, yes. The question is, OK, how many of them, and how many of the languages there are represented by the team? And it's not many. Exactly, like, it's not many. You can't have a global company and not have global representation, or **** like this is going to happen. Like, it's just... actually, you know what it kind of reminds me of, Jamie? I was looking at this and I was thinking about the East India Trading Company. When the East India Company took over
large chunks of India, they took it over from a regime... the government, the monarchical government that had been in charge in that area prior was not a good government, right? Because, number one, they lost that war. But, like, they weren't a very good government. They were a government. So they did do things like provide aid in famines and disasters, and have people whose job it was to, like, handle stuff like that, and, like, make sure that stuff was getting where it needed to go during, like, calamities and whatnot, and doing things specifically that helped people but were not, they're not profitable. Because a big chunk of what government does isn't directly profitable. It's just helping to, like, keep people alive and keep the roads open and whatnot, right? Yeah, sustain humanity. Yeah. When the East India Company took over, they were governing, in control of this region, and this is actually Bengal, I think, is their first place. But they don't have any responsibility. They don't have teams who are dedicated to making sure people aren't starving. They don't have people who are dedicated to actually keeping the roads open in any way that isn't necessary for, directly, the trade that profits them. They don't do those things because they're not... they're governing, effectively, but they're not a government. And there's been a lot of talk about how Facebook is effectively like a nation, a digital nation of, like, 3 billion people, and Mark Zuckerberg has the power of a dictator. And one of the problems with that is that, for all of their faults, governments have a responsibility to do things for people that are, like, necessary, to deal with, like, calamities and whatnot. Facebook has no such responsibility. And so when people were not paying attention to Sri Lanka, to West Bengal, to Myanmar, they didn't do anything.
And as we know, in a region where there are millions and millions of people, 40% of the views were of fake and inauthentic content, you know? Like that. Because they don't give a **** what's spreading, because they don't have to, because they don't have to deal with the consequences unless it ****** people off. As opposed to a government, where it's like, well, yeah, we are made up of the people who live here, and if things go badly enough, it can't not affect us. I'm not trying to... again, not like with TikTok, I'm not trying to praise the concept of governance, but it is better than what Facebook's doing, right? Right. Yeah. I think that that is, like, a very... I never considered looking at it that way, but viewing it as this kind of digital dictatorship, a colonial dictatorship... it's colonized people's information, like, information streams. It's colonized the way people communicate, but it has no responsibility to them if they aren't white and wealthy. Yeah, and, like, yeah. And it marginalizes people in the same ways that actual dictatorships do in terms of how much attention is being given: are people being hired to support and represent this area? And of course the answer is no, and of course the result of that is extreme human consequence and harm. And it's just so striking to me that, in terms of the laws that exist that even attempt to address the amount of influence and control that a gigantic digital network like Facebook has... you know Facebook, I mean, unless people are yelling at them and unless their bottom line is threatened, they're never going to respond to stuff like this. That's been made clear for decades at this point. It's great. I love it.
So while I'm all worked up... yeah, a great deal of the disinformation that goes throughout India on Facebook comes from the RSS, which is an Indian fascist organization closely tied to the BJP, which is the current ruling right wing party. And when I say fascist, I mean, like, some of the founders of the RSS were actual, like, friends with Nazis, and they were heavily influenced by that **** in, like, the 30s. Both organizations are profoundly anti-Muslim, and RSS propaganda has been tied to numerous acts of violence. Facebook refuses to designate them a dangerous organization because of, quote, political sensitivities that might harm their ability to make money in India. Facebook is the best friend many far right and fascist political parties have ever had. Take the Polish Confederation party. They're your standard right wing extremists: anti-immigrant, anti-lockdown, anti-vaccine, anti-LGBT. The head of their social media team, Thomas Garbage-check... sorry, Tomasz... told The Washington Post that Facebook's hate algorithm, in his words, had been a huge boon to their digital efforts. He calls it a hate algorithm and says, this is great for us, explaining, like, I think we're good with emotional messages, and thus their **** spreads well on Facebook. Quote from The Washington Post: In one April 2019 document detailing a research trip to the European Union, a Facebook team reported feedback from European politicians that an algorithm change the previous year, billed by Facebook chief executive Mark Zuckerberg as an effort to foster more meaningful interactions on the platform, had changed politics for the worse. This change, Mark claimed, was meant to make interactions more meaningful, but it was really just a tweak to the algorithm that made comments that provoked anger and argument even more viral. I'm gonna quote from the Post again here: In 2018, Facebook made a big change to that formula to promote meaningful social interactions.
These changes were billed as designed to make the news feed more focused on posts from family and friends and less from brands, businesses, and the media. The process weighted the probability that a post would produce an interaction, such as a like, emoji, or comment, more heavily than other factors. But that appeared to backfire. Haugen, who this week took her campaign against her former employer to Europe, voiced a concern that Facebook's algorithm amplifies the extreme. Anger and hate is the easiest way to grow on Facebook, she told British lawmakers, many of whom have their jobs because of how easy it is to make people **** go viral. When it comes to, say... I mean, that shows again why it will never change under our system of power... yeah. Yes. Again, we're focusing on Facebook here, in part because I do think it's more severe in a lot of ways there, but also just because, like, they're the ones who had a big leak, and so we have this data. So we're not just saying, yeah, look at Facebook, obviously hate's spreading. We're saying, no, we have numbers. We have their numbers about how ******... That is the difference. Yeah. Yeah. And we have evidence that the system is, well, aware of it now. Yeah. I would love to be talking about Twitter too. It's just... and maybe Twitter just never bothered to get those kinds of numbers. Who knows? This change caused what experts describe as a social civil war in Poland. One internal report concluded: We can choose to be idle and keep feeding users fast food, but that only works for so long. Many have already caught on to the fact that fast food is linked to obesity, and therefore its short term value is not worth the long term cost. So he's being like, we're poisoning people and it's addictive, like, you know, McDonald's.
But like, people are going to give it up in the same way that McDonald's started to suffer a couple of years back, because, like, they don't like the way this makes them feel. It's fun for a moment, but it's horrible. Morgan Spurlock for Facebook, baby. We just got to get... where's the Super Size Me for Facebook? Our entire society is the Morgan Spurlock of Facebook. January 6th was the warning. Yeah. I was gonna say, I was like, I feel like... I mean, whatever. Not to say that McDonald's isn't a hell of a drug, but this is not... I mean, it's stronger, because it's your ******* brain and self-image and the view of yourself. And I feel like the strongest manipulation that any given system, person, whatever, can have on you is controlling the way that you see yourself. It's not the same in terms of, like, an involuntary basis. I feel like it's something that you very much participate in. Yeah, it's bad. Yeah, it's good. I think it's good. That's what I think. Jamie's job today was to read all this and then say, good, actually. That's fine. Let's never talk of it again. Anyway, OK. Facebook has been aggressive at rebutting the allegations that their product leads to polarization. Their spokeswoman brought up a study which, she said, shows that academic research doesn't support the idea that Facebook, or social media more generally, is the primary cause of polarization. Now, ignore for the moment that not the primary cause doesn't mean it isn't a significant cause, and let's look at this study. The spokeswoman was referencing Cross-Country Trends in Affective Polarization, an August 2021 study from researchers at Stanford and Brown University. This study opens by noting it includes data for only 12 countries, and that all but Britain and Germany exhibited a positive trend towards more polarization.
So right off the bat, there's some things to question about this study, which is, number one, they're saying that, like, oh, Britain hasn't gotten more polarized, which is like... have you been there? Have you talked to them? Not that I live there, but that's not what I've been hearing from my friends that do. Here's the thing. When you look at how Facebook is basically using this, citing this as, like, evidence that, like, look, we're fine, this very credible study says that we're not the cause of polarization, so everything's good... the study doesn't quite back them up on this. Right off the bat, one of the authors notes this, and this is from a write-up by one of the authors of this study on a website, where he's talking about the study and what it says: A flat or declining trend over the 40 years of our sample does not rule out the possibility that countries have seen rising polarization in the most recent years. Britain, for example, shows a slight overall decline but a clear increasing trend post-2000 and post-Brexit. So he's saying that, like, we don't have as much data on, like, more recent polarization, and that may be a reason why this study is less accurate and why some of our statements do not conform with, like, what people have observed. He goes on to note: The data do not provide much support for the hypothesis that digital technology is the central driver of affective polarization. The Internet has diffused widely in all the countries we looked at, and under simple stories where this is the key driver, we would have expected polarization to have risen everywhere as well. In our data, neither diffusion of Internet nor penetration of digital news are significantly correlated with increasing polarization. Similarly, we found little association with changes in inequality or trade. One explanatory factor that looks more promising is increasing racial diversity.
The non-white share of the population has increased faster in the US than in almost any other country in our sample, and other countries like New Zealand and Canada, where it has risen sharply, have seen rising polarization as well. So I have some significant arguments with him here, including the fact that, as he notes, his study only looks at Western nations. With the exception of Japan, all of the nations in the study are European, or the United States and Canada, and so they all have had, prior to 2000, higher penetrations of the Internet and non-Internet mass media. Outside of this, if you're trying to determine the impact of social media, elements of what social media has done were present in places like Fox News in the United States years before Facebook ever existed. And that was not the case in places like Myanmar and India, which are not a part of this study. So right off the bat, it's problematic to try and study the impact of social media on polarization only in countries that already had robust mass media before social media came into effect. Which is not to say that I agree with their conclusion, because I think there are other flaws with this study. But one of the flaws is just that, like, hundreds of millions of their users exist in countries this study was not done in. They were not looking at these places. Yeah. And that's dependent on most readers just conflating, you know, North America and Europe with the center of the ******* world. And again, I have issues about, like, OK, well, you're saying that racial diversity is more of a thing. Like, where is the propaganda? Where is the hate speech about racial diversity spreading? Is it spreading on social media? Like, yes, it is. I can say that as an expert. It's also just, like, again, not that this study is even bad or not useful. It is one study.
And again, we have internal Facebook studies that make claims that I would say throw some of this into question. But again, this is just how a corporation is going to react. They're going to find a study that they can simplify in such a way that they can claim there's not a problem, because none of the people they're gonna be arguing with on Capitol Hill, and precious few journalists, are going to actually drill into this and then talk to other experts who can reach out to members of that study and be like, how fair is this phrasing? How does it gel with this information? As we saw earlier with the last study, when the Wall Street Journal, to their credit, reached out to that scientist, he was like, well, actually, they have better data than me, and I'd love to see it, because maybe that'll change our conclusions. Anyway, yeah. Mark Zuckerberg has been consistent in his argument that deliberately pushing divisive and violent content would be bad for Facebook. Quote: We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content. While I was writing this article, I browsed over to one of my test Facebook accounts. The third ad on my feed was for a device to illegally turn a Glock handgun into a fully automatic weapon. Just a heads up. Yeah. I have a couple of test feeds, and it was like, hey, this button will turn your Glock automatic, which is so many felonies. Jamie, if you even have that thing and a Glock in your home, the FBI can put you away forever. I have to laugh. I have to laugh, because that is really, really scary. But yeah, it is like Mark being like, look, no advertiser wants this to be a violent place... buy a machine gun on Facebook, you know, next to ads that are, like, T-shirts about killing liberals and stuff.
A machine gun advertiser, maybe, would be one that wouldn't take issue with that whole... I had ******* hang-the-media shirts advertised to me on Facebook. Like, my God. Like, **** you, Mark. So... well, when I quit Facebook a couple years ago, I was getting normie advertisements. I was getting... good for you... those really scary ones, like those custom T-shirts that say, it's a Jamie Loftus thing, you wouldn't understand. Why would I... and you wouldn't. I would not. You know, the only time Facebook, I can think of, recently actually anticipated something I wanted is they keep showing me, on all of the accounts that I've used, videos of hydraulic presses crushing things. And I do love those videos. Those are pretty, pretty fun. And that's the meaningful social interactions that Mr. Mark Zuckerberg was talking about: the hydraulic press videos. And those are very comforting. On the good old Internet, which also wasn't all that great, but on the old Internet, which was a lot more fun, there would have been a whole website that was just like, here's all the videos of hydraulic presses crushing things. Come watch this ****. There wouldn't have been any algorithm necessary. You could just scroll through videos. There's no friend function, it's just hydraulic press ****. Yeah, that's all I need, baby. That's all I need. Yeah. So, back to the point. It is undeniable that any service on the scale of Facebook, again, like, 3 billion users, is going to face some tough choices when it comes to the problem of regulating the speech of political movements and thinkers. As one employee wrote in an internal message: I am not comfortable making judgments about some parties being less good for society and less worthy of distribution based on where they fall in the ideological spectrum. That's true.
This is, again, part of the problem of not regulating them like a media company, like a newspaper or something, because by not making any choices, they're making an editorial choice, which is to allow this stuff to spread. Presumably, actually, like, if you were actually being held to some kind of legal standard that, again, most of our media isn't anymore, you would at least have to be like, well, let's evaluate the truthfulness of some of these basic statements before pressing. And I would say that's where the judgment should come in. But that's expensive. If Facebook is saying, we won't judge based on politics, but we will judge based on whether or not something is counterfactual, that I think is morally defensible, but that's expensive as **** and they're never going to do that. Look, moral decisions are famously not cheap, and that is a lot of the reason why people do not make them. Yeah. It is true that having morals is not a profitable venture. Yeah, no, of course not. And the other thing that's true is that Facebook already makes a lot of decisions about which politicians and parties are worthy of speech, and they make that decision based mostly on whether or not said public figures get a lot of engagement. Midway through last year, they deleted, like, all of the different anarchist media groups and a lot of anti-fascist groups that had accounts on Facebook, just across the board. They deleted, like, CrimethInc, and they kicked off It's Going Down. Like a rapper? I know, soul... like, yeah. I mean, nobody ever complains when bad **** happens to anarchists, except for anarchists. But yeah, they nuked a bunch of anarchist content, just kind of blanket saying it was dangerous, and I think it was because they had just nuked the Proud Boys and they had to be shown to be fair.
But it has now come out that they have a whole program called XCheck, or Cross Check, which is where they decide which political figures get to spread violent and false content without getting banned. Based on engagement, yeah, based on engagement. They've claimed for years that everybody's accountable to the site's rules, but again, the Facebook Papers have revealed that, like, that's explicitly a lie, and it's a lie Facebook has told other people at high levels of Facebook. And I'm going to quote from the Wall Street Journal here: The program, known as Cross Check or XCheck, was initially intended as a quality control measure for actions taken against high profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company's normal enforcement process, the documents show. Some users are whitelisted, rendered immune from enforcement actions, while others are allowed to post rule-violating material pending Facebook employee reviews that often never come. At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users. In 2019, it allowed international soccer star Neymar to show nude photos of a woman who had accused him of rape to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook's fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up pedophile rings, and that then-President Donald Trump had called all refugees seeking asylum animals, according to the documents. A 2019 review of Facebook's whitelisting procedures, marked attorney-client privileged, found favoritism to those users to be both widespread and not publicly defensible. We are not actually doing what we say we do publicly, said the confidential review.
It called the company's actions a breach of trust and added: Unlike the rest of our community, these people violate our standards without any consequence. And they lied to, like, their board members about this... they didn't lie about whether or not it was a thing, they said it was very small. And I think the initial claim was, like, we have to have something like this in place for people like President Trump, but it's a tiny number of people, and it's because they occupy some political position where we can't just as easily, you know, delete their account, because it creates other problems, because they're not as strange as they need to be for this conduct to be acceptable. That was their justification. Justification on the level of, you could be unethical and still be legal. I mean, that's the way. Still true. Well, here's the thing. They told their board they only did this for a small number of users. You want to guess how small that number was? Oh, I love when Facebook says there's a small number. What is it? 5.8 million. That's so many. Yeah. Oh, dear. Yeah. OK. It's very funny. It's very funny. It's all good. That is... I mean, yeah... they're just, they're just... Yep. Robert, can I say something controversial, please? I don't like this company one bit. You don't? Well, I feel like that's going a bit far. I'm sorry. And I'm famously, you know... I don't like making harsh judgments on others, but I'm starting to think that they might be doing some bad stuff over there. Yeah. You know, I don't like these people. I don't like these people. Well, you know what I do like? Jamie ending podcast episodes. Hmm. Oh, I actually do like that. Yeah, that's the thing I'm best at. Do you wanna plug your pluggables? Passionately? Yeah, sure. I'm gonna just open by plugging my Instagram account, a famously healthy platform that I'm addicted to.
And I don't really have any concerns about it. I don't really think it's affecting my mental health at all. Come over there. And that's at Jamie Christ Superstar. I'm also on Twitter, which Robert can't stop saying is the healthiest of the platforms. It is. Of all of the people who are drunk driving through intersections filled with children, Twitter has the least amount of human blood and gore underneath the grill of the car. So this is Robert saying, for all you Backstreet Boys heads, he's saying that Twitter is the Kevin Richardson of social media. I'm saying the drunk-driving Twitter car made it a full 15 feet further than the Facebook car before the sheer amount of blood being churned up into the engine flooded the air intakes. But at the end of the day, we're all ******. Yeah, I'm on Twitter as well, at Jamie Loftus. Now, you can listen to my podcasts. Yeah, you know, you can listen to my podcast The Bechdel Cast. You can listen to Aack Cast, that's about the Cathy comics. You can listen to My Year in Mensa, you can listen to Lolita Podcast. You can listen to nothing. You know what never led to a genocide in any country, as far as I'm aware, Jamie? Uh-huh. The Cathy comics. Well, see, then you haven't listened to the whole series. Oh, really? Is it... oh. You know what? You... yeah. That's why the last episode is your live report from Sarajevo in 1994. Episode 11. Yeah. Irving really has... his politics were not good. Yeah, he was. He was, like, weirdly into Serbian nationalism. Irving, for the Cathy comics... he's like... OK, I'm about to make a wild parallel, but Irving is like the Barefoot Contessa's husband, in that he looks so innocent, but then when you Google him, you're like, wait a second, this man is running on dark money. This guy was, like, on Wall Street in the 80s. This is a bad man. He's basically like Jeffrey, the Barefoot Contessa's husband.
The Barefoot Contessa is run on dark money. I know people don't like to hear it. They love her. But it's just true. It's objectively true. And that's what I would like to say at the end of the episode. I've never heard of the Barefoot Contessa, and I don't know what you're talking about. I'm not even one percent surprised, but that's OK. But you know what I do know about? I know about podcasts, and this one is done. Great ending.