Behind the Bastards

There’s a reason the History Channel has produced hundreds of documentaries about Hitler but only a few about Dwight D. Eisenhower. Bad guys (and gals) are eternally fascinating. Behind the Bastards dives in past the Cliffs Notes of the worst humans in history and exposes the bizarre realities of their lives. Listeners will learn about the young adult novels that helped Hitler form his monstrous ideology, the founder of Blackwater’s insane quest to build his own Air Force, the bizarre lives of the sons and daughters of dictators and Saddam Hussein’s side career as a trashy romance novelist.

Part Two: Mark Zuckerberg Should Be On Trial For Crimes Against Humanity


Thu, 24 Sep 2020 10:00



Copyright © 2022 iHeartPodcasts


Hey, Robert here. It's been like two months since I had LASIK and I'm still seeing 20/20. All I had to do was go in for a consultation, then go in for a maybe 10-minute procedure, and then my eyes have been great ever since. You know, I healed up wonderfully. It was very simple, couldn't have been a better experience. So if you want to explore LASIK Plus, I can't recommend it enough. They have over 20 years' experience in the industry and they've performed more than two million treatments. Right now, if you want to try LASIK Plus, you can get $1,000 off of your surgery when you're treated in September. That's $500 off per eye. Just schedule your free consultation. Hello, I'm Erica Kelly from the podcast Southern Fried True Crime. And if you want to go from podcast fan to podcast host, do what I did and check out Spreaker from iHeart. I was working in accounting and hating it. Then, after just 18 months of podcasting with Spreaker, I was able to quit my day job. Follow your podcasting dreams. Let Spreaker handle the hosting, creation, distribution, and monetization of your podcast. In the 1980s and 90s, a psychopath terrorized the country of Belgium. A serial killer and kidnapper was abducting children in the bright light of day. From Tenderfoot TV and iHeartRadio, this is Le Monstre, a story of abomination and conspiracy, the story about the man who simply became known as Le Monstre. Listen for free on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. ****** damn near killed him! That was the punchline to a joke without the rest of the joke, because I think we can all put things together now as adults. I'm Robert Evans. Yeah. Hi, Jamie. This is Behind the ********. We talk about bad people on this podcast, and that was a little, a little bit of levity at the start of it before we get into depressing **** again. Some abstract levity? Some abstract levity. Yeah, pieces of levity that one can assemble into comedy. Yeah, very nice.
Yeah. I mean, comedy, you know, it is just a series of things that you put in the correct order. It's like a deconstruction of comedy, like when people take apart a sandwich and then serve it on a plate in a fancy restaurant. I gotta be honest, anytime someone says it's a deconstruction of comedy, it's the least funny **** you'll ever hear in your entire life. Like, he's deconstructing the medium, and it's usually just, like, some guy. It's. Yeah, some guy. Yeah. It's never any good. But you know what is good, Jamie? What, Facebook's peanut butter packets? I was like, this can't be a transition to Mark Zuckerberg. No, because nothing about him. I just talked about this delicious lunch I had. I ate a great lunch, and I'm eating. That was good peanut butter. And you know what? I'm content. Sophie is eating peanut butter. Delicious lunch. There were fried eggs involved. Had a couple of chips. And most important, most important, we're all going to get back to my favorite thing to do with my good friend Jamie Loftus, which is talk about the extensive crimes of Mark Zuckerberg. Oh yes, I changed shirts between episodes, Robert. So now I have a little Marky with me. Yeah, you do. You've got your, your, your Marky shirt. Yeah, my favorite, my favorite Mark quote, yeah: you can be unethical and still be legal. That's the way I live my life. Uh-huh? It is amazing. And he really. I mean, there's been a lot said about him, but the man sticks to his guns. He lives by this credo to this very day. Yeah. Yeah. Yeah. You know who else sticks to their guns, Jamie? Whom? The death squads of the various dictatorial political candidates who use Facebook to crush opposition and incite race riots. That was a transition. That was a transition. So, Jamie. Alright, I'm gonna start. It's time to start the episode, and we're gonna start with a little bit of a little bit, little, little bit of an essay here.
So once upon a decade or so ago, I had the fortune to visit the ruins of a vast Mayan city in Guatemala called Tikal, and the scale of the architecture there was astonishing. If you ever get the chance to visit one of these cities, you know, in Guatemala and Mexico, wherever, it's really worth the experience. Just, again, the size of everything you see, the precision of the stonework. Umm, it's just amazing. And one of the things that was most kind of stirring about it was the fact that everything that surrounded it was just hundreds and hundreds of miles of dense, howling jungle. So I spent like an afternoon there, and I got to sit on top of one of these giant temple pyramids, drinking a one-liter bottle of Gallo beer and staring out over the jungle canopy and just kind of marveling at the weight of human ingenuity and dedication necessary to build a place like this. Metaphorical. And while I was there, Jamie, I thought about what had killed this great city and the empire that built it. Because a couple of years earlier, really not all that long before I visited, theories had started to circulate within the academic community that the Mayans had, in the words of a NASA article on the subject in 2009, killed themselves through a combination of mass deforestation and human-induced climate change. A year after my visit, the first study on the matter was published in Proceedings of the National Academy of Sciences. I'm going to quote here from Smithsonian magazine: Researchers from Arizona State University analyzed archaeological data from across the Yucatan to reach a better understanding of the environmental conditions when the area was abandoned. Around this time, they found, severe reductions in rainfall were coupled with a rapid rate of deforestation, as the Mayans burned and chopped down more and more forests to clear land for agriculture.
Interestingly, they also required massive amounts of wood to fuel the fires that cooked the lime plaster for their elaborate constructions. Experts estimate it would have taken 20 trees to produce a single square meter of cityscape. So, in other words, the Mayans grew themselves to death, turning the forests that fed them into deserts, all in the pursuit of expansion. It's a story that brings to mind a quote from the great historian Tacitus, writing about Augustus Caesar and men like him: Solitudinem faciunt, pacem appellant. They make a desert and call it peace. That's what he's saying about Augustus Caesar and the emperors like him: they make a desert and call it peace. That's one of those sundial phrases. I think it's a more accurate summation of the 200 years of peace that Augustus Caesar created than what Mark put out. They make a desert and call it peace. Now, I read that quote for the first time as a Latin student in high school, and I saw it referenced in relation to Mark Zuckerberg in a Guardian article covering that New Yorker piece we quoted from last episode. And the title of that New Yorker article was "Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" So, democracy: a free and open society where numerous viewpoints are tolerated, cultural experimentation is possible, and evolution is encouraged. These are the things that have made Facebook's success possible. It could not have come about without them, and outside of a culture that embodies those values. And now that Facebook's member count is closing in on 3 billion, the social network is doing what all empires do. It's turning the fertile soil that birthed it into a desert. And as it was with the Mayans, all of this is being done in the name of growth. Katherine Losse was an early Facebook employee and Mark Zuckerberg's speechwriter for a time. In her memoir, The Boy Kings... which, is that what you call it? I don't know, that's a good title.
She lays out what she saw as the engineering ideology of Facebook. Quote: Scaling and growth are everything. Individuals and their experiences are secondary to what is necessary to maximize the system. Mark Zuckerberg, and thus Facebook, have held a very consistent line since day one of the company operating as an actual business. And that line is that Facebook's goal is to connect people. But this was and always has been a lie. The goal is growth, growth at any cost. In 2007, Facebook's growth leveled off at around 50 million users. At the time, this was not unusual for social networks, and it seemed to be something of a built-in natural growth limit. Like, maybe 50 million is about as much as a social network can get unless you really start juking the results. And 50 million users, that's a very successful business. You could be a very rich person operating a business like that. You could call it a day. That's a great thing to accomplish. Myspace Tom was thrilled with that. Yeah, God bless Myspace Tom. Who hasn't done a *** **** problematic thing? He's traveling the world now, taking photographs of the world that Mark Zuckerberg is destroying. I mean, he might be the only person worth hundreds of millions of dollars that I'm fine with not taking the money back from, right? Like, Tom, you're fine. Go keep doing your thing. Use the money to be boring. It seems like he has just used his fortune to be boring. What do I not remember? It took like 5 months for me to get my Myspace deleted. I don't remember anything about Myspace. That's my favorite thing about Myspace, is I've forgotten everything about it but the name Myspace. Oh, for sure, Myspace was not good. What did I learn? About a lot of goth music on it? Yeah, and middle school angst from not being in somebody's top eight. But oh my God, PC4PC. I did a lot of PC4PC, as did I, my friend. You are so pretty, PC4PC. And you know what? You know what?
No one used Myspace for genocide or organizing militias to show up at the site of protests and shoot Black Lives Matter activists. That was not done with Myspace, and I suspect Tom would have had an issue if it had been. I think so. Well, I don't know about Tom, I don't know the man, but the fact that he's kept his ******* mouth shut since getting rich and going off to do whatever he does makes me suspect that he's a reasonable man. Sure. So Facebook hits this growth limit and it kind of levels off a bit. And again, it's a very successful business in 2007, but it's not an empire. And that's what Mark wanted. That's the only thing Mark has ever wanted in his entire life. And so he ordered the creation of what he called a growth team, dedicated to putting Facebook back on the path to expansion, no matter what it took. So the growth team quickly came to include the very best minds in the company, who started applying their intellect and ingenuity to this task. One solution they found was to expand Facebook to users who spoke other languages. And this is what began what we talked about last episode: the company's heedless growth into foreign markets. Obviously, at no point did anyone care or even consider what impact Facebook might have on those places. Neat. Yeah. I'm going to quote from The New Yorker again: Alex Schultz, a founding member of the growth team, said that he and his colleagues were fanatical in their pursuit of expansion. "You will fight for that inch," Alex said. "You will die for that inch." Facebook left no opportunity untapped. In 2011, the company asked the Federal Election Commission for an exemption to rules requiring the source of funding for political ads to be disclosed. In filings, a Facebook lawyer argued that the agency should not stand in the way of innovation. Ohh, OK, see, it doesn't seem like an innovation to me. That's a really loose interpretation of the word innovation. You know, I get this to an extent.
So the other day I was drunk-driving my 4Runner, and I was shooting at some targets I'd set up in the trees, and the people in the neighborhood where I was doing this said, oh, for the love of God, please, you're endangering all of our lives. And I said, you're standing in the way of innovation. Because I was innovating what you can do drunk in a 4Runner with a Kalashnikov, and they got in the way of that. Yeah, I understand. I like to really heat up a pan and put it on someone's face, just to innovate the art of it. You innovate their skin by burning their face. Yeah. And really, people are standing in the way of that innovation. Yeah. How am I supposed to make progress in hurting people's faces? Yeah. I'm a fan of how Pol Pot innovated the capital city of Cambodia by forcing everyone out of it and then killing hundreds of thousands of them. It's just, it's innovative. The way all of this is just horrific, and, like, I don't know, like, we're just talking about something else, but it's like, the language of Silicon Valley applied to genocidal situations is just so... it makes my ******* I mean, I'm. I just peeled all my skin off. You have to agree that Hitler was an innovator. He innovated so many things. He really did change. He changed the game. Yeah, he absolutely changed the narrative, from there not being a war in Europe to there being a war in Europe. That's called disrupting, honey. Oh, he didn't just disrupt it. He disrupted the **** out of the Polish government. Oh my God. Ohh, fun stuff. Don't do this to me. So Sandy Parakilas, who joined Facebook in 2011 as an operations manager, paraphrased the message of the orientation session he received as "we believe in the religion of growth." The religion of growth is what he was told when he joined the company. That's what it was called to him. Not only horrifying, but, like, what? Could you sound like more of a sniveling loser?
Saying the phrase "religion of growth"? Yeah. Quote: The growth team was the coolest. Other teams would even try to call subgroups within their teams "the growth X" or "the growth Y" to try and get people excited. And in the end, Facebook's finest minds decided that the best way they could... I know, I know, I know. Like, one. Yeah, it's exciting, I'm ***** I'm ready to go. I mean, with that kind of narrative, I love to hear it. Love to hear it. So in the end, Facebook's finest minds decided the best way they could further the great god of growth was for Facebook to become a platform for outside developers. That this was the way to really, really get things going again. And you all remember the start of this, when Facebook made this change. This is when, like, what had once been a pretty straightforward service for keeping up with your friends from college was suddenly flooded with games like FarmVille and a bunch of, like, personality tests and ****. That period of time when the moms ******* drone-struck Facebook by coming down with FarmVille, sending you 5 trillion invitations? Yeah, leaving your, like, high school choir concert to go harvest strawberries? Yeah, and making a bunch of ******* money for Facebook. Whatever appearance these apps took, their main purpose was the same, which was to Hoover up all of your personal data and sell it for profit. My mom has given her Social Security number to FarmVille, and that's just a fact. Yeah. And the only person you should give your Social Security number to is me. I do encourage all of our listeners to find my e-mail and just e-mail me your social. Yeah. Just to your tips line. Yeah. Yeah. It's like that thing from that documentary about Keith Raniere, who we also did episodes on, The Vow. It's your collateral. Send me your Social Security number so I'll know that you really care.
Yeah. So Facebook's employees kind of realized very quickly after this change was made, and these developers started flooding the service with all of their ****, that the company's new partners were engaged in some really shady behavior. One worker who was put in charge of a team ordered to make sure developers weren't abusing user data immediately found out that they were. And I'm going to quote again from The New Yorker here: Some games were siphoning off users' messages and photographs. In one case, he said, a developer was harvesting user information, including that of children, to create unauthorized profiles on its own website. Facebook had given away data before it had a system to check for abuse. Parakilas suggested that there be an audit to uncover the scale of the problem, but according to Parakilas, an executive rejected that idea, telling him, do you really want to see what you'll find? Which, look, I can identify with that too. I recently had an issue where I left a bag of potatoes on the top counter of my kitchen for, I don't know, somewhere between four and seven months. And I didn't want to. I knew something was wrong. I knew something was wrong up there because of the flies and the strange smell, but I didn't want to look into it because I didn't want to see the extent of the problem. And when I finally did, I regretted learning what an issue I had made for myself in my home. Robert, are you really you? Since we last spoke, you've become very prone to metaphor. I am. I am a living metaphor. You are living out an innovation. I innovated those potatoes. You disrupted those potatoes. Those potatoes were disrupted, with a family of maggots. OK, we don't need to talk about what a problem my life has become. Parakilas told me, the New Yorker reporter, quote: It was very difficult to get the kind of resources that you needed to do a good job of ensuring real compliance.
Meanwhile, you looked at the growth team and they had engineers coming out of their ears. All the smartest minds are focused on doing whatever they can do to get those growth numbers up. Now, Jamie, Jamie Loftus, I happened to read this quote while I was struggling to work in the midst of unprecedented wildfires that devastated a huge amount of the state of Oregon. It made our air quality the worst in the world for a while. On the very day I read that article, four of my friends and journalistic colleagues were held and threatened at gunpoint by militiamen who had taken to the streets of a town very near Portland in the middle of an evacuation, because viral Facebook memes convinced them that antifa was starting the fires. Around the same time that this was happening, that my buddies were getting held at gunpoint because they were not white people and a militia thought that was suspicious, around that same time, a tweet went viral from a Portland resident and former Facebook employee named Beau Rin. She posted a picture of the city blotted out by thick, acrid clouds of smoke and wrote: My dad sent me this view from my childhood room in Portland. It hit me that we have been wasting our collective intelligence in tech, optimizing for profits and ad clicks. Huh? Hmm. Glad you got on that page, Beau. Thank you. Glad we... I mean, sometimes it just takes something to put it all in perspective, wouldn't you say? Like your home burning down. Yeah, sometimes. And the militias being 20 minutes from your door? Yes. Militias that organize on Facebook? Yeah. I mean, the story went pretty viral. Yeah, they wrote articles about it. They weren't harmed, yes. The two people I knew best who were there were Sergio Olmos and Justin Yau, who are both wonderful reporters.
But yeah, it was not lost on me that, I think, of the four people who were there, three of them were not white people, and that some of the white reporters had a much easier time. Interesting things about militias you learn. Anyway, it makes you think. Now, I thought that quote was interesting. Anyway, opening and interesting. Eruptive, disruptive, thought-provoking. Like the fires? Yeah. And like militia soldiers. Made me think, when I could think. Yeah, I like Facebook too much. Yeah. Yeah. I threw up in my N95 mask walking down the street. Awesome. Yeah. I've been jogging and doing pull-ups in a gas mask. Just half naked in a gas mask on my front lawn like a normal person. You're the only person I know who would have seen this as a possible outcome. And for that, I thank you and I curse you. Yeah. So anyway, opening Facebook up to developers made a **** load of money and membership grew. And you know, from Mark's point of view, everything was going great. But Katherine Losse, his speechwriter, saw a lot of the same problems Parakilas had seen, and in her memoir she writes: The idea of providing developers with a massive platform for application promotion didn't exactly accord, I thought, with the site's stated mission of connecting people. To me, connection with another person required intention. They have to personally signal that they want to talk to me, and vice versa. Platform developers, though, went at human connection from a more automated angle. They churned out applications that promised to tell you who had a crush on you if you would just send an invitation to the application to all of your friends. Ohh, I know. The idea was that after the application had a list of your contacts, it would begin the automated work of inquiring about people's interests and matching people who were interested in each other. Mm-hmm. Soon developers didn't even ask you if you wanted to send invitations to your friends.
Simply adding the application would automatically notify all of your Facebook friends that you had added it and invite them to add it too, using each user as a vessel through which invitations would flow virally, without the user's consent, in this way. Users' needs for friendship and connection became a powerful engine of spam, as it already was with e-mail and on the Internet long before Facebook. The same "we'll tell you who has a crush on you if you just send this e-mail to your address book" ploys were familiar to me from Hopkins, when spammers would blanket the entire e-mail server with emails in a matter of hours, spread virally by students gullibly entering the names of their crushes and their crushes' e-mail addresses. So this was the start of Facebook making choices for its users, choices that were based on what would be best for the social network, which was keeping people on the site for as long as possible. The growth team saw that proactively connecting people to each other worked out really well for Facebook's bottom line, even though sometimes, for example, people who had been horribly abused and raped by their spouses were reconnected to those spouses they were hiding from and had their personal data exposed to them, a thing that happened repeatedly. And still happens repeatedly, you know. But that's a small price to pay for growth. That inch. You gotta fight for that inch. And sometimes fighting for that inch means connecting abused women to the men who horribly injured them. That's like Mark talking to Priscilla when they're trying to conceive a child. He's just like, you gotta fight for my inch, honey. You gotta fight for it. Oh, Mark Zuckerberg is incapable of talking during sex. He lets out a high-pitched hum that is only audible to crickets. He doesn't. Yeah, he's sort of got a Ken doll situation going on, where he just has a sex lump that gets really hot. Yeah, yeah.
She has to actually withdraw the semen from inside his sacs using a needle. I cannot. She's actually... put in the little, a little... Hold on, hold on, hold on. OK. On the subject of holding my vomit: put it in a syringe, and then. And then she just has it. And then she just has it. And if you want to have the emotional equivalent of Mark Zuckerberg's... no, that's not. Ohh, that's a bad way to. That's not fair to the products or services. It's not. Anyway, here they are. Mint Mobile offers premium wireless starting at just 15 bucks a month. And now for the plot twist. Nope, there isn't one. Mint Mobile just has premium wireless from 15 bucks a month. There's no trapping you into a two-year contract. There's no opening the bill to find all these nuts fees. There's no luring you in with free subscriptions or streaming services that you'll forget to cancel and then be charged full price for. None of that. For anyone who hates their phone bill, Mint Mobile offers premium wireless for just $15.00 a month. Mint Mobile will give you the best rate, whether you're buying for one or for a family. And at Mint, family plans start at 2 lines. All plans come with unlimited talk and text, plus high-speed data delivered on the nation's largest 5G network. You can use your own phone with any Mint Mobile plan and keep your same phone number along with all your existing contacts. Just switch to Mint Mobile and get premium wireless service starting at 15 bucks a month. Get premium wireless service from just $15.00 a month, and no unexpected plot twists, at mintmobile.com/behind. Seriously, you'll make your wallet very happy at mintmobile.com/behind. Now a word from our sponsor, BetterHelp. If you're having trouble stuck in your own head, focusing on problems, dealing with depression, or just, you know, can't seem to get yourself out of a rut, you may want to try therapy. And BetterHelp makes it very easy to get therapy that works with your lifestyle and your schedule.
A therapist can help you become a better problem solver, which can make it easier to accomplish your goals, no matter how big or small they happen to be. So if you're thinking of giving therapy a try, BetterHelp is a great option. It's convenient, accessible, affordable, and it is entirely online. You can get matched with a therapist after filling out a brief survey, and if the therapist that you get matched with doesn't wind up working out, you can switch therapists at any time. When you want to be a better problem solver, therapy can get you there. Visit betterhelp.com/behind today to get 10% off your first month. So by now we imagine that you've seen the theories on TikTok. You've maybe even heard the rumors from your friends and loved ones. But are any of the stories about government conspiracies and cover-ups actually true? The answer is, surprisingly or unsurprisingly, yes. For more than a decade, we here at Stuff They Don't Want You to Know have been seeking answers to these questions, sometimes answers that people would rather us not explore. Now we're sharing this research with you for the first time ever in book format. You can pre-order Stuff They Don't Want You to Know now. It's the new book from us, the creators of the podcast and video series. You can turn back now, or read the stuff they don't want you to know. Available for pre-order now at Stuff You Should Read or wherever you find your favorite books. We're back. OK, so in 2010, Facebook launched Facebook Groups, which would allow just about anyone to create a private, walled-off community to discuss just about anything, including fascism, white genocide, or the need to gather a militia together and use it to kill their political enemies. If you're a regular listener of my show, you know the next part of this story. From about 2010 to 2016, the United States saw an astonishing leap in the number of active hate groups.
For some perspective, just from 2015 to 2020, the SPLC estimates, there was a 30% increase in the number of hate groups nationwide. All of this growth was mostly spurred on by social media, and Facebook was one of the main culprits. And they knew it, too. They didn't admit it openly, but internally they were talking about it from pretty early on. And I'm going to quote now from a report in the Wall Street Journal: A 2016 presentation that names as author a Facebook researcher and sociologist, Monica Lee, found extremist content thriving in more than one-third of large German political groups on the platform. Swamped with racist, conspiracy-minded and pro-Russian content, the groups were disproportionately influenced by a subset of hyperactive users, the presentation notes. Most of them were private or secret. The high number of extremist groups was concerning, the presentation says. Worse was Facebook's realization that its algorithms were responsible for their growth. The 2016 presentation says that 64% of all extremist group joins are due to our recommendation tools. No. Yeah. And that most of the activity came from the platform's "Groups You Should Join" and "Discover" algorithms. Quote from the presentation: Our recommendation systems grow the problem. Oh, OK. Well, I mean, as long as the word "grow" is in the sentence, I think that's good enough. Growth is in there. You're good. Yeah, it's in there. So really, where we're growing, and what the consequences are, we're not really worried about it. Yeah. It's just like when I'm in my 4Runner, drunk as **** on mezcal and firing a Kalashnikov. All that matters is forward movement. It doesn't matter if that forward movement is driving through the trailer that a family lives in. What matters is that I'm moving forward and shooting and drunk. You're trash, right? Thank you. Wow, a judgmental statement. I'm innovating home ownership there.
I mean, this is another example of just, you know, Facebook innovating people's interests. Like, hey, do you enjoy this? I'm trying to think of the old Facebook groups that you used to be able to join, like 10 years ago. It would be like, "science is my boyfriend," and it's like, you enjoy science? My boyfriend **** the Smithsonian Institute. Like, yeah, like school groups. It'd be like "class of 2012," stuff like that, with, like, early... I mean, it's obviously very much in the same line of algorithmic thinking as YouTube, where it's like, oh, did you enjoy this, like, collage of Gerard Butler images? How about a man sitting in his 4Runner spouting conspiracy theories for three hours on end? Yeah, that's just growth, growth, growth. Almost as much as I love everything that I do with the Toyota 4Runner, all hammered, in a trailer park? Yeah, that's the real problem, is innovating the trailer parks near my house. Yeah, with a Toyota and a rifle. Just so. Changing the narrative around it. Changing the narrative around it to screaming, mainly. So yeah, throughout 2016, and particularly in the wake of the election, a lot of Facebook employees began to increasingly express their concerns that the social network they were pouring their lives into might be tearing the world apart. Right. Because, again, most of these are very nice and intelligent people who don't want to live on a planet dominated by nightmarish dictatorships and a complete collapse in the understanding of truth, one that allows, for example, viral pandemics to spread long after they should have stopped spreading, because people don't have any sort of common conception of basic reality as a result of the influence of social media. For example, they don't like that. The people who work at Facebook are kind of bummed out about contributing to that. One observer at the time reported to the Wall Street Journal: There was this soul-searching.
After 2016, that seemed to me this period of really sincere, oh man, what if we really did mess up the world? In 2016, yeah, yeah. I love that we're going from, in the 40s, the scientist who does the same thing going, now I am become Death, the destroyer of worlds, an appropriate comment for the thing that he'd done, to something honestly equivalent in its destructive potential, but the response this time, because everything is tacky now, is, oh, what if we messed up the world? We might have ****** this up. I like, yeah, lol. Yeah. Starting to think we've severely ****** ** the planet. Never mind. Like, Jesus Christ. This is why Aaron Sorkin is still working, is because people are saying ****** stuff in ****** ways. Yeah, I don't. You should cut that out. No, no, we can't cut that. We should never cut out criticizing history's real villain, Aaron Sorkin, who I call the Pol Pot of cable television. Yeah, yeah, that was evil. This soul searching did not extend to Mark Zuckerberg, who after the election gave the order to pour even more resources into Facebook groups, marking that feature out as emblematic of what he saw as the future of his site. He wrote a 6000 word manifesto in 2017, which admitted to playing some role in the disinformation and bigotry flooding the body politic. So he's like, yeah, we had something to do with it. He also claimed that Facebook was going to start fighting against this by fostering safe and meaningful communities. From CNBC, quote: Zuckerberg noted that more than 100 million users were members of very meaningful Facebook groups, but he said that most people don't seek out groups on their own. There is a real opportunity to connect more of us with groups that will be meaningful social infrastructure in our lives, Zuckerberg wrote at the time.
If we can improve our suggestions and help connect 1 billion people with meaningful communities, that can strengthen our social fabric. Fascinating use of the word meaningful. Meaningful, yeah. Meaningful, meaningful. What happened next was terrible and predictable and meaningful, Jamie. Very meaningful. A flood of new users got introduced, and even pushed into, extremist groups on Facebook. The changes Mark insisted upon have been critical to the growth of QAnon, which was able to break containment from the weird parts of the Internet and start infecting the minds of our aunts and uncles, thanks mostly to Facebook, which took no action against it until like a month or two ago. Within two years, Facebook hosted thousands of QAnon pages with tens of millions of collective members. Yeah, I'm gonna quote now from an NBC News investigation on the matter: Facebook has been key to QAnon's growth, in large part due to the platform's groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in 2017. There are tens of millions of active groups, a Facebook spokesperson told NBC News in 2019, a number that has probably grown since the company began serving up group posts in users' main feeds. While most groups are dedicated to innocuous content, extremists from QAnon conspiracy theorists to anti-vaccination advocates have also used the groups feature to grow their audiences and spread misinformation. Facebook aided that growth with its recommendations feature, powered by a secret algorithm that suggests groups to users seemingly based on interests and existing group membership. And growth. And growth. Yeah. It's funny, there's one of the things I like about this NBC report, which is partly authored by Brandy Zadrozny, who's done a lot of great work
on the subject, is that they kind of talk about how profitable spreading dangerous fascist content is for Facebook. Quote: a small team working across several of Facebook's departments found 185 ads that the company had accepted praising, supporting or representing QAnon. According to an internal post shared among more than 400 employees, the ads generated about $12,000 for Facebook and four million impressions in the last 30 days. Well, you have to imagine, like, if they're doing the math of it, I mean, it has to be financially profitable, because it has to offset the cost of the PR hits that they know they're gonna eventually take for ****. So they're, again, assigning, yeah, assigning a price to lives and brains, which is a good thing to do. Reasonable? Yeah. Seems fair to me. Yeah. So, yeah, outside Facebook, the only people who really noticed what was happening initially were a handful of researchers that studied extremist groups. And I wasn't really one of them. It wasn't until, like, 2019 that I realized Facebook groups specifically were a problem. It was obvious that Facebook was the issue, but I didn't get it until a Facebook group kept threatening to kill me for two years. Yeah. That did happen to you, huh? That did happen to me. Yeah. Yeah, listen to her My Year in Mensa podcast. What are you doing? Thankfully, the people threatening to kill you were just members of Mensa, who I trust are not competent enough to pull off an assassination. I mean, don't challenge them, but let's hope so. No, I'm throwing down the gauntlet. Like, no, no, no, I don't think they could do it. Yeah, sorry. So yeah, I didn't really grasp the scale of the problem with Facebook groups in specific until 2019, when I started really looking into the boogaloo movement.
And it was kind of camouflaged, because there was just so much fascist content everywhere on Facebook that the fact that groups in specific were driving a lot of the expansion of fascism in this country kind of got lost in the noise. But there were other researchers who started to realize this early on, and workers inside Facebook realized what was happening right away. In 2018, they held a meeting for Mark and other senior leadership members to reveal their troubling findings. From the Wall Street Journal, quote: a Facebook team had a blunt message for senior executives. The company's algorithms weren't bringing people together, they were driving people apart. Quote: our algorithms exploit the human brain's attraction to divisiveness, read a slide from a 2018 presentation. If left unchecked, it warned, Facebook would feed users more and more divisive content in an effort to gain user attention and increase time on the platform. That presentation went to the heart of a question dogging Facebook almost since its founding: does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes. In some cases. I mean, I guess that's technically accurate. In some cases, yeah. Yeah. So Facebook, in response to this meeting, starts like a massive internal effort to try to figure out how its platform might be harming people. And Mark Zuckerberg, in public and private around this time, started talking about his concern that sensationalism and polarization were being enabled by Facebook. And to Mark's credit, he made his employees do something about that. A little bit to his credit, yeah. It's OK. We'll take away the credit in just a second. So, quote: fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products.
Most notably, the project forced Facebook to consider how it prioritized user engagement, a metric involving time spent, likes, shares and comments that for years had been the lodestar of its system. Championed by Chris Cox, Facebook's chief product officer at the time and a top deputy to Mr. Zuckerberg, the work was carried out over much of 2017 and 2018 by engineers and researchers assigned to a cross-jurisdictional task force dubbed Common Ground, and by employees in newly created integrity teams embedded around the company. Integrity teams. Sounds good to me. It sounds reliable. It sounds like they made sure that integrity was accomplished via teamwork. Yeah, yeah. So the Common Ground team proposed a number of solutions, and to my ears, some of them were actually pretty good. One proposal was basically to try to take conversations that were derailing groups, like conversations over hot button political issues, and excise them from those groups. So basically, if a couple of members of a Facebook group started fighting about vaccinations in, like, a group based around parenting, the moderators would be able to make a temporary subgroup for the argument to exist in, so that other people wouldn't be exposed to it. A Zoom breakout room. Yeah. Which, I don't know if that's a great idea, but it was something. Another idea, that I do think was better, was to tweak recommendation algorithms to give people a wider range of Facebook group suggestions. Yeah. But it was kind of determined that doing these things would probably help with polarization, but would come at the cost of lower user engagement and less time spent on site, which the Common Ground team warned about in a 2018 document that described some of their own proposals as antigrowth and requiring Facebook to take a moral stance. You can guess how that all went. Yeah. Mark Zuckerberg almost immediately lost interest.
Yeah, a lot of this was probably due to the fact that it would harm Facebook's growth. But another culprit that employees who talked to the Wall Street Journal and other publications repeatedly mentioned is the fact that he was all **** hurt about how journalists were reporting on Facebook, because after the Cambridge Analytica scandal, they kept writing mean things about him. No. Yeah. Mr. Mark always has to ask himself, what would bad haircut emperor do? And bad haircut emperor wouldn't, you know, wouldn't slow down on this ****. Absolutely not. One person familiar with the situation told the Wall Street Journal: the internal pendulum swung really hard to, the media hates us no matter what we do, so let's just batten down the hatches. By January of 2020, Mark's feelings had hardened enough that he announced he would stand up, quote, against those who say that new types of communities forming on social media are dividing us. According to the Wall Street Journal, people who have heard him speak privately say he argues social media bears little responsibility for polarization. Now, there may be an additional explanation for Mark's shifting opinions on the matter that goes beyond him being just greedy and angry about bad press. And that explanation is a fella named Joel Kaplan. Do you know Joel Kaplan? You ever heard of this dude? I don't know this Joel Kaplan character. Well, in short, he's the *** **** devil. In long, he's the guy that Facebook hired to head up US public policy in 2011, and he became the VP of Global Public Policy in 2014. And Joel was picked for these jobs because, unlike most Facebook employees, he is a registered Republican with decades of experience in government. This made him the perfect person to help the social network deal with allegations of anti-conservative bias with as little empathy as possible. I'm sure. Yeah, yeah.
In 2016 there were all these rumors that Facebook was censoring conservative content, which were proven to be untrue, but the rumors went viral on the right, and so everyone on the right forever assumes they were true. And basically Joel becomes increasingly influential after this point, because he's Mark Zuckerberg's best way out of angering the right wing, which you actually can't not do, because they're always angry and will just yell about everything until they get to kill everyone who isn't them. Because that's what, life finds a way. Life finds a way for them. So Joel was a policy adviser for George W. Bush's 2000 campaign and a participant in the Brooks Brothers Riot, which is the thing that was orchestrated by Roger Stone to help hide a bunch of ballots in Florida that swung the election for W. He was a part of that. What the ****? That's the guy who's basically running Facebook's response to partisanship right now. I had a physical reaction to that. It's awesome. So upsetting. He worked in the White House for basically the whole Bush administration, and in 2006, he took over Karl Rove's job. So if you want to visualize Joel Kaplan, he's the guy you get when you can't get Karl Rove anymore. He is Mr. Karl Rove Wasn't Available. When the worst person in the world is like, I can't do this job anymore, Joel Kaplan's like, I got you. I got you, famous monster. Like, I'm trying to run some **** over here. Wow. Infamous ***** ** **** Karl Rove, don't worry, I will continue your good work. Joel Kaplan. And now I basically run Facebook. And if you Google him, Google has him listed as American Advocate. Yeah, he is an advocate of things. I was like, again, I guess they don't get any more specific. It's so untrue. I don't enjoy his face, just putting it out there. Joel is presently one of the most influential voices in Mark Zuckerberg's world, and he was one of the most influential voices in the entire company
when the Common Ground team came back with their suggestions for reducing partisanship. As policy chief, he had a lot of power to approve these new changes, and he argued against all of them. His main point was that the proposed changes were, in his words, paternalistic. Basically babying people. Yeah, be a daddy. I can't get into daddy stories that end in a genocide, Robert. Oh well, God, if it makes you feel any better, all the genocides that this is going to lead to haven't happened yet. Oh, OK. Well, there you go. Yeah, yeah. So Joel also said that these changes would disproportionately impact conservative content, because it tends to be bigoted and divisive. Since the Trump administration was at this point regularly tossing threats at Facebook, this had some weight. Quote from the Wall Street Journal: Mr. Kaplan said in a recent interview that he and other executives had approved certain changes meant to improve civic discussion. In other cases where proposals were blocked, he said he was trying to instill some discipline, rigor and responsibility into the process as he vetted the effectiveness and potential unintended consequences of changes to how the platform operated. Internally, the vetting process earned a nickname: Eat Your Veggies. No. Which sounds paternalistic to me, actually. Sounds like the beginning of a daddy story that ends in a genocide. Wow. OK, eat your veggies. We'll get back to Joel Kaplan in a little bit. For now, we need to talk some more about how the problem of violent extremism on Facebook groups got completely out of control. So yeah, this summer, which was marked by constant militia rallies, the explosive growth of the boogaloo movement, and numerous deaths as a result of violent far right actors showing up at protests with guns.
Facebook finally took action in late September to ban militias from using their service. But because they have to be balanced, they also banned anarchists from Facebook at the same time, even though anarchists have not been tied to any acts of fatal terrorism in recent memory. Because, you know, you gotta placate the right wing, because they're the only ones who matter. So let's ban the anarchists, who have spent the last four years trying to lay out the individual actors and groups who are members of these militias that are doing stuff like taking over checkpoints and holding my friends at gunpoint. We wouldn't want the folks who are keeping track of them to be able to use Facebook. That's the wrong kind of disruptive, you see. That's the wrong kind of disruptive. You know, that's very similar to what the dude in that trailer said when I was driving my 4Runner through his trailer and shooting towards his children. Not at. And I'll tell, I'll tell you what I told him. What did you say? I'm an innovator. So is Mark. I don't know, that didn't really tie into this. It worked for me. I could see it. I could see it in kind of an Ozark kind of way. Yeah. Yeah. So Mark, by the way, is on record declaring that Facebook is a guardian of free speech, which is one of the things he cited when he noted that he was refusing to fact check political ads in 2020. So anarchists who want to talk about operating a communal garden or, you know, share details about dangerous militias are the same as militiamen baying for the blood of protesters, but political candidates spreading malicious lies about protesters who are being assaulted and killed based on those lies? That is fine. That's fine. So let's get back to Facebook's integrity teams and their doomed quest to stop their boss from destroying democracy. So.
The engineers and data scientists on these teams, chiefly the guys who were working on the News Feed, arrived at the polarization problem indirectly, according to the Wall Street Journal. Asked to combat fake news, spam, clickbait and inauthentic users, the employees looked for ways to diminish the reach of such ills. One early discovery: bad behavior came disproportionately from a small pool of hyperpartisan users. Another finding was that the US saw a larger infrastructure of accounts and publishers that met this definition on the far right than the far left, and outside observers documented the same phenomenon. The gap meant that seemingly apolitical actions, such as reducing the spread of clickbait headlines along the lines of You Won't Believe What Happened Next, affected conservative speech more than liberal speech. Yeah, and obviously this ****** *** conservatives. The way that Facebook works means that users who post and engage with the site more have more influence. The algorithm sees if you're posting 1000 times a week instead of 50; it likes that engagement, because engagement means money, and so it prioritizes your content over the content of someone who posts less often. This means that a bunch of networks of Russian bots and hyperactive users, like Ian Miles Cheong, who's a fascist troll who lives in ******* Malaysia and tweets about how, like, everybody needs to have a gun that they can use to shoot Democrats, even though guns are illegal in his country, and like... anyway. Total ***** ** ****. These pieces of **** who are actively attempting to urge violence, and who have urged violence and caused deadly mobs in other countries.
It means that these people, because they're just shotgunning out hundreds of posts per day, will always be more influential than local journalists and reporters who are trying to bring out factually based information. Because it's better for Facebook for a stream of lies to spread on their platform than a smaller amount of truth. Yeah, and it also lends itself to just releasing content so quickly that you couldn't possibly disprove or fact check things fast enough, because it's just a ******** machine. Yeah. And you know, Facebook's teams found that most of these hyperactive accounts were way more partisan than normal Facebook users, and were more likely to engage in suspicious behavior that suggested either a bunch of people working in shifts, or bots. So these integrity teams did, like, the thing that has integrity, which was they suggested their company fix the algorithm to not reward this kind of behavior. Now, this would lose the company a significant amount of money, and since most of these hyperactive accounts were right wing in nature, it would **** *** conservatives. So you can imagine how this idea went over with Joel Kaplan. Since Mark was terrified of right wing anger, he tended to listen to Joel about these sorts of things. Eat your veggies, daddy. And the eat your veggies policy review process stymied and killed any movement on halting this problem. So how do we feel about that? We feel great. We feel great. Glad you didn't change, glad everyone's eating their veggies. I mean, even just the dystopian nature of, like, mobilizing these teams to be like, hey, I've ruined the world, do you think you could stop it before it blows up? Because this is going to be a real PR issue. Why would you do that? The best of luck to the team.
There was another case where... because basically the only way to combat this stuff is to have another person Mark Zuckerberg respects, or is at least scared of, yelling at him or, you know, talking politely to him. The daddies of the world, yeah. Saying the opposite of whatever Joel Kaplan is saying. And there thankfully was someone like that at Facebook. In 2017 they hired Carlos Gomez Uribe, the former head of Netflix's recommendation system, which has obviously made a lot of money for Netflix. So this guy, Carlos Uribe, is a big, important get for Facebook, right? So he gets on staff, and he immediately is like, oh, it looks like we might be destroying the world. And so he starts pushing to reduce the impact that hyperactive users had on Facebook. And one of the proposals that his team championed was called Sparing Sharing, which would have reduced the spread of content that was favored by these hyperactive users. This would obviously have had the most impact on content favored by far right and far left users, and, number one, there are more far right users on Facebook than far left, so that was going to disproportionately impact them. But the people who mainly would have gained influence were political moderates. Mr. Uribe called it the happy face. That's what he called this plan. And Facebook's data scientists thought that it might actually help fight the kind of spam efforts that Russia was doing in 2016. But Joel Kaplan and other Facebook executives pushed back. OK, yeah, yeah. And they didn't want to say why, because, you know, Mr. Uribe was someone you had to be careful arguing with. So instead of saying, this will be bad for money, or, it'll make the right angry at us, Joel Kaplan invented a hypothetical Girl Scout troop, and he asked what would happen if the girls became Facebook super sharers as part of a cookie selling program. That sounds like a metaphor you would do at the beginning of an episode. Yeah.
He was like, basically, what if these Girl Scouts made a super successful account to sell their cookies? Like, we would be unfairly hurting them if we stopped these people who are baying for the deaths of their fellow citizens and gathering militias to their banner. They're like, OK, I hear you, but what about fictional Girl Scouts? Fake Girl Scouts, yeah. He thinks it's awesome. So the debate between Mr. Uribe and Joel Kaplan eventually did make it to Mark Zuckerberg. He had to make a call on this one, because both of them were kind of big names in the company. Mark listened to both sides, and he took the coward's way out. He approved Uribe's plan, but he also said they had to cut the weighting by 80%, which mitigated most of the positive benefits of the plan. Yeah. After this, Mark, according to the Wall Street Journal, quote, signaled he was losing interest in the effort to recalibrate the platform in the name of social good, asking that they not bring him something like that again. 200 years of peace, Mark. Do that. That has big 200 years of peace energy. Yeah, big 200 years of peace energy. Yeah. In 2019, Mark announced that Facebook would start taking down content that violated specific standards, but would take a hands off approach to policing material that didn't clearly violate its standards. In a speech at Georgetown that October, he said: you can't impose tolerance top down. It has to come from people opening up, sharing experiences, and developing a shared story for society that we all feel we're a part of. That's how we make progress together. So, you know, that is just such a wild way of saying, I don't feel I am accountable for this, and once again, I'm going to delegate this to the users, the people whose lives I'm actively ruining. You know what makes progress harder, in my opinion, Jamie? Products and services?
No, no. When fascists are allowed to spread lies about disadvantaged and endangered groups to tens of millions of angry and armed people, because your company decided sites like the Daily Caller and Breitbart are equivalent to the Washington Post. This is something Facebook did at Joel Kaplan's behest. It made both companies Facebook News partners. These are the folks that Facebook trusts to help them determine what stories are true. They get money from Facebook. They get an elevated position in the news feed. Yeah. On an unrelated note, earlier this year Breitbart News shared a video that promoted bogus coronavirus treatments and told people that masks couldn't prevent the spread of the virus. This video was watched 14 million times in the six hours before it was removed from Breitbart's page. They removed it, presumably because it violated Facebook policy, and Facebook has a two strike policy for its news partners sharing misinformation within a 90 day period. When Mark was asked why Breitbart got to be a Facebook trusted partner while spreading misinformation about an active plague that was killing hundreds of thousands of Americans, Mark held up the two strike policy as a shield. Quote: this was certainly one strike against them for misinformation, but they don't have others in the last 90 days. So by the policies we have, which by the way I think are generally pretty reasonable on this, it doesn't make sense to remove them. No, that's pretty great, Jamie. That's pretty awesome. But you know what's even better about this? Unethical, but still legal. Haha. What's even better about this is that Breitbart absolutely violated Facebook policies more than two times in 90 days, and it was covered up. That's what's even better. Yeah, you have to imagine Breitbart is violating Facebook policies multiple times a day. Like, Kaplan had someone hide it. Yeah, that is such... I mean, it's awesome. It's awesome.
I'm gonna read about that, first off, by citing an incredible report by BuzzFeed. All credit to BuzzFeed; I've cited a number of great articles in this, including that one from the Wall Street Journal, which is really important. BuzzFeed has probably been, of all of the different media companies, the most dedicated at, like, hounding Facebook like a ******* dog with a groin... I don't know how to end that. I'm very proud of BuzzFeed's reporting on Facebook. Thank you for keeping on this one, y'all. Good work. Now, to remove that image from my head. But yeah, I'm going to quote from this report on the fact that Facebook fraudulently hid the fact that one of their information partners was violating their own policies and spreading disinformation about an active plague. And then you need to take an ad break, just so you know. Oh, I'll take an ad break now, and we'll get to this afterwards. Hot teaser. If there's one thing that prepares me to hear about how democracies, both in the nation I live in and around the world, are being actively murdered for the profit of a man who's already a billionaire, it's products and services. The sweet lullaby of a product or service. Nothing, nothing keeps me going, gets me intellectually hard, like a product or a service. I want to be surrounded... I want to die surrounded by my most beloved products and services. I have a feeling that you will, because there's a good chance that a horrible wildfire will sweep through the city you live in. And, sorry, that's getting too dark. Mine too, maybe. Yeah, yeah. I'm just saying, like, hey, as long as we're on the same page there, that's great. And it's OK. If we make it out of that fire, Facebook will ensure there's lots of armed and misinformed militias waving guns wildly in the areas we attempt to evacuate through.
Well, as long as my death will have been completely in vain. Yes, that's what Facebook promises for all of us. And that's what products and services promise for all of us. Here we go. Mint Mobile offers premium wireless starting at just 15 bucks a month. And now for the plot twist. Nope, there isn't one. Mint Mobile just has premium wireless from 15 bucks a month. There's no trapping you into a two year contract, no opening the bill to find all these nuts fees. There's no luring you in with free subscriptions or streaming services that you'll forget to cancel and then be charged full price for. None of that. For anyone who hates their phone bill, Mint Mobile offers premium wireless for just $15 a month. Mint Mobile will give you the best rate, whether you're buying one line or for a family, and at Mint, family plans start at two lines. All plans come with unlimited talk and text, plus high speed data delivered on the nation's largest 5G network. You can use your own phone with any Mint Mobile plan and keep your same phone number along with all your existing contacts. Just switch to Mint Mobile and get premium wireless service starting at 15 bucks a month. Get premium wireless service from just $15 a month, and no unexpected plot twists, at mintmobile.com/behind. Seriously, you'll make your wallet very happy at mintmobile.com/behind. Now a word from our sponsor, BetterHelp. If you're having trouble, stuck in your own head, focusing on problems, dealing with depression, or just, you know, can't seem to get yourself out of a rut, you may want to try therapy. And BetterHelp makes it very easy to get therapy that works with your lifestyle and your schedule. A therapist can help you become a better problem solver, which can make it easier to accomplish your goals, no matter how big or small they happen to be. So if you're thinking of giving therapy a try, BetterHelp is a great option. It's convenient, accessible, affordable, and it is entirely online.
You can get matched with a therapist after filling out a brief survey, and if the therapist that you get matched with doesn't wind up working out, you can switch therapists at any time. When you want to be a better problem solver, therapy can get you there. Visit betterhelp.com/behind today to get 10% off your first month. That's betterhelp.com/behind. So by now we imagine that you've seen the theories on TikTok. You maybe even heard the rumors from your friends and loved ones. But are any of the stories about government conspiracies and cover ups actually true? The answer is, surprisingly or unsurprisingly, yes. For more than a decade, we here at Stuff They Don't Want You To Know have been seeking answers to these questions. Sometimes they're answers that people would rather us not explore. Now we're sharing this research with you for the first time ever in book format. You can preorder Stuff They Don't Want You To Know now. It's the new book from us, the creators of the podcast and video series. You can turn back now, or read the stuff they don't want you to know. Available for preorder now at stuffyoushouldread.com or wherever you find your favorite books. All right, we're back. So we're talking about how Facebook covered up the fact that Breitbart was repeatedly spreading disinformation that should have gotten them removed as a trusted partner. Quote from BuzzFeed: some of Facebook's own employees gathered evidence they say shows Breitbart, along with other right wing outlets and figures including Turning Point USA founder Charlie Kirk, Trump supporters Diamond and Silk, and conservative video production nonprofit Prager University, has received special treatment that helped it avoid running afoul of company policy. They see it as part of a pattern of preferential treatment for right wing publishers and pages, many of which have alleged that the social network is biased against conservatives.
On July 22nd, a Facebook employee posted a message to the company's internal misinformation policy group noting that some misinformation strikes against Breitbart had been cleared by someone at Facebook seemingly acting on the publication's behalf. "A Breitbart escalation marked 'urgent: end of day' was resolved on the same day, with all misinformation strikes against Breitbart's page and against their domain cleared without explanation," the employee wrote. The same employee said a "partly false" rating applied to an Instagram post from Charlie Kirk was flagged for priority escalation by Joel Kaplan, the company's vice president of global public policy. Now, the whole article itself details just a ton of other instances of this, and it's all incredibly shady. I'm not gonna go into all of it in tremendous detail because we are running out of time, but if you read the article, it's extremely clear that Joel Kaplan is directing Facebook to actively violate the company's own policies in order to keep right-wing ******** peddlers spreading lies on the platform for profit. Kaplan has faced no punishment for this, although his behavior did provoke outrage from employees in Facebook's internal chat system. They're crying to Daddy. That's how it goes. Facebook employees have been getting angrier and angrier at this sort of thing throughout the year. Remember back in May when President Trump posted this message to Twitter and Facebook? Quote: There is no way (zero!) that mail-in ballots will be anything less than substantially fraudulent. Mailboxes will be robbed, ballots will be forged, and even illegally printed out and fraudulently signed. The Governor of California is sending ballots to millions of people. Anyone living in the state, no matter who they are or how they got there, will get one. That will be followed up with professionals telling all of these people, many of whom have never even thought of voting before, how and for whom to vote.
This will be a rigged election. No way! I do remember that, Robert. So, do you remember that Twitter, to give it, again, like the mildest possible level of credit, Twitter fact-checked the president's tweet. Which was not nothing. And that's all I'll say about it. Marginally. That does not qualify as nothing. Again, that qualifies as the most responsible action a social media CEO took. Mark, on the other hand, refused to let his employees do anything similar, allowing the president's flagrant misinformation to circulate on his network. This enraged employees, and they got angrier when his "when the looting starts, the shooting starts" post was left up. They created a group in Workplace, their internal chat app, called "Let's Fix Facebook (the Company)." It now has about 10,000 members. One employee started a poll asking colleagues whether they agreed, quote, "with our leadership's decisions this week regarding voting misinformation and posts that may be considered to be inciting violence." 1,000 respondents said the company had made the wrong decision on both posts, which is more than 20 times the number of respondents who said otherwise. So Facebook employees, after this, staged a digital walkout, and they, like, changed their Workplace avatars to a black and white fist and called out sick en masse, hundreds of them, and stuff. And, you know, I'm going to quote from BuzzFeed again here: As Facebook grappled with yet another public relations crisis, employee morale plunged. Worker satisfaction metrics, measured by micro-pulse surveys that are taken by hundreds of employees every week, fell sharply after the ruling on Trump's looting post, according to internal data obtained by BuzzFeed News. On June 1st, the day of the walkout, about 45% of employees said they agreed with the statement that Facebook was making the world better, down 25 percentage points from the week before.
That same day, Facebook's internal survey showed that around 44% of employees were confident in Facebook leadership leading the company in the right direction, a 30 percentage point drop from May 25th. Responses to that question have stayed around the lower mark as of earlier this month. So, pretty significant drop in faith in the company from its employees. And yeah, Zuckerberg, the "ultimate decision maker" according to Facebook's head of communications, initially defended his decision to leave Trump's looting post up without even hiding it behind a warning like Twitter did. Mark stated, quote: unlike Twitter, we do not have a policy of putting a warning in front of posts that may incite violence, because we believe that if a post incites violence, it should be removed regardless of whether or not it's newsworthy, even if it comes from a politician. So you have to wait for there to be violence, and, yeah, and then be like, oh, it turns out that post was actually really bad and we should take it down. There again, the amount of bodies that he needs attached to do a single thing is staggering. Four days later, Mark backtracked. From BuzzFeed, quote: in comments at a company-wide meeting on June 2nd that were first reported by Recode, Facebook's founder said the company was considering adding labels to posts from world leaders that incite violence. He followed that up with a Facebook post three days later in which he declared "Black Lives Matter" and made promises that the company would review policies on content discussing excessive use of police or state force. "What material effect does any of this have?" one employee asked in Workplace, openly challenging the CEO's commitments to review policy. "Nothing material. Has anything changed for you in a meaningful way? Are you at all willing to be wrong here?"
Mark didn't respond to this, but on June 26th, nearly a month later, he posted a clarification to his remarks, noting that any post that is determined to be inciting violence will be taken down. Employee dissatisfaction has continued to swell over the course of the summer. One senior employee, Max Wang, even recorded a 24-minute-long video for his colleagues, and BuzzFeed, in another article, has all the audio for this. It's worth listening to. In the video, Max outlines why he can't morally justify working for Facebook anymore, and he's a pretty early employee, I think. His video quotes at length from books on totalitarianism by Hannah Arendt, who is one of, like, the great scholars of the Holocaust. Yeah, he shared the video on Workplace with a note that started, "I think Facebook is hurting people at scale." Yes, yeah, yes, it is. Absolutely. Yeah. All right, yeah. Like Emperor Augustus, who had members of his own family killed for disobedience, Mark did not like being questioned and, gasp, disapproved of by his own employees. On June 11th, he hosted a live Q&A where he delivered a message to employees who were angry at his enabling of hideously violent fascist rhetoric. A lot of this is, like, I think, in response to, yeah, the killings and such. "I've been very worried about the level of disrespect and, in some cases, vitriol that a lot of people in our internal community are directing towards each other as part of these debates. If you're bullying your fellow colleagues into taking a position on something, then we will fire you." Well, good. You know, the amount of consistency, I mean, you've got to appreciate it. I'm really glad that that employee, I mean, just spoke directly about it, because it's like, at what point, truly, what do you have to lose? Like, I guess, except for your life, depending on how Mark Zuckerberg wants to go about it. But, I mean, I don't know.
It is so frustrating, even though it's like, I don't know what else to do other than, you know, whatever, some **** in Minecraft. But people are continually waiting for this person and this company to act in the best interest. It's like, it's not. When has it ever happened? Name a time. Even in the face of, like, the most brutal public disapproval. There's too much promising. It's amazing. As you were saying all this, and as I just finished the thing that I was saying, a Bloomberg story just dropped, like, as we were recording this episode. I'm just going to read you the, I haven't read the story, I'm just gonna read you the title: Facebook Accused of Watching Instagram Users Through Cameras. No. Ohh man, it ******* rules. Oh my God, it's so good. Have we talked about that before, though? Because I've had that issue with Instagram before, where I'll close out Instagram and then you'll see the little section of your iPhone in the top left where it indicates that you're being recorded. It goes, it turns, like, when I, listeners, let me know if you've had a similar issue. Sometimes when I close Instagram, it looks like my phone just stopped recording, but it goes away really quickly. It's, like, a millisecond, but it happens all the time. So that is not shocking at all. Yep. I don't know what I'm actually going to title this episode. The working title that I started this under was Mark Zuckerberg Needs to Be Tried in The Hague and Hung in Public Until Dead, but I don't think legal's going to let me go with that title. I think it's clickable. I think it's clickable as ****. Yeah, I think you'd get great engagement, and that, I mean, that's what he'd want. Yeah, we may have to go with a different title. I mean, I'm not urging illegal behavior. I'm urging that he be tried in the International
Criminal Court and then, once convicted, hung by the neck until dead for his crimes. Which is what you do when a world leader commits genocide, right? Right. Yeah, it is true. But I probably won't title the episode that. I don't know. I mean, I'm glad you put it out there, though. Let's not take it out of the running. Yeah, there's a number of other options. I mean, Mark Zuckerberg Continues to Disrupt 200 Years of Peace. There's so many options. I can't wait for the 200 years of peace that only involve dozens of wars. I mean, it was. If the 200 years of peace began in 2004, imagine how much peace we have to look forward to. I think it's similar to the amount of peace I brought that trailer park. There it is. You're a little sicko. You're a little sicko. I know it. I know it well. We're all gonna be fine. Fine. We're all gonna be great. We're all good. Yeah. You want to plug some ****? Yeah. Yeah. Thank you for disrupting my life and inner sense of peace once again. Always be disrupting, baby. That's always been a PR. You've always been a huge disruptor. And, yeah, you can follow me on Twitter or Instagram, which is watching me right now. And then, if you want to contribute to a candidate that I love, Fatima Iqbal-Zubair, we're doing a live read of the Twilight script this Friday evening, 5:00 PM Pacific. That sounds very exciting. Do it to distract yourself from the void. Team Edward. Yeah, see you there, see you there. And I am going to be, you know, he's lost, I had to do it, doing the thing that I normally do, which is staring into the abyss and going, hey, hey, quit being an abyss. You're really bumming us all out, abyss. And the abyss is like, you have great chemistry. The abyss made me a lot of money. A lot of money, which is something that I feel very, very conflicted about. The abyss is rich.
That's the thing that Nietzsche missed: sometimes when you stare into the abyss, you get a six-figure salary, because it's incredibly profitable to talk about the abyss on a podcast. Yeah, the abyss has facial recognition software and it's pretending to be me elsewhere. Yeah. Jamie "the Abyss" Loftus. That's what I've been called. You can follow us on Twitter and Instagram, where you're probably being watched, at ******** pod. You can follow Robert on Twitter at I Write OK. You can buy stuff from our TeePublic store and also the Bechdel Cast TeePublic store, where Jamie designs all the artwork, and it's amazing. And, well, yeah, I think I covered everything. Wash your hands, wear a mask, yeah? Man, yeah. Robert, did I show you my bedazzled bolt cutters? I'll send you a picture of my bolt cutters. No, I would love to see your bedazzled bolt cutters. Yeah, I have a pair of bolt cutters that are still usable, but also mostly covered in rhinestones. I'll send it to you. Yeah, that's the episode. Hell yeah, right. Hello, I'm Erica Kelly from the podcast Southern Fried True Crime, and if you want to go from podcast fan to podcast host, do what I did and check out Spreaker from iHeart. I was working in accounting and hating it. Then, after just 18 months of podcasting with Spreaker, I was able to quit my day job. Follow your podcasting dreams. Let Spreaker handle the hosting, creation, distribution, and monetization of your podcast. Go to spreaker.com. That's S-P-R-E-A-K-E-R dot com. In the 1980s and 90s, a psychopath terrorized the country of Belgium. A serial killer and kidnapper was abducting children in the bright light of day. From Tenderfoot TV and iHeartRadio, this is Le Monstre, a story of abomination and conspiracy. The story about the man who simply became known as Le Monstre. Listen for free on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Want to say "I don't know" less? Listen to Stuff You Should Know more.
Join hosts Josh and Chuck on the podcast packed with fascinating discussions about science, history, pop culture, and more. Episodes dive into topics like: was the lost city of Atlantis real, and how does pizza work? Say goodbye to "I don't know," because after listening to Stuff You Should Know, you will. Listen to Stuff You Should Know on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.