There’s a reason the History Channel has produced hundreds of documentaries about Hitler but only a few about Dwight D. Eisenhower. Bad guys (and gals) are eternally fascinating. Behind the Bastards dives in past the Cliffs Notes of the worst humans in history and exposes the bizarre realities of their lives. Listeners will learn about the young adult novels that helped Hitler form his monstrous ideology, the founder of Blackwater’s insane quest to build his own Air Force, the bizarre lives of the sons and daughters of dictators and Saddam Hussein’s side career as a trashy romance novelist.
Thu, 20 Jun 2019 10:00
How YouTube Became a Perpetual Nazi Machine
Hello, I'm Erica Kelly from the podcast Southern Fried True Crime, and if you want to go from podcast fan to podcast host, do what I did and check out Spreaker from iHeart. I was working in accounting and hating it. Then, after just 18 months of podcasting with Spreaker, I was able to quit my day job. Follow your podcasting dreams. Let Spreaker handle the hosting, creation, distribution, and monetization of your podcast. Go to spreaker.com. That's spreaker.com. If you could completely remove one phrase from your vocabulary, which phrase would you choose? I don't know. Correct answer. No, I meant I don't know which phrase. And the best way to banish "I don't know" from your life is by cramming your brain full of stuff you should know. Join your hosts Josh and Chuck on the super popular podcast packed with fascinating discussions on science, history, pop culture and more, episodes that ask: was the lost city of Atlantis real? I don't know. Is birth order important? I don't know. How does pizza work? Well, I do know a bit about that, see? You can know even more, because Stuff You Should Know has over 1,500 immensely interesting episodes for your brain to feast on. So what do you say? You don't want to miss the Stuff You Should Know podcast. You're learning already. Listen to Stuff You Should Know on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Sisters of the Underground is a podcast about fearless Dominican women who stood up against a brutal dictator. He needs to be stopped. We've been silent and complacent for far too long. I am Daniel Ramirez, and as a Dominicana myself, I am proud to be narrating this true story that is often left out of the history books. He has blood on his hands. Listen to Sisters of the Underground wherever you get your podcasts. What? Severing my tumors?
I'm Robert Evans, host of Behind the ********, the podcast where we tell you everything you don't know about the very worst people in all of history, here with my guest Sophia, co-host of Private Parts Unknown. And we're talking about how it's ******** when doctors won't let you keep the pieces of your body that they take out of you. That's really frustrating. Yeah, that's like the least they could do for you. It's an infringement of your civil liberties. Like, that tumor or whatever is still a piece of you, and you deserve to, like, go get drunk on a farm and shoot it with a shotgun if that is your choice. That sounds fun, it does, yeah. I wanted to just keep it forever, to kind of always, like, point to it and be like, yeah, I beat you, you little *****, and they won't let me ******* do it. They wouldn't let me keep my breast cancer tumor and my chemo port, which I'm like, that was part of me for a year. Why? That's so frustrating. Like, OK, this message is going out to Sophia's doctor. Kudos on the cancer, thanks so much for the curing, blah, blah, blah. **** **** not letting her keep her tumor, and I'm very angry about this. Please run a write-in campaign, listeners. Just contact my doctor at, and, not important, let's make Sophia's tumor her legal possession again. It's going to be hard to, let's make Sophia's tumor Sophia's again. Yeah. There we go. Well, today's subject has nothing to do with tumors or cancer. Other than that, you could argue today's subject is a cancerous tumor metastasizing in the body politic of our nation. Bam. Wow. You're talking about YouTube. That's a beautiful metaphor for a website that most people just use for jerking off. Hey, jerking off and not paying for music. Oh, that's true, that's true. There's one other thing besides jerking off: YouTube makeup tutorials. It's useful for jerking off, makeup tutorials, free music, and of course, filling the world with Nazis. Again, as a Jew, I love to hear that.
That's the aspect of YouTube we will be talking about today, because of its Nazi-reinvigorating aspects. Now, it's so fun to leave the former USSR because it's not great for the Jews, and then get here, and then Donald Trump becomes president, and you're like, OK, that's a great, that's a good joke. That's very funny. And then the Nazis spread through YouTube, so they're just everywhere, and you're like, oh, OK. Well, I guess I'll just live in fear forever. I will say, one of the best things, like, one of the few things I actually got out of college, was taking Holocaust studies courses and coming to, like, the dawning realizations. Like, a kid who was raised in, like, a Republican household, where, like, everything you heard about the Holocaust was how awesome it was that American soldiers stopped it. Like, reading about history and coming to, like, the gradual realization, like, oh, it's always sucked to be Jewish everywhere. Like, everyone's killed these people. Like, oh my God. It didn't start with the Nazis. Like, reading about, like, what happened in Tsarist Russia, the Khmelnytsky massacre, which killed, like, 700,000 people, and, like, going, yeah, this **** has not been good for us for a long time, and now we're talking about digital pogroms. Yeah, exactly. It's just nice to know that you cannot escape the Nazis. Yeah. Yeah. That is the message YouTube has delivered to all of us, along with allowing me to listen to old Kris Kristofferson concerts for free. Weird. Hey man, ************ made some great music. Alright, I'm going to start with my prepared remarks, if that's OK. Please. On March 23rd, 2016, Microsoft unveiled a new chat bot to the denizens of Twitter. The bot was an experiment in what Microsoft called "conversational understanding." Tay, the chat bot, would engage in discussions with real people and learn from them, evolving and changing from its interactions, just like real people do. Yeah, yeah, yeah.
As they released Tay into the wild, Microsoft said they hoped that Twitter users would be happy to engage it in casual and playful conversation. Tay entered the world at around 10:00 AM Eastern Standard Time. At 2:27 AM the following morning, less than 24 hours later, it tweeted this: "Bush did 9/11 and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we've got." It just took that little time to learn how to be a Nazi. About 18 hours. Pretty quickly. But you knew it was a bad move when they opened it up to the Internet. You knew it, kind of. One of the surprising stories or arcs of the last decade is Microsoft going from, like, this evil corporation in everyone's eyes to, like, this innocent summer child. Like, they never tried to steal my data, they never lobbied to stop me from being able to repair my computer. They just believed they could make a chat bot, and the Internet would teach it how to be a real boy, and it turned into a Nazi. And they were so horrified. Like, I just, I can't, I can't believe that someone had positive hopes for that. I mean, how few people have you met online that you would think that that was gonna end up well? I think it's because Microsoft's team were all old heads. Like, it was a bunch of guys in their 50s who, like, didn't know the Internet as anything but, like, a series of technical things. They weren't active Twitter users or whatever. They didn't go on the gram. Well, it's very quickly that you learn that if you upload a video of yourself doing stand-up, how much you look like a *****, you're going to get it right away. I mean, yeah, it's a learning curve, and it's a learning curve a lot of companies have had. I can remember back in 2012 when Mountain Dew decided to let the Internet vote on the name of a new soda flavor, and 4channers flooded in. Before long, the top vote-getter was "Hitler did nothing wrong."
Which, I will admit, rolls right off the tongue. It'd be kind of interesting to see that soda marketed in the 7-Eleven, but yeah. Yeah, especially when you picture the fact that, like, the Sprite spokesperson is, isn't it Vince Staples? Yeah, I think so. Yeah. Soda's not really, generally, ever sold by people that are so uncool that they think renaming a soda that is some kind of, I don't know, forward-thinking movement of their philosophy. Yeah. And both of these cases, Tay and Mountain Dew's new soda flavor vote, were cases where, if you'd gone to either of us in 2012 or in early 2016 and said, we're going to do this, what do you think will happen? I think we both would have said, I think every listener of this podcast would have said, oh, it's going to get real Nazi, like, immediately. Like, it's going to turn into a Nazi. That's just what people on the Internet think is funny, and that's going to happen. But, like, you know, older folks, people who, you know, are focused more on living that life of the mind off of the Internet, they didn't anticipate that sort of stuff. And there really wasn't much of a harm ever, you know, in either that Mountain Dew contest or in the Tay chatbot. Like, Tay was a public-facing AI. It was never in control of something. But the question of its radicalization does lead to the question: what if another company built an AI that learned in that way that wasn't public facing? And what if that company trusted the AI to handle a crucial task that operated behind the scenes? If that were to happen, I think it might look an awful lot like what we've seen happen to YouTube's recommendation algorithm. I think, and I'm not the first person to make this comparison, but what has happened to YouTube's algorithm over the last few years is what happened to that chat bot.
But since no one interacts with YouTube's algorithm directly, it took a long time for people to realize that YouTube's recommendation AI had turned into Joseph Goebbels, which is, I think, where we are right now. So that's what today's episode is about. Yay. You bring me in for the fun ones. I'm glad that it's not about dead babies, because I know how you love to do that **** to me. It does end a little bit on hurting babies. Now, are you kidding me? You *** ** * *****? Stop getting me here under false pretenses. Stop it. I feel like at this point, you know, if you're coming on Behind the ********, some babies are going to get harmed, OK? I assume. Sometimes. Maybe people just murder adults. That's what I was hoping for coming in today. I was like, is there more adult murder than baby murder? Adult murder? But no, we always have to get minors involved if it's me, don't we, Evans? The murders involved in this episode were all adults. The molestation involved in this episode involved children. So that's a step up. So much. No, the Georgia Tann one had child murder and child molestation. Well, now it's adult murder and child molestation. Well, child ***********. Will you let me live a life full of just adult murder? You know what, Sophia? I'll make this promise to you right now, over the Internet. When we do our one yearly optimistic episode, about a person who's not a *******, this upcoming Christmas, I'll have you on as the guest for that one. **** yes. I can't wait. And hopefully the irony of that episode will be that very shortly thereafter we'll find out that person is also a *******. Yeah, it'll be the story of the person who saved 1000 kids by killing 900. It's still plus 100, like that game. That will be exactly your pitch when that happens. Already Googling, yeah, yeah. You're like, so how can we? Let's get back to YouTube.
As I write this, the Internet is still reeling from the shock waves caused by a gigantic battle over whether or not YouTube should ban conservative "comedian," and I put that in air quotes, Steven Crowder. Now, if you're lucky enough to not know about him, Crowder is a bigot who spends most of his time verbally attacking people who look different than him. He spent several months harassing Carlos Maza, who makes YouTube videos for Vox, calling Maza a "lispy queer" and a number of other horrible things. Crowder has not explicitly directed his fans to attack Carlos in real life, but Crowder's fans don't need to be told to do that. When he directs his ire at an individual, Crowder fans swarm that individual. Carlos is regularly bombarded with text messages, emails, tweets, etcetera, calling him horrible names, demanding that he debate Steven Crowder, telling him to kill himself, doing all the kinds of things that sociopathic Internet trolls like to do to the targets of their ire. Now, Carlos, on Twitter, asked YouTube to ban Crowder, and he pointed out specific things Crowder had said and highlighted specific sections of YouTube's terms of service Crowder had violated. YouTube opted not to ban Crowder, because Crowder has nearly 4 million followers and makes YouTube a lot of money. There has been more dumb fallout: YouTube demonetized Crowder's channel and then randomly demonetized a bunch of other people so conservatives couldn't claim they were being oppressed, and it's all a big, gross, ugly mess. But the real problem here, the issue at the core of this latest eruption in our national culture war, has nothing to do with YouTube's craven refusal to enforce their own rules. Steven Crowder would not be a figure in our nation's political discourse if it weren't for a series of changes YouTube started making to their algorithm in 2010. Now, YouTube's recommendation algorithm is what, you know, recommends the next video that you should watch.
It's why, if you play enough music videos while logged in, YouTube will gradually start to learn your preferences and suggest new music that often you really like. It's also why teenagers who look up the Federal Reserve for a school report will inevitably find themselves recommended something that's basically The Protocols of the Elders of Zion with better animation. God. Yeah. Yeah, it's both of those things. Well, but the animation's good. OK, yeah, putting some money behind that anti-Semitism. Yeah, it's a mixed bag. On one hand, I learned about the music of Tom Russell, who's a musician I very much enjoy now. On the other hand, there's thousands more Nazis. So really, pretty even exchange, I'd say. Yeah, fair mix. Yeah, that's a good trade. Yeah. Now, I do really like Tom Russell's music, but that's the important thing: that Tom Russell not be offended. OK, let's make sure he's fine. Now, YouTube's recommendation engine was not always a core part of the site's functionality. In the early days of YouTube, in 2006 or seven or eight or nine, most of the content was focused around channels, a lot like television. People would search for what they wanted to see, and they would tune in to stuff they knew that they liked. Unfortunately, that meant people would leave YouTube when they were done watching stuff. I'd like to quote now from a very good article in The Verge by Casey Newton. He interviewed Jim McFadden, who joined YouTube in 2011 and worked as the technical lead for YouTube recommendations. Quote: "We knew people were coming to YouTube when they knew what they were coming to look for. We also wanted to serve the needs of people when they didn't necessarily know what they wanted to look for." Casey goes on to write: "I first visited the company in 2011, just a few months after McFadden joined. Getting users to spend more time watching videos was, then as now, YouTube's primary aim."
At the time, it was not going particularly well. "Youtube.com as a homepage was not driving a ton of engagement," McFadden says. "We said, well, how do we turn this thing into a destination?" So YouTube tried a bunch of different things. They tried buying professional gear for their top creators to increase the quality of YouTube content, but that just made YouTube more enjoyable. It didn't make the service more addictive. So in 2011, they launched Lean Back. Now, Lean Back would automatically pick a new video for you to watch after you finished your old video. Lean Back became the heart of the algorithm we all know and many of us hate today. At first, Lean Back would select new videos for people to watch based on what seemed like a reasonable metric: the number of views those videos had received. So if more people watched a video, it was more likely to wind up recommended to new people. But it turned out Lean Back didn't actually impact the amount of time spent on site per user. So in 2012, YouTube started basing recommendations on how long people spent watching videos. Its engine switched from recommending videos a lot of people had watched to recommending videos people had spent a lot of time on. Now, this seemed like a great idea at first. According to The Verge, "Nearly overnight, creators who had profited from misleading headlines and thumbnails saw their view counts plummet. Higher quality videos, which are strongly associated with longer watch times, surged. Watch time on YouTube grew 50 percent a year for the next three years." So that sounds great, right? Hmm. Nothing evil yet, nothing horrible. Let's read the next paragraph. During this period of time, Guillaume Chaslot... Really quickly. Yeah? Sorry to interrupt you. No, I wanted to know if part of the Lean Back algorithm was that they would just automatically play "Lean Back" by Fat Joe.
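The ranking switch described above, from raw view counts to total watch time, is easy to sketch. Here is a toy Python illustration (my sketch, not YouTube's actual code; the videos and numbers are invented) showing how the same catalog produces a different winner under each metric:

```python
# Toy illustration of the 2012 ranking change: the same three (made-up)
# videos, ranked first by view count, then by total watch time.

videos = [
    # (title, views, average_minutes_watched_per_view)
    ("misleading clickbait", 1_000_000, 0.5),      # many clicks, fast bounce
    ("four-hour conspiracy rant", 50_000, 150.0),  # few clicks, long sessions
    ("normal music video", 200_000, 3.0),
]

def rank_by_views(vs):
    # the pre-2012 signal: most-viewed video wins
    return max(vs, key=lambda v: v[1])

def rank_by_watch_time(vs):
    # the post-2012 signal: total minutes watched = views * avg minutes
    return max(vs, key=lambda v: v[1] * v[2])

print(rank_by_views(videos)[0])       # the clickbait wins on raw views
print(rank_by_watch_time(videos)[0])  # the long rant wins on watch time
```

Under views the clickbait tops the list; under watch time the long conspiracy rant does, which is the dynamic the hosts describe with Alex Jones's four-hour shows.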
If that had been the YouTube algorithm, if "Lean Back" by Fat Joe was the video they played after any video you watched... If that had been what had happened, Sophia, we would live in a paradise. Climate change would have been dealt with. The president would be a being of pure light. There would be peace in Ukraine and Syria. It would be a perfect world, if only, if only YouTube's Lean Back had just been exposing people to the music video for "Lean Back." I mean, that's a chill *** jam, OK? That is a chill *** jam. There would be no Nazis in 2019 if that's the change YouTube had made. It's true. Fat Joe transcends the boundaries of country, religion, skin color, anything. You could have saved the world, YouTube, if you'd just pushed Fat Joe on a welcoming nation, into the longing arms of a nation. Yeah. *** ****, I wish that's the path things had taken. Tragically, it's not. Now, during this period after Lean Back was instituted, Guillaume Chaslot was a software engineer for Google. I'm sorry, Chaslot? Chaslot. It's spelled C-H-A-S-L-O-T. I think I'm pronouncing Guillaume right, because I found some pronunciation guides for the name Guillaume, but I have not found a pronunciation guide for Chaslot. I think he's a French guy. It's a very good name. It is a great name. I think I'm pronouncing it sort of correct, Guillaume Chaslot, but I'm doing my best here. It sounds like a stuffy bank owner that likes to get domed in the evenings. Guillaume Chaslot, for sure, stuffy bank owner. But in this case, he's actually an engineer whose expertise is in artificial intelligence, and The Guardian interviewed him for an article titled "How YouTube's Algorithm Distorts Reality." I'm going to quote from that now: "During the three years he worked at Google, he was placed for several months with a team of YouTube engineers working on the recommendation system."
"The experience led him to conclude that the priorities YouTube gives its algorithms are dangerously skewed. 'YouTube is something that looks like reality, but it is distorted to make you spend more time online,' he tells me when we meet in Berkeley, California. 'The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy.' Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away. The engineers he worked with were responsible for continuously experimenting with new formulas that would increase advertising revenue by extending the amount of time people watched videos. 'Watch time was the priority. Everything else was considered a distraction.'" So, YouTube builds this robot to decide what you're going to listen to next, and the robot's only concern is that you spend as much time as possible on YouTube. And that's the seed of all of the problems that we're going to be talking about today. So, Guillaume was fired in 2013, and Google says it's because he was bad at his job. Chaslot claims that they instead fired him because he complained about what he saw as the dangerous potential of the algorithm to radicalize people. He worried that the algorithm would lock people into filter bubbles that only reinforce their beliefs and make conservatives more conservative, liberals more liberal, and people who like watching documentaries about aliens more convinced that the Jews are fluoridating their water, etcetera. Thank you for laughing. Chill, chill, chill. Yeah. Chaslot said, "There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see. I tried to change YouTube from the inside, but it didn't work." YouTube's masters, of course, had no desire to diversify the kind of content people saw.
Why would they do that if it meant folks would spend less time on the site? So in 2015, YouTube integrated Google Brain, a machine learning program, into its algorithm. According to an engineer interviewed by The Verge, "One of the key things it does is it's able to generalize. Whereas before, if I watch a video from a comedian, our recommendations were pretty good at saying, here's another one just like it. But the Google Brain model figures out other comedians who are similar but not exactly the same, even more adjacent relationships. It's able to see patterns that are less obvious." And Google Brain is a big part of why Steven Crowder and others like him are now millionaires. It's why, if you watch a Joe Rogan video, you'll start being recommended videos by Ben Shapiro or Paul Joseph Watson, even though Joe Rogan is not an explicitly political guy and Ben Shapiro and Paul Joseph Watson are. It's why, for years, whenever conservative-inclined people would start watching, say, a Fox News clip critical of Obama, they'd wind up being shuffled gently over to Infowars and Alex Jones. It's why, if you watched a video about Obama's birth certificate, YouTube would next serve you Alex Jones claiming that Michelle Obama is secretly a man. It's why, if you watched a video criticizing gun control, YouTube would serve you up Alex Jones claiming the New World Order under Obama was going to confiscate your guns so it could carry out genocide. And it's why, if you watched coverage of the Sandy Hook massacre, YouTube would hand you Alex Jones claiming the massacre was a false flag and all the children involved were crisis actors. I bring up Alex Jones so many times in this because it's probable that no single person benefited as much from YouTube's Google Brain algorithm changes as Alex Jones. That's what Guillaume Chaslot seems to think. On February 24th, 2018, he tweeted this: "The algorithms I worked on at Google recommended Alex Jones' videos more than 15 billion times."
Jesus. To how many vulnerable people in the nation? Yeah, that's the scale of this thing. That's insane. Because it recognizes that people are gonna start, like, watching just sort of a conservative take on whatever issue: gun control, the Sandy Hook shooting, fluoride in the water, whatever. So people who might just want, like, a Fox News take on that. Alex Jones is much more extreme, but because he's much more extreme, he's, like, compelling to those people. And if you serve them him, they watch his stuff all the way through, and his videos are really, really long. It is, like, a four-hour show. So people stay on the site a long time. They get served up a four-hour Alex Jones video, they just keep playing it while they're doing whatever they're doing, and they sink deeper and deeper into that rabbit hole. And a regular person would look at this and be like, oh, Google's taking people who believe, I don't know, that a flat tax is a good idea and turning them into people who think that fluoride is turning frogs gay and that Sandy Hook was an inside job, and that's a bad thing. But YouTube's algorithm didn't think that way. It just thought, like, oh, as soon as these people find Alex Jones videos, they spend 50 percent more of their time on YouTube, so I'm just going to serve Alex Jones up to as many ******* people as I possibly can. And that's what starts happening in 2013, 2014. So that's where we are in the story right now, and then we're going to continue from that point. But you know what it is time for next. Sophia, no, tell me. It's time for products and services. Maybe. Maybe. I'm not gonna make promises. I'm not gonna write checks my *** can't cash here. But maybe there's services. You're asking: cash? Well, my *** is all about products. I hope it's a chair company that comes up next. Otherwise, that's a non sequitur. I hope it's a squatty potty. It's probably going to be **** pills, because we just signed a great deal with **** Pills.
I'm very, very proud of our **** pills sponsorship. It's not even our job. I love, I love selling **** pills. I can see you're hard right now. I can just see your head, the head of your body, not your penis head. But I can tell you're hard from the pills. Thank you. You have a very taut **** energy, you know. Thank you. TDE is what this show aims to present to the world. You know, speaking on the subject of YouTube, when we filled out our ad things, I said I won't sell brain pills, because I don't want to be like Paul Joseph Watson or Ben Shapiro, but I will 100 percent sell **** pills. And it's mainly so that I can say the phrase "**** pills" over and over again. So meet my son, **** Pills Evans. **** Pills Evans. I am, I am gonna have a son just to name him **** Pills. It's going to be like "A Boy Named Sue," but with a boy named **** Pills. And instead of, like, me explaining to him that I gave him the name Sue so that it would harden him up and he'd become, like, a tough person and could survive the rough world, I'd say, oh no, I got paid a lot of money to call you **** Pills. No, you're just sponsored by **** Pills. You're just a human ad by **** Pills. This has gone very off the rails. Sophie, is this a good idea? No? She, no, she's giving a hard no. A hard no? OK, well, speaking of hard products: Mint Mobile offers premium wireless starting at just 15 bucks a month. And now for the plot twist. Nope, there isn't one. Mint Mobile just has premium wireless from 15 bucks a month. There's no trapping you into a two-year contract. There's no opening the bill to find all these nutty fees. There's no luring you in with free subscriptions or streaming services that you'll forget to cancel and then be charged full price for. None of that. For anyone who hates their phone bill, Mint Mobile offers premium wireless for just $15 a month.
Mint Mobile will give you the best rate whether you're buying one line or for a family, and at Mint, families start at two lines. All plans come with unlimited talk and text, plus high-speed data delivered on the nation's largest 5G network. You can use your own phone with any Mint Mobile plan and keep your same phone number along with all your existing contacts. Just switch to Mint Mobile and get premium wireless service starting at 15 bucks a month. Get premium wireless service from just $15 a month, and no unexpected plot twists, at mintmobile.com/behind. That's mintmobile.com/behind. Seriously, you'll make your wallet very happy at mintmobile.com/behind. This fall on Revisionist History: Is there anything that we haven't talked about, or I should have asked you, or you'd like to add that seems relevant? You should have asked me why I'm missing fingers on my left hand. A story about sacrifice. I think his suffering drove him to try to alleviate suffering. And the shocking discovery I made where I faced the consequences of writing a book I thought would help people. Isn't that funny? It's not funny at all. It's depressing. Very depressing. Revisionist History is back with more. Listen to Revisionist History on the iHeartRadio app or wherever you get your podcasts. I've never seen less enthusiasm for a great idea in my life. Hey, it's Rick Schwartz, one of your hosts for San Diego Zoo's Amazing Wildlife podcast. In this special episode, we sit down with Dr. Jane Goodall to hear her inspiring thoughts on how we can create a better future for humans, animals and the environment. Get anybody, particularly young children, out into nature so that they can experience it, and take time off from this virtual world of being always on your cell phones and so on, and get the feel of nature, so that you come to be fascinated, then you come to want to understand it, and then you come to love it, and at that point you want to protect it.
And then we'll come to the sort of healthy world that I envision as a good future for us and the rest of life on this planet. Listen to Amazing Wildlife on the iHeartRadio app or wherever you get your podcasts. We're back. We're back, and Sophia just said the sentence, "We got to mold our own genitals at the Doc Johnson factory." Or Dog Johnson factory? Doc. Doc Johnson. I loved that sentence, which is why I brought us back in mid-conversation from the ad break, because that's a wonderful sentence. I have that Instagram story saved on my Instagram. I want to get that sentence tattooed on my back, like where some people will have Jesus: "I got my genitals molded at the Doc Johnson factory." Yeah, and it was the most fun ever. That sounds great. It was cool. That sounds so much better than YouTube's algorithm. That's a really smooth transition. Thank you. Like jazz. ******* saxophone smooth. I am as good at transitions as **** pills are at getting **** **** ****. Exactly. Exactly. **** yeah. Hims. Good times. OK. So, you know, one of the big sources for this podcast, and one of the big sources for the articles that have covered the problems with YouTube's algorithm, is Guillaume Chaslot. And he's not just a former employee with an axe to grind, or someone who feels guilty about the work he participated in. For years now, he has turned into something of an activist against what he sees as the harms of his former employer. And obviously, as a guy with potentially an axe to grind, he's someone that you've got to approach a little bit critically. But Chaslot hasn't just, like, complained about Google. He has a team of people that have built systems in order to test the way Google's algorithm works and show the way that it picks new content, and they document it with hard numbers: like, here's the kind of things that it's serving up, here's the sort of videos that it recommends people towards, like, here's how often it's doing them.
So he's not just making claims. He has reams and reams of documentation on how Google's algorithm works behind him. He's really put a lot of work into this, and from everything I can tell, he's someone who's deeply concerned about the impact YouTube's algorithm has had on our democracy, and someone who's trying to do something about it. So, just digging into the guy a bit, I have a lot of respect for what he's trying to do. On November 27th, 2016, shortly after the election, while we were all drinking heavily, Guillaume Chaslot published a Medium post titled "YouTube's AI was divisive in the US presidential election." In it, he included the results of a study he and a team of researchers conducted. They were essentially trying to measure which candidate was recommended the most by YouTube's AI during the presidential election, and the code that they used to do this, and all of the methodology behind it, is available on the website. If you're someone who knows how to do the coding, you can check it all out. They're very transparent. He says, quote: "Surprisingly, a 'Clinton' search on the eve of the election led to mostly anti-Clinton videos." The pro-Clinton videos were viewed many times and had high ratings, but represented less than 20 percent of all recommended videos. Chaslot's research found that the vast majority of political videos recommended by YouTube were anti-Clinton and pro-Trump, because those videos got the best engagement. Now, Chaslot explained that because Google Brain was optimized to maximize the time users spent on site, or engagement, it's also happy to route people to content that, say, proposes the existence of a flat Earth, because those videos improve engagement too. Guillaume found that searching "is the earth flat or round?" and following Google's recommendations sent users to flat Earth conspiracy videos more than 90 percent of the time.
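Chaslot's actual crawler code is public, as noted above; the gist of the method can be hedged into a few lines. This toy Python sketch (the recommendation graph and video titles are entirely invented; only the general approach, following top recommendations from a seed query and tallying the results, resembles his methodology):

```python
# Toy simulation of a recommendation crawl: start from a seed video,
# repeatedly follow the top recommendation, and count where you land.
from collections import Counter

# Fake recommendation graph: video -> ordered list of recommended videos.
recs = {
    "is the earth flat or round?": ["flat earth PROOF", "NASA lies exposed"],
    "flat earth PROOF":            ["NASA lies exposed", "globe debunked"],
    "NASA lies exposed":           ["globe debunked", "flat earth PROOF"],
    "globe debunked":              ["flat earth PROOF", "NASA lies exposed"],
}

def walk(seed, steps):
    """Follow the #1 recommendation `steps` times, tallying each landing."""
    seen = Counter()
    current = seed
    for _ in range(steps):
        current = recs[current][0]   # always take the top recommendation
        seen[current] += 1
    return seen

tally = walk("is the earth flat or round?", 5)
print(tally)  # every landing in this toy graph is a conspiracy video
```

In this made-up graph, one neutral question routes the walker into a closed loop of conspiracy videos, which is the shape of the 90 percent finding described above.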
So if you're wondering why flat Earth has taken off as a conspiracy, it's because simply asking the question "is the earth flat or round" 90% of the time leads you to videos that say it's flat, homie. That's how come all those basketball players think the Earth is flat. Exactly right. And you can see in your head how that change happens. It's like, some guy's having a conversation with a friend who is kind of dumb and is like, no dude, you know the earth is flat, and you're like, what? That's ********. And you type "is the earth flat" into YouTube, and then it serves you up a four-hour documentary about how the Earth is flat. Yeah, your first mistake is typing it into YouTube. It's probably not the place you want to get that answer, yeah. No, but it's not like schools in America teach people critical thinking or how to functionally do research. It's like going to Yahoo Answers to be like, am I pregnant? Which happens all the time. The answer is yes. If you are asking Yahoo whether or not you're pregnant, you are in fact pregnant for sure, probably second or third trimester. You should at least stop smoking for a while until you find out for sure. Maybe put down the bottle for a hot second. Yeah. Now, further reporting using additional sources from within Google seems to support most of Chaslot's main contentions. In fact, it suggests that he, if anything, understated the problem. Chaslot left YouTube in 2012, and while he knew about Google Brain, he did not know about a new AI called Reinforce that Google instituted, I think in 2015. Its existence was revealed by a New York Times article published just a few days before I wrote this, "The Making of a YouTube Radical." That article claims that Reinforce focused on a new kind of machine learning called reinforcement learning. The new AI, known as Reinforce, was a kind of long-term addiction machine.
It was designed to maximize users' engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one video, but many more. Reinforce was a huge success. In a talk at an AI conference in February, Minmin Chen, a Google Brain researcher, said it was YouTube's most successful launch in two years. Sitewide views increased by nearly 1%, she said, a gain that, at YouTube's scale, could amount to a million more hours of daily watch time and a million more dollars in advertising revenue per year. She added that the new algorithm was already starting to alter users' behavior. "We can really lead users toward a different state, versus recommending content that is familiar," Ms. Chen said. It's another example of, like, if you take that quote out of context and just read it back to her and say, Ms. Chen, this sounds incredibly sinister when you're talking about leading people toward a different state. Excuse me, ma'am, are you in fact a villain? A supervillain? You sound like a supervillain. Sounds like this might be evil. Is this a James Bond movie? Yeah. It's that moment that no one ever has in the tech industry, the are-we-the-baddies moment, where we're like, oh, we're addicting people to our service. Is that maybe bad ****? Are we the Nazis? Damn, this whole time I thought we were the Americans. Nope, yeah. Now, YouTube claims that Reinforce is a good thing, fighting YouTube's bias towards popular content and allowing them to provide more accurate recommendations. But Reinforce once again presented an opportunity for online extremists. They quickly learned that they could throw together videos about left-wing bias in movies or video games, and YouTube would recommend those videos to people who were just looking for normal videos about those subjects. As a result, extremists were able to red-pill viewers by hiding rants about the evils of feminism and immigration in reviews of Star Wars.
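The core dynamic of an engagement-optimizing recommender like the one described above can be sketched with a toy epsilon-greedy bandit: whichever content category keeps people watching longest ends up recommended the most. This is a simplified illustration, not Google's actual system; the categories and watch-time numbers are invented, and real systems like Reinforce use far more sophisticated reinforcement learning:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

# Invented average watch time (minutes) per content category. The
# "extreme" category is made stickiest, mirroring the engagement gap
# the episode describes.
TRUE_WATCH_TIME = {"news": 4.0, "gaming": 6.0, "extreme": 11.0}

def run_bandit(steps: int = 2000, eps: float = 0.1) -> str:
    """Recommend categories, learning only from observed watch time.

    Returns the category that ended up recommended most often.
    """
    totals = {c: 0.0 for c in TRUE_WATCH_TIME}  # accumulated watch time
    counts = {c: 0 for c in TRUE_WATCH_TIME}    # times each was recommended
    for _ in range(steps):
        if random.random() < eps or not any(counts.values()):
            choice = random.choice(list(TRUE_WATCH_TIME))  # explore
        else:
            # exploit: pick whichever category has the highest average
            # watch time observed so far
            choice = max(totals, key=lambda c: totals[c] / max(counts[c], 1))
        watch_time = random.gauss(TRUE_WATCH_TIME[choice], 1.0)
        totals[choice] += watch_time
        counts[choice] += 1
    return max(counts, key=counts.get)
```

Run it and the "extreme" category dominates the recommendations, even though nothing in the code knows or cares what the content is; it only sees that this content holds attention longest. That indifference is the whole problem the episode is describing.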
In far-right lingo, red-pilling refers to the first moment that sets someone off on their journey towards embracing Nazism. So prior to Reinforce, if you were looking up, I want to see gameplay videos of Call of Duty, or I want to see a review of Star Wars: The Force Awakens, it would just take you to reviews and gameplay videos. Now it would also take you to somebody talking about, like, how Star Wars is part of the social justice warrior agenda, or how Star Wars embraces white genocide, or something like that. And it'll recommend that to millions of people, and most of them will be like, what the **** is this ********? But a few thousand of them will be like, oh my God, this guy's right. Like, Star Wars is part of a conspiracy to destroy white men. And then they'll click on the next video that Stefan Molyneux puts out, or they'll go deeper down that rabbit hole, and that's how this starts happening. Star Wars is a conspiracy, though. To take your ******* money. That's all it is. Just take your money. It's like any other conspiracy that involves movies. The only goal is to take your money. Yeah, not to destroy white people. They want white people, because white people spend the most money on Star Wars. Yeah. If they killed that, that's the number one customer. That's killing your whole customer base. They want you to breed, like how they want teenagers to start smoking. It's like, yeah, you need to replenish the flocks. Yeah, yeah. You want people to start smoking in their 20s so they have children who grow up watching dad smoke. Yes. That's the plan. Yes. Yeah. They want kids like me to grow up, who every now and then will buy a pack of cigarettes just to smell the open pack, because it takes me back to moments in my childhood. It's such a soothing smell, unsmoked cigarettes. A little bit sweet. A little bit fruity. The filter?
Yeah, this is going to trigger somebody to buy cigarettes right now. Someone's pulling over to a 7-Eleven. Yeah, and I feel terrible about that. And they're like, also, I just bought **** pills. They're like, I don't know what's happening to me. Buy * **** pills. *******. It's good for your health. It's good for your heart. ******* is all benefits. Cigarettes are almost all downsides, other than the wonderful smell of a freshly opened pack, and looking really ******* cool. Yeah, well, they do make you look incredibly cool. Unbelievably cool. Nothing looks cooler. Something does: smoking a joint looks cooler. You're right, smoking a joint does look cooler. And the coolest thing of all? Smoking a joint on a unicycle. On a yacht. Wow, you just took it to another level. I just would want to see how good your balance is, to be able to ride, like, on a yacht. One of our many millionaire listeners is going to message me tomorrow being like, my husband tried to smoke a joint while riding a unicycle on our yacht, and now he's dead. You killed the love of my life. Or we'll get some dope fan art of you on a unicycle on a yacht, burning a fat one. Speaking of fat ones, the New York Times interviewed a young man who was identified in their article on radicalization as Mr. Cain, and Mr. Cain claims that he was sucked into one of these far-right YouTube rabbit holes thanks to YouTube's algorithm. He is scarred by his experience of being radicalized by what he calls a decentralized cult of far-right YouTube personalities, who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate IQ differences explained racial disparities, and that feminism was a dangerous ideology. "I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging," he said. "I was brainwashed."
"There's a spectrum on YouTube between the calm section, the Walter Cronkite, Carl Sagan part, and crazy town, where the extreme stuff is," said Tristan Harris, a former design ethicist at Google, YouTube's parent company. "If I'm YouTube and I want you to watch more, I'm always going to steer you toward crazy town." Umm. And I will say, I'm very hard on the tech industry regularly on this podcast, but it speaks well of a lot of engineers that the most vocal people trying to fight YouTube's algorithm are former Google engineers who realized what the company was doing, stepped away, and have been hammering it ever since, being like, we made a Nazi engine, guys. Like, we weren't trying to, but we made a Nazi engine and we have to deal with that. Really gotta ring the alarm on this one. Yeah, gotta really ring the alarm on this one. You know, I used to work at Google. I worked at Google for two years. I didn't know that. Yeah, what did you do? My job title won't explain what I did, but basically it was, like, a quality job. It has nothing to do with anything, but basically I got to, in Russian, help train, well, not build, a binary engine, so it could tell whether something is a restricted category or not. Like whether something is **** or not, gambling or not, that kind of stuff. So yeah, it was crazy. Well, that sounds different, yeah. I saw some of the most ****** ** stuff on the Internet. You know, like, I've reported child **** before. People will have a lot to say about this latter part, because we do talk about content moderators for a little bit, so I'm assuming you'll be asked a couple of questions about that at the end. Yeah, yeah. Yeah.
Now, that New York Times article, in full disclosure, actually cites me, because of a study that I published with the research collective Bellingcat last year, where I trawled through hundreds and hundreds of leaked conversations between fascist activists and found 75 self-reported stories of how these people got red-pilled. In that study I found that 34 of the 75 people I looked at cited YouTube videos as the things that red-pilled them. I'm not the only source on this, though. The New York Times also cited a research report published by a European research group called VOX-Pol. They conducted an analysis of 30,000 Twitter accounts affiliated with the far right, and they found that those accounts linked to YouTube videos more than they linked to anything else. So there's a lot of evidence that YouTube is the primary reason why, if you look at people who were researching the KKK and neo-Nazis in America in 2004, 2005, 2006, a big gathering would be 20 people, and then in 2017, four or five hundred of them, however many it was, showed up at Charlottesville. Like, there's a reason their numbers increased so much over a pretty short period of time, and it's because these videos made more of them, and there's a lot of evidence of that. So while Google is raking in more and more cash and increasing time spent on site, they're also increasing the number of people who think Hitler did nothing wrong. And that's the tale of today. So, Mr. Cain, the New York Times source for that article, claims his journey started in 2014, when YouTube recommended a self-help video by Stefan Molyneux. Molyneux would be a great candidate for an episode of this podcast, but in short, he's a far-right YouTube philosopher and self-help guru who advises his listeners to cut all ties with their families. He runs a community called Freedomain Radio that some people accuse of being a cult, that, you know, tells people to cut off contact with their families.
What club is going to be like, hey, please join us, but also never speak to anyone you love ever again? Yeah, yeah. Never talk to your mom again. Like, that's not how a cool club starts, you know? That's never how a cool club starts. That's always bad news. Yeah, yeah. Cool clubs say never talk to the cops again. Which cool clubs do say. Absolutely. Now, Molyneux has been on YouTube since forever, but his content has radicalized sharply over the years. At the beginning, he identified as an anarcho-capitalist, and he mostly focused on his ideas about how everyone was bad at being parents and people should cut ties with toxic family members. Bro, just call your dad. Call your dad, bro. You guys probably just need to have a convo, talk some feelings out. Maybe you'll calm the **** down. I don't know. Like, I don't want to say... there actually are a lot of people with toxic family members that they do need to cut out of their lives, which I think is part of why Molyneux was able to get a following. Like, there's not nothing in what he's saying. There are a lot of people who have ****** ** family backgrounds and who get told, well, you just need to make things right with your mom. And it's like, no, if your mom, like, sent you to gay conversion therapy, maybe cut all ties with her forever. I totally agree. No, no, no, I'm not trying to say that. What I'm trying to say is that, to pursue a life where you tell people to cut off contact with their family, you clearly have unresolved issues with your own family. And if you resolve those by, say, calling your parents and talking to them... I'm not saying you have to make up with them. I'm saying somehow get closure for yourself, so then you don't spend the rest of your life trying to get people to quit their families. Yeah. You got some **** to deal with, bro. But you know, Molyneux didn't stay on that sort of thing.
Like, he made a switch over to pretty ******** nationalism, particularly in the last two years. There's a video of him where he's in Poland during a far-right march to commemorate Poland's Independence Day, and he, like, starts crying and has this big realization of, I've been against nationalism for years and I realize it can really be beautiful. And the unsaid thing is, I realize that white nationalism can be beautiful, and that instead of being an independent libertarian type, I'm going to focus on fighting for my people, which is, like, white people, and stuff like that. That's who Stefan Molyneux is now. He's essentially a neo-Nazi philosopher at this point, and he spends most of his time talking about race and IQ and, you know, talking about how black people are not as good as white people. That's the thrust of the modern-day Stefan Molyneux. He also believes global warming is a hoax, so maybe nobody should have much respect for Molyneux's own IQ. But a lot of people get turned on to Stefan's straight-up fascist propaganda because of their interest in Joe Rogan. Rogan has had Stefan on as a guest several times, and YouTube has decided that people who like Rogan should have Stefan's channel recommended to them. This may be why Mr. Cain saw Molyneux pop into his recommendations, which is what he credits as radicalizing him in 2014. So yeah, he wound up watching a lot of members of what some people call the intellectual dark web: Joe Rogan, Dave Rubin, guys like Steven Crowder, and of course Stefan Molyneux. And over time he went further and further and further to the right, until eventually he started watching videos by Lauren Southern, a Canadian far-right activist. He called her his fascist crush, like his fashy bae. So, like, by 2016...
This guy, who started out watching Joe Rogan and got turned on to Stefan Molyneux videos about global warming as a hoax and IQ and race, is by 2016 identifying a YouTube Nazi as his fascist crush. Like, that's how this proceeds for this dude, and that's a pretty standard path. But you know what's not a standard path? The path that our listeners will blaze if they buy the products and services that we advertise on this program. You seem like your breath has been taken away by the skill and ingenuity of that transition. Truly, there was nothing I could add. It was perfect, perfect work. I'm the best at this. I'm the best around. Nothing's gonna ever keep me down. Put a bumper sticker on a Rolls-Royce, you know what I'm saying? The bumper sticker is gonna say, I got my genitals molded at Doc Johnson. I want that bumper sticker. It's actually a hologram, and when you look at it one way, I am wearing a skirt, and when you look at it the other way, you see my vagina mold. It's really cool, man. I put a lot of thought into it. That's quite a bumper sticker. And you know, I have thought for a long time that what traffic is missing is explicitly pornographic bumper stickers. Like, if truck nuts are OK, why isn't that? Seriously, it's actually a lot more pleasant to look at than truck nuts. Yes, yes, nobody actually likes truck nuts. No one. Alright. Well, this has been a long digression. Yeah, let's do products. Mint Mobile offers premium wireless starting at just 15 bucks a month. And now for the plot twist. Nope, there isn't one. Mint Mobile just has premium wireless from 15 bucks a month. There's no trapping you into a two-year contract, no opening the bill to find a bunch of surprise fees. There's no luring you in with free subscriptions or streaming services that you'll forget to cancel and then be charged full price for. None of that.
For anyone who hates their phone bill, Mint Mobile offers premium wireless for just $15 a month. Mint Mobile will give you the best rate whether you're buying for one or for a family, and Mint family plans start at two lines. All plans come with unlimited talk and text, plus high-speed data delivered on the nation's largest 5G network. You can use your own phone with any Mint Mobile plan and keep your same phone number along with all your existing contacts. Just switch to Mint Mobile and get premium wireless service starting at 15 bucks a month. Get premium wireless service from just $15 a month, and no unexpected plot twists, at mintmobile.com/behind. That's mintmobile.com/behind. Seriously, you'll make your wallet very happy at mintmobile.com/behind. This fall on Revisionist History: is there anything that we haven't talked about, or I should have asked you, or you'd like to add that seems relevant? You should have asked me why I'm missing fingers on my left hand. A story about sacrifice. I think his suffering drove him to try to alleviate suffering. And the shocking discovery I made where I faced the consequences of writing a book I thought would help people. Isn't that funny? It's not funny at all. It's depressing. Very depressing. Revisionist History is back with more. Listen to Revisionist History on the iHeartRadio app or wherever you get your podcasts. I've never seen less enthusiasm for a great idea in my life. Hey, it's Rick Schwartz, one of your hosts for San Diego Zoo's Amazing Wildlife podcast. In this special episode, we sit down with Dr. Jane Goodall to hear her inspiring thoughts on how we can create a better future for humans, animals, and the environment. If we don't help them find ways of making a living without destroying the environment, we can't save chimps, forests, or anything else. And that becomes very clear when you look at poverty around the world. If you're living in poverty, you can't afford to ask, as we can:
Did this product harm the environment? Was it cruel to animals? Was it factory farmed? Is it cheap because of unfair wages paid to people? And so alleviating poverty is tremendously important. Listen to Amazing Wildlife on the iHeartRadio app or wherever you get your podcasts. We're back. Boy howdy, what a day we've had today. So at this point, YouTube's role in radicalizing a whole generation of fascists is very well documented. But YouTube is sort of stuck when it comes to admitting that they've ever done anything wrong. 70% of their traffic comes from their recommendation engine. It is the single thing that drives the platform's profitability more than anything else. Back in March, the New York Times interviewed Neal Mohan, YouTube's chief product officer. His responses were pretty characteristic of what the company says when confronted about their little Nazi issue. The interviewer asked: "I hear a lot about the rabbit hole effect, where you start watching one video and you get nudged with recommendations towards a slightly more extreme video and so on, and all of a sudden you're watching something really extreme. Is that a real phenomenon?" To which Neal responded: "Yeah, so I've heard this before, and I think that there are some myths that go into that description that I think it would be useful for me to debunk. The first is this notion that it's somehow in our interest for the recommendations to shift people in this direction because it boosts watch time or what have you. I can say categorically that's not the way our recommendation systems are designed. Watch time is one signal that they use, but they have a number of other engagement and satisfaction signals from the user. It is not the case that 'extreme' content drives a higher version of engagement or watch time than content of other types." So he basically has a blanket denial there. Yeah, that's a huge blanket: no, we don't do that, that doesn't happen. Doesn't happen. And he goes on.
It's a little bit of a rambling answer, and later in it, Mohan called the idea of a YouTube radicalization rabbit hole "purely a myth." The interviewer, to his credit, presses Neal Mohan on this a bit more later, and asks if he's really sure he wants to make that claim. Mohan responds: "What I'm saying is that when a video is watched, you will see a number of videos that are then recommended. Some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. There are other videos that skew in the opposite direction. And again, our systems are not doing this, because that's not a signal that feeds into the recommendations. That's just the observation that you see in the panel. I'm not saying that a user couldn't click on one of those videos that are, quote-unquote, more extreme, consume that, and then get another set of recommendations that sort of keep moving in one path or the other. All I'm saying is that it's not inevitable." So because everybody doesn't choose to watch more extreme videos, there's no YouTube radicalization rabbit hole. Yeah, and he's also kind of acknowledging there that it does happen. Yeah. Nothing is inevitable, I mean, except for, like, death and whatever, you know. It's like saying, a meteorite could hit your house before you get to click on the video that turns you into a Nazi, so of course it's not inevitable. It's not 100% true for 100% of our users. That's not a good answer. Some percentage of our users will have heart attacks before the next video plays. Yeah, a pretty high percentage of people. That's not what we're asking, Neal. Now, the reality, of course, is that Neal Mohan is, shall we say, not entirely honest. I think I wrote "a damn liar" in the original draft, but I'm not sure where the legally actionable line is. In the pocket of Big Video. Yeah, in the pocket of Big Video.
For just one example: Jonathan Albright, a Columbia University researcher, recently carried out a test where he seeded a YouTube account with a search for the phrase "crisis actor." The "up next" recommendations led him to 9,000 different videos promoting crisis-actor conspiracy theories. So again, someone who heard the term and wanted to search for factual information about the conspiracy theory would be directed by YouTube to hundreds of hours of conspiratorial nonsense about how the Sandy Hook shooting was fake. Now, I'm going to guess you remember last year's mass shooting at Marjory Stoneman Douglas High School. By the Wednesday after that shooting, less than a week after all of those kids died, the number one trending video on YouTube was "David Hogg The Actor," which is, obviously, a video accusing one of the kids who's been most prominent of being a crisis actor. According to a report from Ad Age, it and many others claimed to expose Hogg as a crisis actor. YouTube eventually removed that particular video, but not before it amassed nearly 200,000 views. Other videos targeting Hogg remained up. One that appears to show Hogg struggling with his words during an interview after the shooting suggests it's because he forgot his lines. YouTube auto-suggested certain search terms that would lead people directly to the clips: if a person typed "David Hogg" into YouTube's search bar midday Wednesday, for example, some of the suggestions would include "exposed" and "crisis actor." When reporters asked YouTube how that video made it to the top of their coveted trending chart, YouTube explained that since the video included edited clips from a CNN report, its algorithm had believed it was a legitimate piece of journalism and allowed it to spread as an authoritative news report would. So again, that's their justification: we couldn't have known that this was fake news, because it was fake news that used clips from a legitimate news site.
So, like, we're clearly not at fault here for the fact that we let a robot select all these things, and no human being watched the top trending video on the site at the moment to see if it was something terrible. That's ********, yeah. Yeah, that's total ********. Now, Nazi propaganda and conspiracy theories aren't the only things that spread like wildfire on YouTube. Of course, pedophilia is also a big thing on the site. Yeah, this is where we get to that part of the story. So this broke in February of 2019, when a YouTuber named Matt Watson put together a video exposing how rings of pedophiles had infested the comment sections of various videos featuring small children and used them to communicate and trade child ****. Now, this report went very viral and immediately prompted several major advertisers to pull their money from YouTube. The company released a statement to their worried advertisers informing them that they had blanket-banned comments on millions of videos, basically removing comments from any videos uploaded by or about young children. I'd like to quote from NPR's report on Watson's video: "Watson describes how he says the pedophile ring works. YouTube visitors gather on videos of young girls doing innocuous things, such as putting on their makeup, demonstrating gymnastics moves, or playing Twister. In the comments section, people would then post time stamps that link to frames in the video that appeared to sexualize the children. YouTube's algorithms would then recommend other videos also frequented by pedophiles. 'Once you enter into this wormhole, there is no other content available,' Watson said." So, it might seem at first like this is purely an accident on YouTube's part, like cunning pedophiles figured out that they could just find videos of young kids doing handstands and stuff and use that as **** and trade it with each other, right?
Which would not necessarily be a how-could-we-have-predicted-this situation; it's just these people deciding to use innocent videos for a nefarious purpose. But that's not what happened. Or at least, that's not all of what happened. So in June, three researchers from Harvard's Berkman Klein Center for Internet and Society started combing through YouTube's recommendations for sexually themed videos. They found that starting down this rabbit hole led them, inevitably, to sexual videos that placed greater emphasis on youth. So again, that's maybe not super surprising. You start looking for sexy videos, you click on one, and then in the next video the woman in it is going to be a younger woman, and a younger woman, and a younger woman. But then at a certain point, the videos suggested flipped very suddenly, until, and I'm going to quote the researchers here, YouTube would suddenly begin recommending videos of young and partially clothed children. So YouTube would take a person who's just looking for videos of adults, like videos of an exotic dancer dancing or whatever, like videos of attractive young women dancing, and then YouTube would start showing them videos of children doing, like, gymnastics routines and stuff. That's the algorithm being like, I bet you'll like child ****. Like, that's literally what's happening here, which I didn't realize when I first heard the story. Like, that's YouTube. That's not just pedophiles using YouTube in a sleazy way, because pedophiles will always find a way to ruin anything. That's YouTube crafting new pedophiles. Yeah, it's a system that's essentially training you. Yeah. I wonder if it's like that with violence, too. If you look up a violent thing, if it keeps recommending more violence. Because that seems like... and hate. Like, that would come up when I worked for Google. The sensitive categories, the restricted categories, are, you know, violence, hate, gambling, ****, child ****.
I think there's even a messed-up thing about that, because one of the problems that people who document war crimes in Syria have had is YouTube blanket-banning their videos because of violence. And then, like, you have evidence of a war crime, and then it's wiped off of the Internet forever, because YouTube doesn't realize that this isn't, like, violence ****; this is somebody trying to document a war crime. It's made it really hard to do that kind of research. Yeah, their response is always so terrible. Anyway, the New York Times reported, quote: "So, a user who watches ****** videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children's clothes. Eventually, some users might be presented with videos of girls as young as five or six wearing bathing suits, or getting dressed, or doing a split." So yeah, in its eternal quest to increase time spent on site, YouTube's algorithm essentially radicalized people toward pedophilia. And to make matters worse, it wasn't just picking sexy videos that people had uploaded with the intent of them being sexy, because it was sending children's videos to people. It started grabbing totally normal home videos of little kids and presenting those videos to ***** adults who were on YouTube to masturbate. The report suggests it was learning from users who sought out revealing or suggestive images of children. One parent the Times talked with related, in horror, that a video of her 10-year-old girl wearing a bathing suit had reached 400,000 views. So, like, parents start to realize, wait a minute, I uploaded this video to show her grandma; there are supposed to be, like, nine views on this thing. Why have 400,000 people watched this video of my 10-year-old? And it's because YouTube is trying to provide people with ****, because it knows that it'll keep them on the site longer. That's ******* wild, yeah.
After this report came out, YouTube published an apologetic blog post promising that "responsibility is our number one priority, and chief among our areas of focus is protecting minors and families." But of course that's not true. Increasing the amount of time spent on site is YouTube's chief priority. Or rather, making money is YouTube's chief priority, and if increasing the amount of time spent on site is the best way to make money, then YouTube will prioritize that over all other things, including the safety of children. Now, there are ways YouTube could reduce the danger their site presents to the world, ways they could catch stuff like propaganda accusing a mass shooting victim of being an actor, or people's home movies being accidentally turned into child ****, even if they're not going to stop hosting literal fascist propaganda: content moderators could add human eyes and human oversight to an AI algorithm that is clearly sociopathic. And earlier this year, YouTube did announce that they were expanding their content moderation team to 10,000 people. Which sounds great, sounds like a huge number of people. Only it's not as good as it seems. The Wall Street Journal investigated and found out that a huge number of these moderators, perhaps the majority, worked in cubicle farms in India and the Philippines. Which would be fine if they were moderating content posted from India or the Philippines, but of course these people were also going to be tasked with monitoring American political content. Now, Alphabet, née Google, does not disclose how much money YouTube makes, but estimates suggest that it's around $10 billion a year, and maybe increasing by as much as 40% per year. Math is not my strong suit. I'm not an algorithm.
But I did a little bit of math, and I calculated that if Google took a billion dollars of their profit and hired new content moderators, paying them $50,000 a year salaries, which I'm going to guess is more than most of these moderators get, they could afford to hire 20,000 new moderators, tripling their current capacity. Realistically, they could hire 50 or 60,000 more moderators and still be making billions of dollars a year on one of the most profitable services on the Internet. But doing that would mean less profit for Google shareholders. It would mean less money for people like Neal Mohan, the man who has been YouTube's chief product officer since 2011. The man who has overseen nearly all the algorithmic changes we are talking about today. The man who sat down with the New York Times and denied YouTube had a problem with leading people down rabbit holes that radicalized them in dangerous ways. I was kind of curious as to how well compensated Mr. Mohan is, so I Googled Neal Mohan's net worth. The first result was a Business Insider article: Google paid this man $100 million. Here's his story. Oh, that's cool. Yeah. Oof. And I can tell you from being a moderator. I worked on a team where everybody did what I did in a different language. So I did this in Russian, and next to me was someone who was doing it in Chinese, and Turkish, and all of the languages. I mean, not all, but a significant number. Yeah. And I can tell you that we were hired as contractors for only a year. Very rarely would you ever be doing a second year, because they didn't want to pay you the full benefits. Like, you know, you don't get health insurance and, whatever, all the perks that you would get from being a full time Google employee. And the thing about what we did is you got exposed to a lot of ****** ** stuff.
Like, you know, the videos and stuff that I've seen are like some of the worst the Internet has to offer, like beheadings, or someone stomping a kitten to death in high heels, like crazy ****. And it would really make you sick, and they give you free food at Google, and you wouldn't be able to eat sometimes because you would be so grossed out. And that's why you're only there for a year. Not just that you wouldn't be able to get full benefits, but also because they are OK with wasting your mental and physical energies and then letting you go and just cycling through new people every year. Rather than investing, you know, in employees that are full time, making sure they have, you know, access to mental health care and stuff like that, and, you know, making that job be something that they take more seriously, considering how important it is. Well, and that's part of what's really messed up is that, like, it's ******* Google. Like, the people who are actually coding these algorithms and stuff, I guarantee you those people have on site therapists they can visit, they have gyms at work, they get their lunches. And we all worked in the same building. But, like, you know, I couldn't go get a free massage. It's like, yeah, you know, you have a CrossFit trainer on site and **** like that, for sure. You get incredible perks. And what I thought was kind of ironic about what we were saying is, like, the whole thing is to try to make you stay on YouTube. But when you work for a company like Google, their job is to try to make you stay at Google. So, you know, the reason you're getting all these benefits and stuff, like free food and gym and massage or whatever, is because they want you to stay and work forever. But they didn't want that for the moderators, which seems messed up to me. Exactly.
And that's a very telling thing from Google's perspective, because they are saying: the people who are coding these algorithms that increase the amount of time people spend on site, those people are important to us, and so we will do whatever it takes to retain them. But the people who make sure that we aren't creating new pedophiles while we make money, the people who are responsible for making sure that Nazi propaganda isn't served up to, like, impressionable young children via our service, those people aren't valuable to us, because we don't care about that, so we're not going to offer them healthcare. Like, if Google really was an ethical company, and if YouTube cared about its impact on the world. There's nothing less technical or less valuable about what you're doing. Being able to speak another language fluently, being able to understand if content propagating on their site is toxic or not, that's a very difficult, very technical task. If they cared about the impact they had on the world, the people doing that job would be well paid and would have benefits and would be seen as a crucial aspect of the company. But instead, it's sort of like, if we don't have someone doing this job, we'll get yelled at. So we're going to do the minimum necessary, and we're going to have most of the people doing that job working at a ******* cube farm in India, even though we're expecting them to moderate American content and to understand all of our cultural nuances and whether or not something is toxic. Like, that's so ****** **. And also, consider the fact that ads are the reason that they hire content moderators, not because they care about the content necessarily. It's that it would be a huge mistake if, say, an ad for Huggies was served on a diaper fetish website.
You know, they want something in place where the algorithm knows not to serve that, even though it seems like a good match because the word diapers is repeated and blah blah blah, you know what I mean? So it's really less... well, it's more about keeping the advertisers happy and making the most money than it is about ensuring that the Internet is a less ****** ** place. This gets to one of the things, like, when I get in arguments with people about the nature of capitalism and what's wrong with the kind of capitalism that we have in this country. I think a lot of people who just sort of reject anti-capitalist arguments out of hand do it because they think that you're saying it's just wrong to make money, it's wrong to have a business that makes a profit. And it's like, the issue isn't that. The issue is that this company, Google, could be making billions of dollars a year, still be one of the most profitable sites of its type, still make a huge amount of money, and have three times as many people doing content moderation, all of them with healthcare. But they cut corners on that part of it, because doing it right doesn't make them more money, it just makes the world better. By cutting those corners, they make more money, and it's worth more to them to increase the value of a few hundred people's stock than to ensure that there aren't thousands of additional people ************ to children. That's what I have an issue with when it comes to capitalism. That's the issue. You can make a profit without also selling your ******* soul. Yeah, we could have YouTube. It's not that YouTube should be banned. Like, I can get recommended new musicians that I like. We can all watch videos to masturbate to without more people being turned into pedophiles and Nazis. That's not a necessary part of this. It's just happening because corners are being cut.
Yeah, it just shows what the values of our society are. Yeah, they've literally said $3 billion a year is worth more to us than God knows how many children being molested, than ******* Heather Heyer getting run down at Charlottesville, than there being Nazis marching through the streets and advocating the extermination of black people, of LGBT people, of whatever. Which is, again, part of why so many Google employees are now speaking out, horrified, because, like, they're not monsters. They don't want to live in this world any more than the rest of us do. They just didn't realize what was happening, because they were busy focusing on the code and the free massages. And then, like the rest of us, they woke up to a world full of Nazis and pedophiles. Yeah. Yeah. I feel like you're looking at me to make a joke now, and I feel like, I don't know, this got real serious. I'm more just tired. We're all tired. It's a very tiring world we live in. Well, Sophia, that's the episode. Yay. You want to plug your plugables? ****. I mean, not really. I just want everyone to go and get a hug, you know? Everybody go get a hug. But, Jesus, yeah. But also, I am to be found on the sites we hate. You know, what a fun thing to plug. I'm available on Twitter and Instagram at The Sophia, and I have a podcast about love and sexuality around the world that I co-host with Courtney Kocak. It's called Private Parts Unknown, so check that out. Check out Private Parts Unknown. I'm also on the sites we hate. Behindthebastards.com is not a site that we hate, but it's where you can find the sources for this episode. You can find me on Twitter at I Write OK. You can find us on Twitter and Instagram at ******** pod. And you can buy T-shirts on teepublic.com, Behind the ********. Yep, that's the episode. Go find YouTube's headquarters and yell at them.
Scream at their sign, take pictures of their company, and wave your fists. If you work at YouTube, quit. It's not worth it. I mean, the more whistleblowers the better. Yeah, yeah, quit and go talk to the New York Times or some ******* body. Yeah. Also, one random thing that's positive, if you want: there's a lot of videos of trains on YouTube, I've discovered. Just trains passing by. Trains and ski fails. Yeah, I think you will find it very soothing. First, you'll be like, what the ****? A video of a train that's 12 minutes long? Guess what? That'll soothe you. Soothe your ***. Or, if you're more like me, watch videos of people skiing and then failing to ski. I mean, that's if you want to laugh. Yeah, yeah, yeah. I mean, I feel like YouTube's algorithm is gonna take you from train videos to train fails really fast. Ohh boy, yeah, ****. I don't know, now that I know about the rabbit hole, I'm afraid that there's a way to connect trains to children that I have not thought of. Oh no. I'm not even going to make any further comments on that. Should we add anything before this gets to a dark spot?