Wojcicki told '60 Minutes' that Google employs 10,000 people to focus on "controversial content." She described their schedule, which includes time for therapy. Stahl also said there are reports that the "monitors" are "beginning to buy the conspiracy theories."
"What we really had to do was tighten our enforcement of that to make sure we were catching everything and we use a combination of people and machines," Wojcicki explained. "So Google as a whole has about 10,000 people that are focused on controversial content."
Lesley Stahl: I'm told that it is very stressful to be looking at these questionable videos all the time. And that there's actually counselors to make sure that there aren't mental problems with the people who are doing this work. Is that true?

Wojcicki on Section 230, stopping 70% of controversial content:
Susan Wojcicki: It's a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work five hours of the eight hours reviewing videos. They have the opportunity to take a break whenever they want.
Lesley Stahl: I also heard that these monitors, reviewers, sometimes, they're beginning to buy the conspiracy theories.
Susan Wojcicki: I've definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we're providing the right services for them.
Lesley Stahl: Once you watch one of these, YouTube's algorithms might recommend you watch similar content. But no matter how harmful or untruthful, YouTube can't be held liable for any content, due to a legal protection called Section 230.
Lesley Stahl: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn't you be held responsible for that material, because you recommend it?
Susan Wojcicki: Well, our systems wouldn't work without recommending. And so if--
Lesley Stahl: I'm not saying don't recommend. I'm just saying be responsible for when you recommend so many times.
Susan Wojcicki: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there'd be a much smaller set of information that people would be finding. Much, much smaller.
Lesley Stahl: She told us that earlier this year, YouTube started re-programming its algorithms in the U.S. to recommend questionable videos much less and point users who search for that kind of material to authoritative sources, like news clips. With these changes Wojcicki says they have cut down the amount of time Americans watch controversial content by 70%.
Reader Comments
Folks: Realize that Google/YouTube, et al., only get away with this because people are naturally lazy.
If you familiarize yourself with Google's search protocols (most specifically their quasi-Boolean search capabilities), you can avoid their 'recommended/programmed' results from the 30% of MSM lies. I've been doing it for years, and recall how SOTT's Scott/Scottie noted that he even used the following simple technique to search Google for specific SOTT results.
Boolean search is just basic logic. If you go to Google's advanced search at [Link] and enter your search according to the questions there, you will learn that '+' means 'must have', '-' means 'cannot have', etc. Likewise, you'll see that if you wish a search to include one of several words, you can write (A | B | C).
If you wish only results from SOTT, you enter at the end of your search: site:SOTT.net
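For readers who want to script this, the operators above can be composed into a search URL programmatically. A minimal sketch in Python follows; the function name build_google_query and its parameters are my own illustration, not part of any Google API, and it simply builds a query string using the '-', '|', and 'site:' operators described above.

```python
from urllib.parse import urlencode

def build_google_query(terms, exclude=None, any_of=None, site=None):
    """Compose a Google search URL from the operators described above:
    '-' to exclude a word, (A | B) for 'any of these words',
    and 'site:' to restrict results to a single domain."""
    parts = list(terms)
    if any_of:
        # One-of-several words: (A | B | C)
        parts.append("(" + " | ".join(any_of) + ")")
    if exclude:
        # Words the results cannot have: -word
        parts.extend("-" + word for word in exclude)
    if site:
        # Restrict results to one domain, e.g. site:SOTT.net
        parts.append("site:" + site)
    query = " ".join(parts)
    return "https://www.google.com/search?" + urlencode({"q": query})

# Example: search SOTT.net only, excluding one word
print(build_google_query(["search", "protocols"],
                         exclude=["ads"], site="SOTT.net"))
```

Pasting the query part of the resulting URL into Google's search box has the same effect as typing the operators by hand.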
SOTTfolk are each and all smart enough to do this.
RC