[Image: Google logo. © flickr.com/Kristina Alexanderson]
Google's efforts to filter out positions they think are fake news, such as climate skeptic posts, have hit an unexpected snag: Google has just noticed that large groups of people across the world hold views which differ from those championed by the Silicon Valley monoculture.
Alphabet's Eric Schmidt: It can be 'very difficult' for Google's search algorithm to understand truth

Catherine Clifford
2:38 PM ET Tue, 21 Nov 2017

In the United States' current polarized political environment, the constant publishing of articles with vehemently opposing arguments has made it almost impossible for Google to rank information properly.

So says billionaire Eric Schmidt, executive chairman of Google's parent company, Alphabet, speaking at the Halifax International Security Forum on Saturday.

"Let's say that this group believes Fact A and this group believes Fact B and you passionately disagree with each other and you are all publishing and writing about it and so forth and so on. It is very difficult for us to understand truth," says Schmidt, referring to the search engine's algorithmic capabilities.

"So when it gets to a contest of Group A versus Group B - you can imagine what I am talking about - it is difficult for us to sort out which rank, A or B, is higher," Schmidt says.

...

In cases of greater consensus, when the search turns up a piece of incorrect or unreliable information, it is a problem that Google should be able to address by tweaking the algorithm, he says.

...

The problem comes when diametrically opposed viewpoints abound - the Google algorithm cannot identify which is misinformation and which is truth.

That's the rub for the tech giant. "Now, there is a line we can't really get across," says Schmidt.

...

However, platforms like Facebook and Twitter have a different issue, sometimes referred to as the "Facebook bubble" or as an echo chamber. Because those companies' algorithms rely, at least in part, on things like "friends" and followers to determine what's displayed in their news feeds, the users are part of the problem.

"That is a core problem of humans that they tend to learn from each other and their friends are like them. And so until we decide collectively that occasionally somebody not like you should be inserted into your database, which is sort of a social values thing, I think we are going to have this problem," the Alphabet boss says.

...
As a climate skeptic and IT expert, I find Google's difficulty highly entertaining.

What people like Google's Schmidt desperately want to discover is a generalised way of detecting fake news. They believe in their hearts that climate skepticism, for example, is as nutty as believing the moon landings were faked, but they have so far failed to find a common marker which would allow their personal prejudices to be confirmed as objective reality.

Google could, and likely does, simply impose their prejudices, explicitly demoting climate skeptic articles and specific websites to the bottom of the list - but they feel guilty about doing so, because hand-coding personal views into the search ranking algorithm forces them to admit to themselves that those views are prejudices. It bothers them that they have not yet discovered a way to justify those prejudices objectively, by applying a generalised filter to their underlying data.
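To make the point concrete: an explicit, hand-coded demotion is trivial to implement, which is exactly why it offers no cover. The sketch below is purely illustrative - the domain names, scores, and blocklist are invented for this post and have nothing to do with Google's actual systems - but it shows how a manual blocklist simply overrides whatever relevance score the algorithm computed.

```python
# Illustrative sketch only: a hand-coded demotion list layered on top of
# an ordinary relevance ranking. All domains and scores are hypothetical.

DEMOTED_DOMAINS = {"skeptic-example.org"}  # the manually maintained blocklist


def rank(results):
    """Sort results by relevance score (descending), but push any result
    from a blocklisted domain behind all non-blocklisted results.

    Python compares the key tuples element by element: False sorts before
    True, so blocklisted domains always lose, regardless of score.
    """
    return sorted(
        results,
        key=lambda r: (r["domain"] in DEMOTED_DOMAINS, -r["score"]),
    )


results = [
    {"domain": "skeptic-example.org", "score": 0.9},
    {"domain": "mainstream-example.com", "score": 0.7},
]

# The higher-scoring page loses solely because it is on the list -
# no "objective" signal in the data is involved at all.
print([r["domain"] for r in rank(results)])
# → ['mainstream-example.com', 'skeptic-example.org']
```

The giveaway is that the blocklist is a bare set of names typed in by a human: there is no generalised filter here, just an editorial decision wearing a for-loop.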

To put it another way, in the case of climate skepticism I suspect Google's problem is that they have discovered there are lots of published, mainstream, peer-reviewed papers which support climate skeptic positions. This is likely messing up their efforts to classify climate skepticism as outside mainstream science.

The mounting evidence US tech giants are refusing to accept is that their Silicon Valley monoculture might be wrong about a few issues. They will likely continue to burn millions of dollars' worth of software developer time chasing unicorns, because as long as they can convince themselves they are working on a solution, they don't have to admit to themselves that they might have made a mistake.