By Jaron Lanier, from You Are Not a Gadget, published in January by Knopf. Lanier, a computer scientist, popularized the term "virtual reality." His "Moving Beyond Muzak" was published in the March 1998 issue of Harper's Magazine.
The central faith embedded in Web technologies, whereby users not only consume information but also widely generate it, is the idea that the Internet as a whole is coming alive and turning into a superhuman creature. The designs guided by this perverse kind of faith leave people in the shadows. Computers will soon get so big and fast, and the Internet so rich with information, that people will be obsolete, either left behind like the characters in Rapture novels or subsumed into some cyber-superhuman something. Silicon Valley culture has taken to enshrining this vague idea and spreading it the only way technologists can: in the design of software.
If you believe the distinction between the roles of people and of computers is starting to dissolve, you might express that - as some friends of mine at Microsoft once did - by designing features for a word processor that are supposed to know what you want - for example, when you want to start an outline within your document. You might have had the experience of Microsoft Word suddenly determining, at the wrong moment, that you are creating an indented outline. The real function of this feature isn't to make life easier for you. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves. If you believe this, then working for the benefit of the computing cloud over that of the individual puts you on the side of the angels.
People degrade themselves all the time in order to make machines seem smart. Before the 2008 stock-market crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans; we ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species's bottomless ability to lower our standards to make information technology look good, but every manifestation of intelligence in a machine is ambiguous. The same ambiguity that motivated dubious academic artificial intelligence projects in the past has been repackaged as mass culture. Did that search engine really know what you want, or are you playing along, lowering your standards to make it seem clever? The fragments of human effort that have flooded the Internet are perceived by some to form a hive mind or noosphere - terms used to describe what is thought to be a new superintelligence. A significant number of AI enthusiasts, after a protracted period of failed experiments in tasks like understanding natural language, eventually found consolation in the hive mind, which yields better results because there are real people behind the curtain. Wikipedia, for instance, works through what I call the "oracle illusion," in which knowledge of the human authorship of a text is suppressed in order to give the text superhuman validity.
From an engineering point of view, the difference between a social-networking site and the Web as it existed before such sites is a small detail. You could always create a list of links to your friends on your website, and you could always send emails to a circle of friends announcing whatever you cared to. All the social-networking services offer is a prod to use the Web in a particular way, according to a particular philosophy. If you read something written by someone who used the term "single" in a custom-composed, unique sentence, you will inevitably get a first whiff of the subtle experience of the author, something you would not get from a multiple-choice database. The benefits of semi-automated self-presentation are illusory. Enlightened designers leave open the possibility of metaphysical specialness either in humans or in the potential for unforeseen creative processes that we can't yet capture in software systems. That kind of modesty is the signature quality of being human-centered.
An individual who is receiving a flow of reports about the romantic status of a group of friends must learn to think in terms of the flow if it is to be perceived as worth reading at all. Am I accusing all those hundreds of millions of users of social-networking sites of reducing themselves in order to be able to use the services? Well, yes, I am. I know quite a few people, most of them young adults, who are proud to say that they have accumulated thousands of friends on Facebook. Obviously, their statements can be true only if the idea of friendship is diminished.
The wave of financial calamities that took place in 2008 was cloud-based. No one in the pre-digital-cloud era had the mental capacity to lie to himself in the way we routinely are able to now. The limitations of organic human memory and calculation put a cap on the intricacies of self-delusion. In finance, the rise of computer-assisted hedge funds and similar operations has turned capitalism into a search engine. You tend the engine in the computing cloud, and it searches for money. In the past, an investor had to be able to understand at least something about what an investment would actually accomplish. No longer. There are now so many layers of abstraction between the elite investor and actual events that he no longer has any concept of what is actually being done as a result of his investments.
True believers in the hive mind seem to think that no number of layers of abstraction in a financial system can dull the system's efficacy. The crowd works for free, and statistical algorithms supposedly take the risk out of making bets if you are a lord of the cloud. But who is that lord who owns the cloud that connects the crowd? Not just anybody. A lucky few (for luck is all that can possibly be involved) will own it. Entitlement has achieved its singularity and become infinite.
The Facebook Kid and the Cloud Lord are serf and king of the new order. In each case, human creativity and understanding, especially one's own creativity and understanding, are treated as worthless. Instead, one trusts in the crowd, in the algorithms that remove the risks of creativity in ways too sophisticated for any mere person to understand. A hedge-fund manager might make money by using the computational power of the cloud to create fantastical financial instruments that make bets on derivatives in such a way as to invent the phony virtual collateral for stupendous risks. This is a subtle form of counterfeiting, and it is precisely the same maneuver a socially competitive teenager makes in accumulating fantastical numbers of "friends" through a service like Facebook. But let's suppose you disagree that the idea of friendship is being reduced. Even then one must remember that the customers of social networks are not the members of those networks. The real customer is the advertiser of the future, but this creature has yet to appear in any significant way. The whole artifice, the whole idea of fake friendship, is just bait laid by the cloud lords to lure hypothetical advertisers - we might call them messianic advertisers - who could someday show up.
The hope of a thousand Silicon Valley start-ups is that firms like Facebook are capturing extremely valuable information called the "social graph." Using this information, an advertiser might be able to target all the members of a peer group just as they are forming their habits, opinions about brands, and so on. Peer pressure is the great power behind adolescent behavior, goes the reasoning, and adolescent choices become life choices. So if someone could crack the mystery of how to make perfect ads using the social graph, an advertiser would be able to design peer-pressure biases in a population of real people who would then be primed to buy whatever the advertiser is selling for their whole lives.
The situation with social networks is layered with multiple absurdities. The advertising idea hasn't made any money so far, because ad dollars appear to be better spent on search-specific placements and on relevant Web pages. If the revenue never appears, then a weird imposition of a database-as-reality ideology will have colored generations of teen peer-group and romantic experiences for no business or other purpose. If, on the other hand, the revenue does appear, evidence suggests that its impact will be negative. When Facebook has attempted to turn the social graph into a profit center in the past, it has created ethical disasters. A famous example was 2007's Beacon, a suddenly imposed feature that was hard to opt out of: when a Facebook user made an online purchase from any of a variety of participating retailers, the event was broadcast to his friends. The motivation was to find a way to package peer pressure as a service that could be sold to advertisers. But it meant that, for example, there was no longer a way to buy a surprise birthday present. The commercial lives of Facebook users were no longer their own.
Personal reductionism has always been present in information systems. You have to declare your status in reductive ways when you file a tax return. Most people are aware of the difference between reality and database entries when they file taxes, yet you perform the same kind of self-reduction in order to create a profile on a social-networking site. You fill in the data: profession, relationship status, and location. In this case, though, digital reduction becomes a causal element, mediating contact between new friends who exchange most of their information online. That is new.
It might at first seem that the experience of youth is now sharply divided between the old world of school and parents and the new world of social networking on the Internet, but actually school now belongs on the new side of the ledger. Education has gone through a parallel transformation, and for similar reasons. Information systems need to have information in order to run, but information underrepresents reality. Demand more from information than it can give and you end up with monstrous designs. Under the No Child Left Behind Act of 2001, for example, U.S. teachers are forced to choose between teaching general knowledge and "teaching to the test." The best teachers are thereby often disenfranchised by the improper use of educational-information systems.
What computerized analysis of all the country's school tests has done to education is exactly what Facebook has done to friendships. In both cases, life is turned into a database. Both degradations are based on the same philosophical mistake, which is the belief that computers can presently represent human thought or human relationships. These are things computers cannot currently do. Whether one expects computers to improve in the future is a different issue. In a less idealistic atmosphere it would go without saying that software should be designed only to perform tasks that can be successfully carried out at a given time. That is not the atmosphere in which Internet software is designed, however. When technologists deploy a computer model of something like learning or friendship in a way that has an effect on real lives, they are relying on faith. When they ask people to live their lives through their models, they are potentially reducing life itself.
There is, unfortunately, only one entity that can maintain its value as everything else is devalued under the banner of the noosphere. At the end of the rainbow of open culture lies an eternal spring of advertisements. Advertising is elevated by open culture from its previous role as an accelerant and placed at the center of the human universe. Advertising is now singled out as the only form of expression meriting genuine commercial protection in the new world to come. Any other form of expression is to be remashed, anonymized, and decontextualized to the point of meaninglessness. Ads, however, are to be made ever more contextual, and the content of the ad is absolutely sacrosanct. No one dares to mash up ads served in the margins of their website by Google. The centrality of advertising to the new digital hive economy is absurd, and it is even more absurd that this isn't more widely recognized. The most tiresome claim of the reigning digital philosophy is that crowds working for free do a better job at some things than antediluvian paid experts. Wikipedia is often given as an example. If that is so, why doesn't the principle dissolve the persistence of advertising as a business?
A functioning, honest crowd-wisdom system ought to trump paid persuasion. If the crowd is so wise, it should be directing each person optimally in choices related to home finance, the whitening of yellow teeth, and the search for a lover. All that paid persuasion ought to be mooted. Every penny Google earns suggests a failure of the crowd - and Google is earning a lot of pennies.
If you want to know what's really going on in a society or ideology, follow the money. If money is flowing to advertising instead of to musicians, journalists, and artists, then a society is more concerned with manipulation than with truth or beauty. If content is worthless, then people will start to become empty-headed and contentless. The combination of hive mind and advertising has resulted in a new kind of social contract. The basic idea of this contract is that authors, journalists, musicians, and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind. Reciprocity takes the form of self-promotion. Culture is to become precisely nothing but advertising.
At the time the Web was born, in the early 1990s, a popular trope was that a new generation of teenagers, reared in the conservative Reagan years, had turned out to be exceptionally bland. The members of "Generation X" were characterized as blank and inert. The anthropologist Steve Barnett saw in them the phenomenon of pattern exhaustion, in which a culture runs out of variations of traditional designs in its pottery and becomes less creative. A common rationalization in the fledgling world of digital culture back then was that we were entering a transitional lull before a creative storm - or were already in the eye of one. But we were not passing through a momentary calm. We had, rather, entered a persistent somnolence, and I have come to believe that we will escape it only when we kill the hive.