
You Are Not a Gadget by Jaron Lanier


RATING: 8/10…READ: August 2, 2011

A humanistic perspective on the direction technology is taking us and what we can do about it. Lanier rejects the singularity argument advanced by prominent technologists such as Kevin Kelly; this work is a contrarian view to such thinking. A great book that not only dissects our problems, but offers solutions as well.



Part 1: What is a person?

The most important thing about a technology is how it changes people.

It is impossible to work with information technology without also engaging in social engineering.

One might ask, “If I am blogging, twittering, and wikiing a lot, how does that change who I am?” or “If the hive mind is my audience, who am I?”

While lock-in may be a gangster in the world of railroads (think of the London Tube’s lack of air conditioning), it is an absolute tyrant in the digital world (think of MIDI’s hard-to-change standards).

Lock-in removes design options based on what is easiest to program, what is politically feasible, what is fashionable, or what is created by chance.

Lock-in removes ideas that do not fit into the winning digital representation scheme, but it also reduces or narrows the ideas it immortalizes, cutting away the unfathomable penumbra (uncertainty) of meaning that distinguishes a word in natural language from a command in a computer program.

Instead of people being treated as the sources of their own creativity, commercial aggregation and abstraction sites presented anonymized fragments of creativity as products that might have fallen from the sky or been dug up from the ground, obscuring the true sources.

Finance was transformed by computing clouds. Success in finance became increasingly about manipulating the cloud at the expense of sound financial principles.

Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action.

Suggestions to improve:

-Don’t post anonymously unless you really might be in danger.

-If you put effort into Wikipedia articles, put even more effort into using your personal voice and expression outside of the wiki to help attract people who don’t yet realize that they are interested in the topics you contributed to.

-Create a website that expresses something about who you are that won’t fit into the template available to you on a social networking site.

-Post a video once in a while that took you one hundred times more time to create than it takes to view.

-Write a blog post that took weeks of reflection before you heard the inner voice that needed to come out.

If you love a medium made of software, there’s a danger that you will become entrapped in someone else’s recent careless thoughts.

The typical belief about the direction of technology is the singularity: the belief that computers and robots will construct copies of themselves again and again, improving with each iteration, and eventually host all consciousness.

If you believe in the rapture (the we’re-fucked scenario), you’ll just give up and want to speed destruction along. If you believe in the singularity, you’ll want to move away from humanistic motives and design for technology’s wants. (Neither is preferable.)

Kevin Kelly says that we don’t need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand better anyway.

“Information wants to be free.” –Stewart Brand, founder of the Whole Earth Catalog

“Information doesn’t deserve to be free.” –Lanier

Information that exists independently of the culture of an observer is not the same as the kind we can put in computers, the kind that supposedly wants to be free.

Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it’s even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?

If you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith.

If you have a conversation with a simulated person presented by an AI program, can you tell how far you’ve let your sense of personhood degrade in order to make the illusion work for you?

Did that search engine really know what you want, or are you playing along, lowering your standards to make it seem clever?

Ray Kurzweil wants the global computing cloud to scoop up the contents of our brains so we can live forever in virtual reality. When my friends and I built the first virtual reality machines, the whole point was to make this world more creative, expressive, empathic, and interesting. It was not to escape it.

When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful.

Treating computers as intelligent autonomous entities ends up standing the process of engineering on its head. We can’t afford to respect our own designs so much.

The circle of empathy: An imaginary circle of empathy is drawn by each person. It circumscribes the person at some distance, and corresponds to those things in the world that deserve empathy.

-If someone falls within your circle of empathy, you wouldn’t want to see him or her killed. Something that is clearly outside the circle is fair game.

-Ex. Most people would place all other people within the circle, but most are willing to see bacteria killed when we brush our teeth.

-The deepest controversies involve whether something or someone should lie just inside or just outside the circle: such as the idea of slavery or abortion.

-When you change the contents of your circle, you change the conception of yourself.

-The center of the circle shifts as its perimeter is changed.

-To expand the circle indefinitely can lead to oppression, because the rights of potential entities (as perceived by only some people) can conflict with the rights of indisputable real people. Ex. Abortion

We all have to live with our imperfect ability to discern the proper boundaries of our circles of empathy.

If a church or government were doing these things, it would feel authoritarian, but when technologists are the culprits, we seem hip, fresh, and inventive.

It seems to me that even if we could network all the potential aliens in the galaxy—quadrillions of them, perhaps—and get each of them to contribute some seconds to a physics wiki, we would not replicate the achievements of even one mediocre physicist, much less a great one. [In regards to Clay Shirky’s Cognitive Surplus]

They [technologists] want to live in an airtight reality that resembles an idealized computer program, in which everything is understood and there are no fundamental mysteries.

If you read something written by someone who used the term “single” in a custom-composed, unique sentence, you will inevitably get a first whiff of the subtle experience of the author, something you would not get from a multiple choice database.

-Yes, it would be a tiny bit more work for everyone, but the benefits of semiautomated self-presentation are illusory. If you start out by being fake, you’ll eventually have to put in twice the effort to undo the illusion if anything good is to come of it.

Adopting a metaphysically modest approach would make it harder to use database techniques to create instant lists of people who are, say, emo, single, or affluent. But I don’t think that would be such a great loss.

A real friendship ought to introduce each person to unexpected weirdness in the other. Each acquaintance is an alien, a well of unexplored difference in the experience of life that cannot be imagined or accessed in any way but through genuine interaction. The idea of friendship in database-filtered social networks is certainly reduced from that.

The only hope for social networking sites from a business point of view is for a magic formula to appear in which some method of violating privacy and dignity becomes acceptable. [the social graph]

Collectives can be just as stupid as any individual—and, in important cases, stupider. The interesting question is whether it’s possible to map out where the one is smarter than the many.

Every authentic example of collective intelligence that I am aware of also shows how that collective was guided or inspired by well-meaning individuals. [MLK, Gandhi, Founding Fathers]

The balancing of influence between people and collectives is the heart of the design of democracies, scientific communities, and the many other long-standing success stories.

The “wisdom of crowds” effect should be thought of as a tool. The value of the tool is its usefulness in accomplishing a task. The point should never be the glorification of the tool.

Crowd-improvement suggestions: limit the ability of members of the crowd to see how others are about to decide on a question, in order to preserve independence and avoid mob behavior; never allow a crowd to frame its own questions; and never let its answers be more complicated than a single number or a multiple-choice answer.
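Those three constraints (independent answers, externally framed questions, single-number responses) can be sketched as a tiny aggregator. This is a minimal illustration, not anything from the book; the function names and the jellybean-jar scenario are hypothetical.

```python
import statistics

def aggregate_estimates(ask, voters):
    """Collect one numeric answer per voter and reduce the crowd to a
    single number. Each voter is asked in isolation (no one sees another
    voter's answer before committing), the question is framed by the
    operator rather than the crowd, and the output is a lone number."""
    answers = [ask(voter) for voter in voters]  # each call is independent
    return statistics.median(answers)           # robust to wild outliers

# Hypothetical example: five independent guesses of jellybeans in a jar.
guesses = {"v1": 720, "v2": 655, "v3": 980, "v4": 700, "v5": 20}
estimate = aggregate_estimates(lambda v: guesses[v], sorted(guesses))
print(estimate)  # → 700
```

Using the median rather than the mean is one concrete way a design can damp mob-like or trollish inputs: a single absurd guess (the 20 above) barely moves the result.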

If you win anonymously, no one knows, and if you lose, you just change your pseudonym and start over, without having modified your point of view one bit.

User interface designs that arise from the ideology of the computing cloud make people—all of us—less kind. Trolling is not a string of isolated incidents, but the status quo in the online world.

The reason that computer viruses infect PCs more than Macs is not that a Mac is any better engineered, but that it is relatively obscure. PCs are more commonplace. This means that there is more return on the effort to crack PCs.

What computerized analysis of all the country’s school tests has done to education is exactly what Facebook has done to friendships. In both cases, life is turned into a database. Both degradations are based on the same philosophical mistake, which is the belief that computers can presently represent human thought or human relationships.

It’s the people who make the forum, not the software. Without the software, the experience would not exist at all, so I celebrate that software as flawed as it is. But it’s not as if the forum would really get much better if the software improved. Focusing too much on the software might even make things worse by shifting the focus from the people.

Part 2: What will money be?

A blog of blogs is more exalted than a mere blog. If you have seized a very high niche in the aggregation of human expression—in the way that Google has with search, for instance—then you can become superpowerful. The same is true for the operator of a hedge fund. “Meta” equals power in the cloud.

One persistent dark side of industrialization is that any skill, no matter how difficult to acquire, can become obsolete when the machine improves.

During the past decade and a half, since the debut of the web, even during the best years of the economic boom times, the middle class in the United States declined. Wealth was ever more concentrated.

A functioning honest crowd-wisdom system ought to trump paid persuasion. If the crowd is so wise, it should be directing each person optimally in choices related to home finance, the whitening of yellow teeth, and the search for a lover. All the paid persuasion ought to be mooted. Every penny Google earns suggests a failure of the crowd—and Google is earning a lot of pennies.

If you want to know what’s really going on in a society or ideology, follow the money.

If money is flowing to advertising instead of musicians, journalists, and artists, then a society is more concerned with manipulation than truth or beauty.

If content is worthless, then people will start to become empty-headed and contentless.

If some free video of a silly stunt will draw as many eyeballs as the product of a professional filmmaker on a given day, then why pay the filmmaker? If an algorithm can use cloud-based data to unite those eyeballs with the video clip of the moment, why pay editors or impresarios? In the new scheme there is nothing but location, location, location. Rule the computing cloud that routes the thoughts of the hive mind, and you’ll be infinitely wealthy!

The time has come to ask, “Are we building a digital utopia for people or machines?”

If we choose to pry culture away from capitalism while the rest of life is still capitalistic, culture will become a slum. In fact, online culture increasingly resembles a slum in disturbing ways. Slums have more advertising than wealthy neighborhoods, for instance. People are meaner in slums; mob rule and vigilantism are commonplace.

“Help Desk”: knowledge management, data forensics, software consulting, and so on, can provide us with a way to imagine a world in which capitalism and advanced technology can coexist with a fully employed population of human beings.

America: everyone wants to be lord of the computing cloud.

In the past, an investor had to be able to understand at least something about what an investment would actually accomplish. Maybe a building would be built, or a product would be shipped somewhere, for instance. No more. There are so many layers of abstraction between the new kind of elite investor and actual events on the ground that the investor no longer has any concept of what is actually being done as a result of investments.

Making money in the cloud doesn’t necessarily bring rain to the ground.

Big studios would rather play the Big-N game of user-generated content, such as YouTube…they don’t have to pay artists…and it mitigates the risk of “content is king” shifting power to individuals.

The problem in each case [in relation to hacking into a bank and adding money to your account, or copying digital files] is not that you stole from a specific person but that you undermined the artificial scarcities that allow the economy to function.

In the same way, creative expression on the internet will benefit from a social contract [a small paywall/access fee] that imposes a modest degree of artificial scarcity on information.

Instead of collections of bits being offered as a product, they would be rendered as a service.

Even if a robot that maintains your health will only cost a penny in some advanced future, how will you earn that penny? Manual labor will be unpaid, since cheap robots will do it. In the open culture future, your creativity and expression would also be unpaid, since you would be a volunteer in the army of the long tail. That would leave nothing for you.

Digital socialists must avoid the trap of believing that a technological makeover has solved all the problems of socialism just because it can solve some of them. Getting people to cooperate is not enough.

If I know my neighbor is getting music, or cable TV, or whatever, for free, it becomes a little harder to get me to pay for the same things. So for that reason, if all of us are to earn a living when the machines get good, we will have to agree that it is worth paying for one another’s elevated cultural and creative expressions.

One fine day your ISP could offer you an option: you could stop paying your monthly access charge in exchange for signing up for the new social contract in which you pay for bits.

From a technological point of view, it is true that you can’t make a perfect copy-protection scheme.

Possible Future directions:

Telegigging: live performances / “holographic” projectors

Songles: A songle is a dongle for a song. A dongle is a little piece of hardware that you plug into a computer to run a piece of commercial software. It’s like the physical key you have to buy in order to make the software work. —could create scarcity, be sold as fashion, bundled with beer, sneakers, etc.

—could have more crafted songles for opera vs. the pop star of the moment

Formal Financial Expression: using AI techniques to create formal versions of certain complicated or innovative contracts that define social instruments:

1. Most transactions would continue to be described traditionally. If a transaction followed a cookie-cutter design (e.g., the sale of stocks), it would be handled just as now.

2. Highly inventive contracts, such as leveraged default swaps or schemes based on high-frequency trades, would be created in an entirely new way. They would be denied ambiguity. They would be formally described.

–The economy is a tool, and there’s no reason it has to be as open and wild as the many open and wild things of our experience. But it also doesn’t have to be as tied down as some might want. It can and should have an intermediate level of complexity.

–This sort of transaction representation has already been done internally within some of the more sophisticated hedge funds.
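One way to picture a contract that is “denied ambiguity” is as a small machine-evaluable expression rather than legal prose. The sketch below is entirely hypothetical (the `Payoff`/`Both` types, observable names, and numbers are illustrative inventions, not Lanier’s proposal or any real instrument); it only shows what “formally described” could mean in practice.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Payoff:
    """Pay `amount` if the named observable meets or exceeds `threshold`."""
    observable: str
    threshold: float
    amount: float

    def value(self, observations: dict) -> float:
        # No prose to interpret: the payoff is a total, mechanical rule.
        return self.amount if observations[self.observable] >= self.threshold else 0.0

@dataclass(frozen=True)
class Both:
    """Combine two contracts; the holder receives both payoffs."""
    left: "Payoff | Both"
    right: "Payoff | Both"

    def value(self, observations: dict) -> float:
        return self.left.value(observations) + self.right.value(observations)

# A toy "inventive" instrument built from unambiguous parts.
swap = Both(Payoff("rate_spread", 2.0, 100.0),
            Payoff("default_index", 0.5, 250.0))
print(swap.value({"rate_spread": 2.5, "default_index": 0.1}))  # → 100.0
```

Because every clause is a typed expression, two parties (or two computers) evaluating the same contract against the same observations cannot disagree about what it pays, which is the intermediate level of complexity the passage above argues for.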

Part 3: The unbearable thinness of flatness

Ideal computers can be experienced when you write a small program. They seem to offer infinite possibilities and an extraordinary sense of freedom.

Real computers are experienced when we deal with large problems. They can trap us in tangles of code and make us slaves to legacy—and not just in matters of obscure technological decisions. Real computers reify our philosophies through the process of lock-in before we are ready.

Let’s suppose in the 1980s I had said, “In a quarter century, when the digital revolution has made great progress and computer chips are millions of times faster than they are now, humanity will finally win the prize of being able to write a new encyclopedia and a new version of UNIX!” It would have sounded utterly pathetic.

The Internet was originally conceived during the Cold War to be capable of surviving a nuclear attack. Parts of it can be destroyed without destroying the whole, but that also means that parts can be known without knowing the whole. The core idea is called “packet switching.”

A packet is a tiny portion of a file that is passed between nodes on the Internet in the way a baton is passed between runners in a relay race. The packet has a destination address. If a particular node fails to acknowledge receipt of a packet, the node trying to pass the packet to it can try again elsewhere. The route is not specified, only the destination. This is how the Internet can hypothetically survive an attack. The nodes keep trying to find neighbors until each packet is eventually routed to its destination.
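The relay-race behavior described above can be sketched in a few lines. This is a toy simulation for intuition only, not real IP routing: the graph, node names, and `up` map are invented, and real routers use routing tables rather than random neighbor selection.

```python
import random

def route_packet(graph, source, dest, up, max_hops=50):
    """Hop-by-hop relay routing: only the destination is specified, and
    each node hands the packet to any live neighbor it hasn't already
    visited, retrying elsewhere when a neighbor fails to acknowledge."""
    path = [source]
    node = source
    for _ in range(max_hops):
        if node == dest:
            return path
        # Skip nodes that are down (no acknowledgment) or already visited.
        candidates = [n for n in graph[node]
                      if up.get(n, True) and n not in path]
        if not candidates:
            return None  # this attempt dead-ended
        node = random.choice(candidates)
        path.append(node)
    return None

# A small mesh: the packet still reaches D even with node B destroyed.
graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(route_packet(graph, "A", "D", up={"B": False}))  # → ['A', 'C', 'D']
```

Knocking out node B forces the packet around through C, which is the survivability property the Cold War design was after: parts of the network can be destroyed without destroying the whole.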

Freedom is moot if you waste it. If the Internet is really destined to be no more than an ancillary medium, which I would view as a profound defeat, then it at least ought to do whatever it can not to bite the hand that feeds it—that is, it shouldn’t starve the commercial media industries.

What makes something fully real is that it is impossible to represent it to completion.

A digital sound sample in angry rap doesn’t correspond to the graffiti but to the wall.

Context has always been part of expression because expression becomes meaningless if the context becomes arbitrary.

You could come up with an invented language in which the letters that compose the words to John Lennon’s “Imagine” instead spell out the instructions for cleaning a refrigerator. Meaning is only ever meaning in context.

There are two primary strands of cybernetic totalism. In one strand, the computing cloud is supposed to get smart to a superhuman degree on its own, and in the other, a crowd of people connected to the cloud through anonymous, fragmentary contact is supposed to be the superhuman entity that gets smart.

There won’t be an orgy of creativity in an overly open version of synthetic biology, because there have to be species for sex to make sense.

It seems to me that if Wikipedia suddenly disappeared, similar information would still be available for the most part, but in more contextualized forms, with more visibility for the authors and with greater sense of style and presence.

If Wikipedia is treated as the overarching, primary text of the human experience, then of course it will, as if by decree, become “more convenient” than other texts.

Part 4: Making the best of bits

Computationalism: the world can be understood as a computational process, with people as subprocesses.

Gadgets are inert tools and are only useful because people have the magical ability to communicate meaning through them.

A set of one thousand records in a database that refer to one another in patterns would not be meaningful without a person to interpret it; but perhaps a quadrillion or a googol of database entries can mean something in their own right, even if there is no being explaining them.

“If you can perceive the hive mind to be recommending music to you, for instance, then the hive mind is effectively a person.”

I believe humans are the result of billions of years of implicit, evolutionary study in the school of hard knocks. The cybernetic structure of a person has been refined by a very large, very long, and very deep encounter with physical reality.

From this point of view, what can make bits have meaning is that their patterns have been hewn out of so many encounters with reality that they aren’t really abstractable bits anymore, but are instead a nonabstract continuation of reality.

Realism is based on specifics, but we don’t yet know—and might never know—the specifics of personhood from a computational point of view. The best we can do right now is engage in the kind of storytelling that evolutionary biologists sometimes indulge in.

Colors and sounds can be measured with rulers, but odors must be looked up in a dictionary.

Think of olfaction as the city center and the other sensory systems as sprawling suburbs, which grew as the brain evolved and eventually became larger than the old downtown.

The hope that language would be like a computer program has died. Instead, music has changed to become more like a computer program.

If a tiny vocabulary has to be stretched to cover a lot of territory, then any difference at all between the qualities of words is practically a world of difference. The brain is so desirous of associations that it will then amplify any tiny potential linkage in order to get a usable one.

If we had infinite brains, capable of using an infinite number of words, those words would mean nothing, because each one would have too specific a usage.

Part 5: Future Humors

When we want to understand ourselves in naturalistic terms, we must make use of a naturalistic philosophy that accounts for a degree of irreducible complexity, and until someone comes up with another idea, computationalism is the only path we have to do that.

Treating people as nothing other than parts of nature is an uninspired basis for designing technologies that embody human aspirations.

The inverse error is just as misguided: it’s a mistake to treat nature as a person. That is the error that yields confusions like intelligent design.

Neoteny is an evolutionary strategy exhibited to varying degrees in different species, in which the characteristics of early development are drawn out and sustained into an individual organism’s chronological age.

Ex. A baby that learns behaviors

The twenties are the new teens, and people in their thirties are often still dating, not having settled on a mate or made a decision about whether to have children or not.

The design of online technology has moved from answering the desire for attention to addressing an even earlier developmental stage.

Separation anxiety is assuaged by constant connection. Young people announce every detail of their lives on services like Twitter not to show off, but to avoid the closed door at bedtime, the empty room, the screaming vacuum of an isolated mind.

Software development doesn’t necessarily speed up in sync with improvements in hardware. It often instead slows down as computers get bigger because there are more opportunities for errors in bigger programs. Development becomes slower and more conservative when there is more at stake, and that’s what is happening.

Moore’s law can be expected to accelerate progress in medicine because computers will accelerate the speeds of processes like genomics and drug discovery. This means healthy old age will continue to get healthier and last longer and that the “youthful” phase of life will also be extended.

People live longer as technology improves, so cultural change actually slows, because it is tied more to the outgoing generational clock than the incoming one.

Cephalopods + Childhood = Humans + Virtual Reality

While individual cephalopods can learn a great deal within a lifetime, they pass on nothing to future generations. Each generation begins afresh, a blank slate, taking in the strange world without guidance other than instincts bred into their genes.

Suppose we had the ability to morph at will, as fast as we can think. What sort of language might make that possible? Would it be the same old conversation, or would we be able to “say” new things to one another?

—postsymbolic communication.