
9. Economic Manipulation

9.1. The Manipulation Matrix

The Art of Manipulation - Nir Eyal

Let’s admit it: we in the consumer web industry are in the manipulation business. We build products meant to persuade people to do what we want them to do. We call these people “users,” and even if we don’t say it aloud, we secretly wish every one of them would become fiendishly addicted. Users take our technologies with them to bed. When they wake up, they check for notifications, tweets, and updates before saying “good morning” to their loved ones. Ian Bogost, the famed game creator and professor, calls the wave of habit-forming technologies the “cigarette of this century” and warns of equally addictive and potentially destructive side effects.

When Is Manipulation Wrong?

Manipulation is a designed experience crafted to change behavior, and we all know what it feels like. We’re uncomfortable when we sense someone is trying to make us do something we wouldn’t do otherwise, as at a car dealership or a timeshare presentation. Yet manipulation can’t be all bad. If it were, what would explain the numerous multi-billion-dollar industries that rely heavily on users willfully submitting to manipulation? If manipulation is a designed experience crafted to change behavior, then Weight Watchers, one of the most successful mass-manipulation products in history, fits the definition. Much as in the consumer web industry, Weight Watchers customers’ decisions are programmed by the designer of the system. Yet few question the morality of Weight Watchers. So what’s the difference? Why is manipulating users through flashy advertising or addictive video games thought to be distasteful, while a strict system of food rationing is considered laudable?

A More Addictive World

Unfortunately, our moral compass has not caught up with what technology now makes possible. Ubiquitous access to the web, transferring greater amounts of personal data at faster speeds than ever before, has created a more addictive world. Addictiveness is accelerating, and according to Paul Graham of Y Combinator, we haven’t had time to develop societal “antibodies to addictive new things.” Graham puts the responsibility on the user: “Unless we want to be canaries in the coal mine of each new addiction—the people whose sad example becomes a lesson to future generations—we’ll have to figure out for ourselves what to avoid and how.” But what of the people who make these manipulative experiences? The corporations that unleash these addictive technologies are, after all, made up of human beings with a moral sense of right and wrong. We too have families and kids who are susceptible to addiction and manipulation. What shared responsibilities do we code slingers and behavior designers have to our users, to future generations, and to ourselves?