
techsuch May 9, 2021

‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia

Justin Rosenstein had tweaked his laptop’s operating system to block Reddit, banned himself from Snapchat, which he compares to heroin, and imposed limits on his use of Facebook. But even that wasn’t enough. In August, the 34-year-old tech executive took a more radical step to restrict his use of social media and other addictive technologies.

Rosenstein purchased a new iPhone and instructed his assistant to set up a parental-control feature to prevent him from downloading any apps.

He was particularly aware of the allure of Facebook “likes”, which he describes as “bright dings of pseudo-pleasure” that can be as hollow as they are seductive. And Rosenstein should know: he was the Facebook engineer who created the “like” button in the first place.

A decade after he stayed up all night coding a prototype of what was then called an “awesome” button, Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.

These refuseniks are rarely founders or chief executives, who have little incentive to deviate from the mantra that their companies are making the world a better place. Instead, they tend to have worked a rung or two down the corporate ladder: designers, engineers and product managers who, like Rosenstein, several years ago put in place the building blocks of a digital world from which they are now trying to disentangle themselves.
“It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”

Rosenstein, who also helped create Gchat during a stint at Google, and now leads a San Francisco-based company that improves office productivity, appears most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.

There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”

> It is very common for humans to develop things with the best of intentions
> that have unintended, negative consequences

Justin Rosenstein, creator of the ‘like’ button

But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it.

Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.

In 2007, Rosenstein was one of a small group of Facebook employees who decided to create a path of least resistance – a single click – to “send little bits of positivity” across the platform. Facebook’s “like” feature was, Rosenstein says, “wildly” successful: engagement soared as people enjoyed the short-term boost they got from giving or receiving social affirmation, while Facebook harvested valuable data about the preferences of users that could be sold to advertisers.
The idea was soon copied by Twitter, with its heart-shaped “likes” (previously star-shaped “favourites”), Instagram, and countless other apps and websites.

It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.

Justin Rosenstein, the former Google and Facebook engineer who helped build the ‘like’ button: ‘Everyone is distracted. All of the time.’ Photograph: Courtesy of Asana Communications

“One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.

It is revealing that many of these younger technologists are weaning themselves off their own products, sending their children to elite Silicon Valley schools where iPhones, iPads and even laptops are banned. They appear to be abiding by a Biggie Smalls lyric from their own youth about the perils of dealing crack cocaine: never get high on your own supply.

* * *

One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay.
They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.

Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.

“The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”.

He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.

Attendees of the 2017 Habit Summit might have been surprised when Eyal walked on stage to announce that this year’s keynote speech was about “something a little different”. He wanted to address the growing concern that technological manipulation was somehow harmful or immoral. He told his audience that they should be careful not to abuse persuasive design, and wary of crossing a line into coercion.

But he was defensive of the techniques he teaches, and dismissive of those who compare tech addiction to drugs. “We’re not freebasing Facebook and injecting Instagram here,” he said. He flashed up a slide of a shelf filled with sugary baked goods. “Just as we shouldn’t blame the baker for making such delicious treats, we can’t blame tech makers for making their products so good we want to use them,” he said.
“Of course that’s what tech companies will do. And frankly: do we want it any other way?”

> We’re not freebasing Facebook and injecting Instagram here

Nir Eyal, tech consultant

Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.

Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”

But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?

Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”

Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.

A graduate of Stanford University, Harris studied under BJ Fogg, a behavioural psychologist revered in tech circles for mastering the ways technological design can be used to persuade people.
Many of his students, including Eyal, have gone on to prosperous careers in Silicon Valley.

Tristan Harris, a former Google employee, is now a critic of the tech industry: ‘Our choices are not as free as we think they are.’ Photograph: Robert Gumpert/The Guardian

Harris is the student who went rogue; a whistleblower of sorts, he is lifting the curtain on the vast powers accumulated by technology companies and the ways they are using that influence. “A handful of people, working at a handful of technology companies, through their choices will steer what a billion people are thinking today,” he said at a recent TED talk in Vancouver.

“I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.

It all began in 2013, when he was working as a product manager at Google, and circulated a thought-provoking memo, A Call To Minimise Distraction & Respect Users’ Attention, to 10 close colleagues. It struck a chord, spreading to some 5,000 Google employees, including senior executives who rewarded Harris with an impressive-sounding new job: he was to be Google’s in-house design ethicist and product philosopher.

Looking back, Harris sees that he was promoted into a marginal role. “I didn’t have a social support structure at all,” he says.
Still, he adds: “I got to sit in a corner and think and read and understand.”

He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.

> I have two kids and I regret every minute that I’m not paying attention to
> them because my smartphone has sucked me in

Loren Brichter, app designer

The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.

Harris believes that tech companies never deliberately set out to make their products addictive.
They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.

A friend at Facebook told Harris that designers initially decided the notification icon, which alerts people to new activity such as “friend requests” or “likes”, should be blue. It fit Facebook’s style and, the thinking went, would appear “subtle and innocuous”. “But no one used it,” Harris says. “Then they switched it to red and of course everyone used it.”

Facebook’s headquarters in Menlo Park, California. The company’s famous ‘likes’ feature has been described by its creator as ‘bright dings of pseudo-pleasure’. Photograph: Bloomberg via Getty Images

That red icon is now everywhere. When smartphone users glance at their phones, dozens or hundreds of times a day, they are confronted with small red dots beside their apps, pleading to be tapped. “Red is a trigger colour,” Harris says. “That’s why it is used as an alarm signal.”

The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.

It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo.
Sometimes it’s just an ad.”

* * *

The designer who created the pull-to-refresh mechanism, first used to update Twitter feeds, is Loren Brichter, widely admired in the app-building community for his sleek and intuitive designs.

Now 32, Brichter says he never intended the design to be addictive – but would not dispute the slot machine comparison. “I agree 100%,” he says. “I have two kids now and I regret every minute that I’m not paying attention to them because my smartphone has sucked me in.”

Brichter created the feature in 2009 for Tweetie, his startup, mainly because he could not find anywhere to fit the “refresh” button on his app. Holding and dragging down the feed to update seemed at the time nothing more than a “cute and clever” fix. Twitter acquired Tweetie the following year, integrating pull-to-refresh into its own app.

Since then the design has become one of the most widely emulated features in apps; the downward-pull action is, for hundreds of millions of people, as intuitive as scratching an itch.

Brichter says he is puzzled by the longevity of the feature. In an era of push notification technology, apps can automatically update content without being nudged by the user. “It could easily retire,” he says. Instead it appears to serve a psychological function: after all, slot machines would be far less addictive if gamblers didn’t get to pull the lever themselves. Brichter prefers another comparison: that it is like the redundant “close door” button in some elevators with automatically closing doors. “People just like to push it.”

All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says.
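The slot machine dynamic Harris and Brichter describe is what behavioural psychologists call a variable-ratio reward schedule: each pull might pay off, so the pulling itself becomes compulsive. A minimal sketch in Python of that schedule – the reward probability, payoff strings and seed here are illustrative assumptions, not values from any real app:

```python
import random

def refresh(p_reward=0.3, rng=random.Random(42)):
    """Simulate one pull-to-refresh: the payoff arrives unpredictably."""
    # A variable-ratio schedule: reward on a random fraction of pulls.
    return "new likes!" if rng.random() < p_reward else "nothing new"

# On a fixed schedule the outcome of each pull would be predictable;
# here, any given pull might be the one that pays off.
pulls = [refresh() for _ in range(10)]
print(pulls.count("new likes!"), "rewards in", len(pulls), "pulls")
```

The point of the toy model is only that unpredictability, not the reward itself, drives repetition – the same property that makes gamblers keep pulling the lever.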
Brichter has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to messaging only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.

“Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”

Not everyone in his field appears racked with guilt. The two inventors listed on Apple’s patent for “managing notification connections and displaying icon badges” are Justin Santamaria and Chris Marcellino. Both were in their early 20s when they were hired by Apple to work on the iPhone. As engineers, they worked on the behind-the-scenes plumbing for push-notification technology, introduced in 2009 to enable real-time alerts and updates to hundreds of thousands of third-party app developers. It was a revolutionary change, providing the infrastructure for so many experiences that now form a part of people’s daily lives, from ordering an Uber to making a Skype call to receiving breaking news updates.

Loren Brichter, who in 2009 designed the pull-to-refresh feature now used by many apps, on the site of the home he’s building in New Jersey: ‘Smartphones are useful tools, but they’re addictive … I regret the downsides.’ Photograph: Tim Knox/The Guardian

But notification technology also enabled a hundred unsolicited interruptions into millions of lives, accelerating the arms race for people’s attention. Santamaria, 36, who now runs a startup after a stint as the head of mobile at Airbnb, says the technology he developed at Apple was not “inherently good or bad”.
“This is a larger discussion for society,” he says. “Is it OK to shut off my phone when I leave work? Is it OK if I don’t get right back to you? Is it OK that I’m not ‘liking’ everything that goes through my Instagram screen?”

His then colleague, Marcellino, agrees. “Honestly, at no point was I sitting there thinking: let’s hook people,” he says. “It was all about the positives: these apps connect people, they have all these uses – ESPN telling you the game has ended, or WhatsApp giving you a message for free from your family member in Iran who doesn’t have a message plan.”

A few years ago Marcellino, 33, left the Bay Area, and is now in the final stages of retraining to be a neurosurgeon. He stresses he is no expert on addiction, but says he has picked up enough in his medical training to know that technologies can affect the same neurological pathways as gambling and drug use. “These are the same circuits that make people seek out food, comfort, heat, sex,” he says.

All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”

That, perhaps, is the problem. Roger McNamee, a venture capitalist who benefited from hugely profitable investments in Google and Facebook, has grown disenchanted with both companies, arguing that their early missions have been distorted by the fortunes they have been able to earn through advertising.

> It’s changing our democracy, and it’s changing our ability to have the
> conversations and relationships we want

Tristan Harris, former design ethicist at Google

He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention.
“Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”

That would be a remarkable assertion for any early investor in Silicon Valley’s most profitable behemoths. But McNamee, 61, is more than an arm’s-length money man. Once an adviser to Mark Zuckerberg, 10 years ago McNamee introduced the Facebook CEO to his friend, Sheryl Sandberg, then a Google executive who had overseen the company’s advertising efforts. Sandberg, of course, became chief operating officer at Facebook, transforming the social network into another advertising heavyweight.

McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”

Google’s headquarters in Silicon Valley. One venture capitalist believes that, despite an appetite for regulation, some tech companies may already be too big to control: ‘The EU recently penalised Google $2.42bn for anti-monopoly violations, and Google’s shareholders just shrugged.’ Photograph: Ramin Talaie/The Guardian

But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet? McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
“The EU recently penalised Google $2.42bn for anti-monopoly violations, and Google’s shareholders just shrugged,” he says.

Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”

* * *

James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.

Williams, 35, left Google last year, and is on the cusp of completing a PhD at Oxford University exploring the ethics of persuasive design. It is a journey that has led him to question whether democracy can survive the new technological age.

He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”

That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.

He embarked on several years of independent research, much of it conducted while working part-time at Google.
About 18 months in, he saw the Google memo circulated by Harris and the pair became allies, struggling to bring about change from within.

> It is not inherently evil to bring people back to your product. It’s
> capitalism

Chris Marcellino, former Apple engineer

Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day”.

“Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.

The same forces that led tech firms to hook users with design tricks, he says, also encourage those companies to depict the world in a way that makes for compulsive, irresistible viewing. “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”

That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.

Tech and the rise of Trump: as the internet designs itself around holding our attention, politics and the media have become increasingly sensational. Photograph: John Locher/AP

In the wake of Donald Trump’s stunning electoral victory, many were quick to question the role of so-called “fake news” on Facebook, Russian-created Twitter bots or the data-centric targeting efforts that companies such as Cambridge Analytica used to sway voters.
But Williams sees those factors as symptoms of a deeper problem.

It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.

Williams was making this case before the president was elected. In a blogpost published a month before the US election, Williams sounded the alarm bell on an issue he argued was a “far more consequential question” than whether Trump reached the White House. The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.

Williams saw a similar dynamic unfold months earlier, during the Brexit campaign, when the attention economy appeared to him biased in favour of the emotional, identity-based case for the UK leaving the European Union. He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.

All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.

It is against this political backdrop that Williams argues the fixation in recent years with the surveillance state fictionalised by George Orwell may have been misplaced.
It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.

Since the US election, Williams has explored another dimension to today’s brave new world. If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?

“The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”
