We speak of the great technological revolutions of history in terms of the machine, not the human. We forget the millions of workers who operated the machines that powered our industrial revolution, and we neglect to acknowledge the billions of us who are simultaneously the machine, the market, and the laboratory of the digital revolution.
Without a doubt, the technological growth of the past quarter century has given us, as a species, capabilities we could once only have written about in science-fiction terms. Had you told a child of the 1980s that within their lifetime they would carry the total sum of humanity’s knowledge on a portable supercomputer, you would have been laughed at; but here we are. This revolution has come at a cost, however: to our psychology, our wellbeing, our democracy, and even the structure of society itself.
As philosopher and computer scientist Jaron Lanier notes in ‘Ten Arguments For Deleting Your Social Media Accounts Right Now’, “What might once have been called advertising must now be understood as continuous behaviour modification on a titanic scale… The core process that allows social media to make money and that also does the damage to society is behaviour modification. Behaviour modification entails methodical techniques that change behavioural patterns in animals and people. It can be used to treat addictions, but it can also be used to create them.”
To learn more about how technology has stolen our attention, and what we can do about it, I spoke to James Williams (writer and researcher on the philosophy and ethics of technology, and author of ‘Stand Out of Our Light’), Jamie Bartlett (author, and Director of the Centre for the Analysis of Social Media) and Professor Adam Alter (author, and Associate Professor of Marketing at New York University’s Stern School of Business).
Q: To what extent are we designing technologies in ways that resonate with our goals?
[James Williams] Digital technologies are now an unavoidable part of the infrastructure of life and society, and we have the capacity to design them in ways that promote, understand and respect people’s goals and values in ways we simply couldn’t in the pre-digital era. With many of these technologies, though, our deeper human goals get treated as secondary, or are ignored altogether, in favour of lower ‘engagement’ goals such as views or clicks. This is, in large part, because many of the technologies that direct our attention and action serve the business model of advertising.
If you think about what people actually care about in life, what they actually want to do with themselves… the things we’ll regret on our deathbeds if we haven’t done… those are the things technology exists to help us do. At its best, it can enable us to live better lives.
I’m very pro-technology in the sense that I believe in its possibilities, but lately I see it as a car that’s veering off the road and is about to hit a tree, and I want to help steer it back onto the road. We have the ability to bring technology into better alignment with our goals and values, but it concerns me that we still seem to be heading in another direction.
The regulations and laws we have in place around information technologies largely still assume an environment of information scarcity, but now we live in a world of information abundance, one we’re ill-prepared for at a psychological as well as a societal level. Our informational environment has flipped from scarcity to abundance in much the same way our food environment has. The heuristics we developed on the plains of Africa in an environment of food scarcity – caloric retention, the ability to easily notice food, and so on – served us well there. But in the environment of Netflix, Ben & Jerry’s ice cream and La-Z-Boy recliners, those same heuristics give us less than ideal outcomes.
Q: To what extent are apps and platforms designed to keep us addicted?
[Prof. Adam Alter] Our apps and platforms are addictive by design. We know this both because the tech titans behind them have admitted it publicly, and because the dominant apps and social media platforms use the same suite of techniques that are well-proven to ensnare us, from variable reward schedules (rewards that arise at unpredictable intervals—e.g., likes on a social media post) to arbitrary goals (e.g., Snapchat’s “streak” feature, which rewards users for communicating with each other at least once a day).
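To make the mechanics of a variable reward schedule concrete, here is a minimal simulation sketch in Python. The checking count, payout probability and payout sizes are hypothetical, chosen purely for illustration; the point is that because any given check might pay off, every check is reinforced, hits and misses alike:

```python
import random

def check_feed(n_checks, reward_prob=0.3, max_likes=20):
    """Simulate a user checking a feed n_checks times under a
    variable reward schedule: each check pays out unpredictably."""
    rewards = []
    for _ in range(n_checks):
        if random.random() < reward_prob:
            # A hit: an unpredictable number of new likes or comments.
            rewards.append(random.randint(1, max_likes))
        else:
            # A miss: nothing new. But the next check might pay off,
            # which is exactly what keeps the checking habit alive.
            rewards.append(0)
    return rewards

print(check_feed(10))  # e.g. [0, 0, 14, 0, 3, 0, 0, 0, 9, 0]
```

Contrast this with a fixed schedule (say, a reward on every fifth check): once the pattern is learned, the checks in between lose their pull. It’s the unpredictability, in both the timing and the size of the reward, that makes the schedule so hard to resist.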
Q: How much is technology distracting us?
[Jamie Bartlett] At an individual level, we know people check their phones and devices extremely regularly: the average UK adult checks their phone a remarkable 200-300 times a day, roughly once every three to five waking minutes. We also hear anecdotally that people can’t read as much, or as deeply, without being distracted.
Distraction is one of those epidemics that we didn’t realise existed until quite recently.
At a societal level, it’s inevitable that this distraction is affecting our ability to communicate with each other, and lowering focus and attention across the whole population.
Q: How does technology distract us from our goals?
[James Williams] We don’t yet have the language to talk about this clearly, so I’ll preface my answer with that caveat. In one sense, though, it can be understood as the undermining of the human will. It starts with distraction in the moment, where our ability to focus is fragmented – but it runs deeper, into the way technology distracts us into living according to certain habits, values and norms. At an even deeper level, it can frustrate our ability to figure out what we want in the first place.
The political earthquakes of the past couple of years have brought to the forefront of societal awareness just how powerful the persuasion industry has become. This project, dedicated to advancing the effectiveness of commercial persuasion, has of course been applied to political persuasion, with consequences for society that we couldn’t have imagined.
If we can’t give attention to the right things, we can’t do the right things, have the right conversations, or reach the right outcomes. Our technological distraction is a first order political problem, worldwide. If digital media has become the lens through which we understand and engage with others, we need to figure out how to make that the right kind of lens.
[Prof. Adam Alter] Developers build a series of hooks into their products that are designed to make those products irresistible. I described variable feedback and arbitrary goals above; the smartest developers also make their experiences social, combining elements of social support and social competition to pull users back into the platform. That’s true of social media, of emailing and messaging (it’s not acceptable, socially, to stop using either one), and of video games.
Q: What are the ethical implications of concentrated political power in technology companies?
[James Williams] The concentration of power in technology corporations is a moral and political problem for which we simply have no precedent. More people use Facebook than speak English, for example, so the implications of Facebook, as just one platform, are at the scale of language itself.
The terms that have arisen to talk about types of undesirable persuasion via these technologies are disturbingly vague. For instance, election ‘meddling.’ Meddling seems like a terrible word for describing the undermining of a people’s political will. The kid at the back of the class meddles when he should be paying attention… There’s a weird fuzziness to these kinds of words because we don’t yet have a coherent grammar of influence.
The absence of a language for and understanding of technological influence is also creating the space for knee-jerk responses that inadvertently give these companies even more power than they had before. When people call on Facebook to ban certain types of content, they’re giving it the power to determine what counts as truth, what speech is appropriate and what is not. These are decisions properly made in the domain of society and political institutions, not corporations.
We give a lot of tacit legitimacy to these platforms, and it’s an urgent political question for our time. For example, even if you don’t use Twitter, you’ll notice that so many news stories now cite a tweet or a response to a tweet as a primary source of information about a thing, not the thing itself. Even the infrastructure of journalism is changing in a way that ought to concern everyone.
[Prof. Adam Alter] If you know you’re creating a platform that is genuinely addictive; that will rob people of hours and days of their lives; that is designed to be hard to resist; and that’s designed that way for profit, it’s hard to argue this is ethical by any definition. Some developers argue that people should exert self-control, but if you develop a platform that hooks the majority of the population, you’re obviously stacking the deck against individual consumers.
Q: What are the societal consequences of information overload and distraction?
[Jamie Bartlett] There are two slightly different problems wrapped up in the same general issue of constant device-checking. Imagine that checking is the new epidemic: the symptom is people checking their devices constantly to learn who’s said what, what notifications they have, who’s replied to what. The result is information overload: we are so bombarded with information that we can’t make sense of it, and so we become tribal, emotional and irrational, surviving on heuristics rather than facts. You can see this reflected in how our politics is changing. It’s hard to identify what’s driving what, but we do know that politics is becoming more tribal, emotional and polarized – and that is, in part, because of information overload.
The pure fact of being constantly distracted also has consequences. In the age of the written word, there was a cultural predisposition towards the logical ordering of information: we thought, in a sense, in long form; carefully, thoughtfully, with concentration and focus. We no longer concentrate for such long spells, and the inevitable result is that the quality of our debate goes down. We’re more irritable, more swayed by the crowd, and less likely to pay attention to what politicians are really saying; and that, to me, is a pretty good description of how our politics feels.
[Prof. Adam Alter] Our boredom threshold has declined to the point where we’re unable to stand idle in an elevator for ten seconds without pulling out our phones. There’s also an epidemic of social avoidance, particularly among younger people, as the skills required for face-to-face interaction are more demanding and need to be honed across time. If you spend all your time behind a screen, you aren’t honing those skills, and the idea of communicating with a person in real time who happens to be in the same room seems overwhelming.
Q: How has the founding myth of technology companies served to enable the distraction we now face?
[Jamie Bartlett] The founding myth is that these are technology companies or social platforms, rather than advertising companies, which is, essentially, what they are now. Perhaps if we had known that from the start, we would have been faster to react to the techniques they use: profile creation, the targeting of individuals, targeted adverts, and tracking.
The mood is changing though. More and more people understand what is going on, and many of them are not happy about it.
Q: How could we better design apps & platforms in a mental-health positive way?
[Prof. Adam Alter] The best thing developers can do—and some are already starting to do this—is to eschew these hooks and instead build devices that respect consumers and their time. Of course this isn’t likely to happen without pressure, because it’s easier to make money when you capture people’s attention, and impossible to win the race for attention when you aren’t engaging in the arms race that drives your competitors.
Q: How can we make a meaningful change against the attention economy?
[James Williams] Right now our relationship with this stuff is more or less that of serfdom. In a way, we’re just tilling their attentional fields while they give us enough benefits to subsist on, but not enough to really thrive.
The people I talk to in the tech industry do seem to care about the ethics of all this, but their conceptions of those ethics tend to be pretty limited. The most common question I get asked by designers is ‘where’s the line…?’ – ‘…if I’m trying to hook people on my app, how far can I go before I’m hooking people too much and it becomes an addiction…’ There’s a desire for clear boundaries, a desire for a very legalistic conception of ethics, but the reality is far fuzzier.
My mentor at Oxford, Luciano Floridi, makes the distinction between ethics and infraethics. You can look at the ethics of individual actors, like a designer or a business leader, but the infraethics refers to the ethics of the infrastructure; whether that be organisational structure, business models, funding dependencies, or the competitive environment of an industry as a whole. It’s the degree to which the environment fosters the right ethical decisions to begin with.
All of these things need to change – the business models, the mechanisms of financing and growing these companies, their habits of design, and so on. All of this ultimately needs to be rooted in a clear sense of what we want technology to do for us at a fundamental level, and how we can bake that existential purpose into the design of our technology.
But I’m pretty convinced that until we have a better language for talking about the nature of this problem, we can’t even begin to have a coherent conversation about any of it. The whole thing is already getting derailed by imprecise talk about ‘addiction,’ the ‘think of the children’ register of moral panic, and petty retributivist thinking that’s more focused on status-downranking the heads of tech companies than actually improving people’s lives.
My number one fear in all of this is that resistance against the attention economy will just get swallowed up by the forces of the attention economy itself – and so far that’s exactly what seems to be happening.
Q: Why are people so surprised by the conduct of big tech corporations?
[James Williams] This isn’t, of course, isolated to tech corporations. But many in the tech industry have set fairly high moral standards for themselves in the past, and it’s psychologically rewarding to point out how someone isn’t meeting their own standards and to judge them for it. It’s also hard not to be outraged when the news is telling us we ought to be outraged about something.
Another piece is that most of this infrastructure of persuasion was built and has operated in the background for so long, and was suddenly brought into the foreground by the outcomes of the Brexit referendum and the 2016 US election. Now you have all these people saying, ‘oh, we didn’t realise this was happening… this is bad!’ and people who work in the industry replying, ‘wait, this isn’t new… why the outrage now?’ And it’s true: when similar marketing methods were used during the Obama campaigns, the agencies involved won awards. The logic seems to be that if the outcome isn’t good, it must have been achieved by means that weren’t good. Let me be clear, I’m not a fan of the means, but we have to detach the means from the political outcome.
Then there’s the further fact that most people who use these free services don’t understand the fundamental nature of the transaction they’re entering into, that they’re paying for the service with their attention.
The key question for me is what those persuasive mechanisms and dynamics are that are undermining human will at an individual and collective level – and precisely how those mechanisms are incompatible with the assumptions and requirements of democracy. I feel like that’s the clear conversation we’re missing. What kinds of psychological manipulation are okay as business models, given that they’ll almost certainly be used for political persuasion as well?
Q: Did technology companies set out to manipulate people like this?
[Jamie Bartlett] In the 1990s and 2000s we saw the decisions made that led to these tools (Facebook, YouTube and Google) becoming free for people. I think these decisions were made with ostensibly positive and democratizing aspirations; companies wanted to allow as many people as possible to access their platforms, and charging was seen as a way of excluding people. The problem was that by making it free, they still had to pay for it somehow. The advertising model was almost accidental, built on the premise that people watch TV and listen to the radio for free, but put up with ads. I don’t think anyone really appreciated how this would play out, and how damaging this principle would become. It’s a tragedy of naivety rather than malevolence.
Q: Do we need to see attention as a resource?
[Jamie Bartlett] At an individual level, people increasingly recognize that their quality of life and powers of concentration are being harmed; this is especially true when you see parents worrying about their kids.
We are also seeing more apps being built to help people manage their addictions and their internet use more effectively; perhaps tech companies are starting to recognize that this is a consumer problem, and it’s not good for their brands to be seen as developing addictive technologies.
I hope that we begin to see our attention as a very important resource, but whether we do or not remains to be seen.
Q: To what extent should we see addiction as a public health issue?
[Jamie Bartlett] What we’re seeing around distraction feels like the early days of tobacco research, when studies were beginning to show the long-term negative effects of smoking. The problem for us is that it’s still early; we simply haven’t had long enough to work out what’s happening and what the consequences are. It’s entirely possible that studies will conclude that technological distraction is a public health issue, but at the very least, governments, schools and parents have to be more proactive in managing internet use, particularly for kids. I hope we start to see companies and governments making suggestions about best practice, offering tools, and making it easier to limit internet and app use; perhaps regulation may even step in here too.
Q: What can we do as individual users to direct technology the right way?
[James Williams] There are no maps here, only compasses. In specific terms it’s very hard to know what actions will have much causal efficacy, aside from maybe using and evangelising the hell out of ad-blockers. In broad terms, though, what you can do is resist the exploitation of your attention at every turn, reject advertising as the affront to the integrity of your mind that it is, demand great accountability from those who wield great power over your thought and behaviour, suspect your outrage, lean into awe and wonder wherever you can find it, and remember that from the perspective of the system, every click is a vote that what you’re looking at is important. And remember that we are still just getting started with all this. The web isn’t even 10,000 days old yet.
[Jamie Bartlett] We have to understand that all of us have a role to play; if more of us log off and use technology-limiting applications and ad-blockers, the social media platforms will have to change their business models. We have the power to drive change in the market through our use.
Jaron Lanier has argued for switching off our social media accounts completely, but my take is more of a middle ground, where we download ad-blockers and use more privacy-enhancing software.
We also have to realise that however we push back against these companies, we will create new and unexpected problems, and we have to ask whether the new problems are worse than the old ones. For example, more and more people are starting to use anonymizing browsers such as Tor, and that creates problems for law enforcement; but I happen to think those problems are not as bad as the current challenges to our democracy and society. These, to me, are more pressing than law enforcement finding it somewhat harder to follow people around the internet.
[Prof. Adam Alter] The best thing we can do as individual consumers is to instil habits that remove us from screens for at least part of the day. Begin with dinner time—make sure your devices are as far away as possible while you eat. Perhaps expand that to the hour or two before bed and the hour after you wake up in the morning. It’s difficult at first but people who do this consistently report feeling happier and looking forward to that tech-free part of the day. It’s important to use that time to exercise or have face-to-face conversations with friends or loved ones, to spend time outdoors, or otherwise do things that are removed from screens.
[bios]
James Williams is a writer and researcher focused on the philosophy and ethics of technology. He’s particularly interested in the question of how we can live free and authentic lives in environments of highly persuasive design.
His first book, ‘Stand Out of Our Light: Freedom and Resistance in the Attention Economy,’ was recently published by Cambridge University Press as part of the inaugural Nine Dots Prize, which he won last year.
Previously, James worked at Google for over ten years, where he received the Founder’s Award, the company’s highest honor, for his work on search advertising. Before that, he earned a master’s degree in product design engineering and an undergraduate degree in literature.
He has also been a member of Balliol College, Oxford; a visiting researcher at the Oxford Uehiro Centre for Practical Ethics; a tutor in the Oxford Computer Science department; and a visiting fellow at the Centre for Research in the Arts, Social Sciences, & Humanities (CRASSH), University of Cambridge. James is also a co-founder of the Time Well Spent campaign, an effort that aims to bring the design ethos of our digital technologies into greater alignment with our deeper human interests.
Jamie Bartlett is Director of the Centre for the Analysis of Social Media. His principal research interests are:
- The use of social media by political movements and law enforcement agencies
- Social media monitoring and analytics
- Internet culture, dark net, crypto-currencies
- Surveillance technology, machine learning, automated sentiment analysis and big data
- Privacy, law, social media research ethics, reform to RIPA 2000
Jamie’s work focuses on the ways in which social media and modern communications and technology are changing political and social movements, with a special emphasis on terrorism and radical political movements.
The Centre for the Analysis of Social Media is a collaboration between Demos and the University of Sussex. The Centre combines automated data extraction and sentiment analysis with social science statistics, analysis and ethics, to produce insightful and robust policy research.
In 2014, Jamie published The Dark Net, a book about hidden internet subcultures, with William Heinemann. He writes frequently in a number of national and international outlets, including The Telegraph, The Guardian, and the Sunday Times. He is currently working on his second book, ‘Radicals’, which will be released in 2017.
Prior to working for Demos, Jamie was a research associate at the international humanitarian agency Islamic Relief and conducted field research in Pakistan and Bangladesh. Jamie holds Master’s Degrees from the London School of Economics and the University of Oxford.
Adam Alter is an Associate Professor of Marketing at New York University’s Stern School of Business, with an affiliated appointment in the New York University Psychology Department.
Adam’s academic research focuses on judgment and decision-making and social psychology, with a particular interest in the sometimes surprising effects of subtle cues in the environment on human cognition and behavior. His research has been published widely in academic journals, and featured in dozens of TV, radio and print outlets around the world.
He received his Bachelor of Science (Honors Class 1, University Medal) in Psychology from the University of New South Wales and his M.A. and Ph.D. in Psychology from Princeton University, where he held the Charlotte Elizabeth Procter Honorific Dissertation Fellowship and a Fellowship in the Woodrow Wilson Society of Scholars.
Adam is the New York Times bestselling author of two books: Irresistible (March 2017), which considers why so many people today are addicted to so many behaviors, from incessant smartphone and internet use to video game playing and online shopping, and Drunk Tank Pink (2013), which investigates how hidden forces in the world around us shape our thoughts, feelings, and behaviors.
He has also written for the New York Times, New Yorker, Washington Post, Atlantic, WIRED, Slate, Huffington Post, and Popular Science, among other publications. He has shared his ideas on NPR’s Fresh Air, at the Cannes Lions Festival of Creativity, and with dozens of companies, including Google, Microsoft, Anheuser-Busch, Prudential, and Fidelity, and with several design and ad agencies around the world.[/bios]