If you want to make big bucks on the Internet, there are three things you need: data, advertisements, and purpose. We, the current generation of students, are the first to grow up never having known a world without the Internet. We’re still learning what the impact is on our mental wellbeing. So in a way we are guinea pigs, although some people call us by another name: the Snowflake Generation. This is a story of anxiety, addiction, and manipulation. This is the story of a world without choice.

In the end, it all boils down to money. Some technology companies take the traditional route of selling products. Apple sells computers and phones. Microsoft sells software and phones. Amazon – well, Amazon just sells everything. Yet the man who invented the World Wide Web, Tim Berners-Lee, didn’t create it to make money. He wanted to fashion a platform for sharing ideas and information, and he made it free for anyone to use. Even today many tech gurus hold the key belief that you shouldn’t have to pay for services on the Internet. That said, these people would also like to make a living. The big question is: how can tech companies make money while keeping their services free? The answer is a bummer – literally.

BUMMER is a humorous little acronym coined by computer scientist and philosopher Jaron Lanier. It stands for “Behaviours of Users Modified, and Made into an Empire for Rent”. It’s a business model, and it’s how companies like Facebook and Google, the main companies using BUMMER, generate income while keeping their services free for users. It’s simple. People like you and me use Facebook and Google, and we leave behind a trail of data. Algorithms then collect, or rather harvest, that data and assign the data versions of ourselves to groups. Perhaps you’re in a group that tends to message friends after watching a compilation of old Vines. (Useful for an advertiser trying to spread a message.) Or maybe people in your group are more likely to click on ads after watching videos of funny animals. It’s discreet manipulation, and it works. The algorithms work out how people act, but not why they act that way. Once the system knows how you behave, these BUMMER companies can say to advertisers, ‘Hey, we can make this user more likely to click your link just by showing them a clip of a funny animal’, for example.
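To make that process a little less abstract, here’s a toy sketch of the grouping-and-targeting step described above. It isn’t Facebook’s or Google’s real code; every event, cohort name, and number in it is invented for illustration. The point is only the shape of the logic: harvest behaviour, bucket similar users together, and report which bucket is most likely to click.

```python
# Toy illustration of the BUMMER loop: reduce each user's activity trail to a
# crude behavioural cohort, then measure how often each cohort clicks on ads.
# All events and cohort names below are invented for the example.
from collections import defaultdict

def assign_cohort(user_events):
    """Reduce a user's raw activity trail to a crude behavioural label."""
    if "watched_funny_animal_video" in user_events and "clicked_ad" in user_events:
        return "clicks_after_animal_videos"
    if "watched_vine_compilation" in user_events and "messaged_friend" in user_events:
        return "messages_after_vines"
    return "unclassified"

def click_rate_by_cohort(users):
    """Estimate how often each cohort clicks ads - the 'how', never the 'why'."""
    clicks, totals = defaultdict(int), defaultdict(int)
    for events in users:
        cohort = assign_cohort(events)
        totals[cohort] += 1
        clicks[cohort] += int("clicked_ad" in events)
    return {cohort: clicks[cohort] / totals[cohort] for cohort in totals}

# Three invented activity trails
users = [
    {"watched_funny_animal_video", "clicked_ad"},
    {"watched_funny_animal_video", "scrolled_feed"},
    {"watched_vine_compilation", "messaged_friend"},
]
print(click_rate_by_cohort(users))
# {'clicks_after_animal_videos': 1.0, 'unclassified': 0.0, 'messages_after_vines': 0.0}
```

Notice the sketch never needs to know why the funny-animal crowd clicks more often – only that it does.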

For Apple, Amazon, and Microsoft, what they sell is the product, and the customer is you. But companies using BUMMER sell you. You are the product for Facebook and Google. You are not their customer. You are what they’re flogging to external parties and advertisers you might not even have heard of. This is how some of the largest tech companies run – economics is driving manipulation. And the thing is, there isn’t a way to escape it. So our generation? It’s doomed.

Sometimes I like to pretend Mark Zuckerberg is evil. I imagine him sitting on a swivel chair, cat on lap, in front of a bank of monitors all beeping and flashing. The tip of a cigar glows in the low-lit room and Mark sighs contentedly, leaning back in his chair. On the wall is a dartboard with photos of stereotypical millennials pasted over the top. Mark feels good. He has all the power. And ah, yes, he is 100% evil.

The problem is, Zuckerberg probably isn’t evil. I doubt he or any other tech superstar has malicious motives. Many of them simply show their service has a purpose, collect data from the people who use it, and sell that data to advertisers. It’s how they make money. Zuckerberg has a knack for computers and coding, and he probably knows it. But he’s not a psychiatrist or a social historian, and that’s another thing we can assume he knows.

As you probably guessed from what BUMMER stands for, Jaron Lanier isn’t so keen on the business model. In his latest book he points out that the way it modifies your behaviour is not through obvious, up-front methods, but through tiny changes in your online experience that alter your behaviour for profit. The reason it’s behaviour manipulation, not modification, is that these companies never asked if they could alter your behaviour. Some people get quite scared about all of this. I suppose it’s never comforting to find out companies are storing your data and using it to manipulate you. Some people even read the terms of agreement before they install an app or sign up to a site. (What kind of sane and incredibly responsible people are they?) Weird. But the thing is, that won’t solve the problem. Even if you never sign up to any of the BUMMER sites, companies will continue to harvest your data and be able to manipulate you. Let me tell you how.

I wonder if you’ve walked through Piccadilly Circus since October 2017. The month is important because that’s when a particularly special advertising screen was switched on after a groundbreaking makeover. It’s called Piccadilly Lights, and for Landsec, the company that owns it, it represents a “landmark” in their history. It’s the largest screen of its type in Europe, but what makes it so incredible is that it’s watching you. Yep, there are two cameras within the screen itself that determine how old you are, your gender, and even your mood. It also has WiFi, so anyone passing by can interact with the screen (and give up some of their data). As soon as you turn your face away, the system forgets your specific face, but it keeps the data it gathered, without being able to connect it back to you. With this, it can tailor what’s on the screen to whoever is below. And it doesn’t stop there. It also recognises the brands of passing cars, and so adapts adverts to drivers too. Landsec attempts to reassure us with their Piccadilly Lights privacy policy. They write, “no images of your face are stored by the system”, as if what people should be afraid of is the hoarding of their data.

The thing is, a person’s raw data isn’t worth much alone. The NUS’ new app TOTUM uses a similar method to Piccadilly Lights. Ali Milani, the Vice President of Union Development at NUS, told Concrete the app “won’t harness individual data” but will store “collective data”. Of course, the NUS isn’t making money from this data, and they’re developing an Ethical Digital Charter to regulate their new app, which Milani believes will be “part of the solution” in the ongoing data protection debate.
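For the curious, here’s a rough sketch of what “collective data” with no stored faces might look like in practice for a screen like Piccadilly Lights. It’s purely illustrative – the categories, adverts, and matching rule are my own inventions, not Landsec’s system – but it shows how aggregate counts alone are enough to tailor what appears on the screen.

```python
# Illustrative only: keep aggregate counts of the crowd (no faces, no
# identities), then show whichever advert best matches the dominant segment.
from collections import Counter

def choose_advert(crowd_observations, adverts):
    """crowd_observations: list of (age_bracket, mood) tuples, already stripped
    of anything identifying. adverts: {name: (target_age_bracket, target_mood)}."""
    audience = Counter(crowd_observations)            # aggregate data only
    dominant_segment, _ = audience.most_common(1)[0]  # the biggest group below the screen
    for name, target in adverts.items():
        if target == dominant_segment:
            return name
    return "default_advert"

observations = [("18-24", "happy"), ("18-24", "happy"), ("45-54", "neutral")]
adverts = {"energy_drink": ("18-24", "happy"), "pension_plan": ("45-54", "neutral")}
print(choose_advert(observations, adverts))  # -> "energy_drink"
```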

Many sites store your data, whether they’re social media websites, streaming sites, or search engines. (If you want to see what Facebook has stored on you, go to this link: www.facebook.com/help/1701730696756992, or for Google, this one: google.com/takeout.) In truth, unless you’re doing something illegal, it doesn’t really matter how much of your data sites have stored. It’s what they do with it that you might be more afraid of.

In 2014 Facebook bought WhatsApp for $19bn. Users don’t pay for the app, there are no adverts, and every message is encrypted, right? Well, sort of. It was only in 2016 that the UK’s Information Commissioner’s Office compelled WhatsApp to stop sharing users’ data with Facebook, and the EU followed suit, fining Facebook €110m in 2017. Again, Jaron Lanier points out that BUMMER companies’ “wealth is made entirely of the data you gave them”. Some people may assume Facebook paid all that money, $19bn, just to gain access to users’ personal messages. But user manipulation isn’t only fuel for advertisers. It also reinforces addiction.

The BUMMER companies need data to survive. They need it to make sure they can manipulate you into altering your opinions or clicking on adverts. But they also need your data to ensure you’re highly addicted to their site. They want you to need them. They want you to crave them. And this is where a bit of light chemistry comes into play.

Dopamine is a neurotransmitter and, in popular culture, the pleasure chemical. For tech gurus, this pleasure chemical is part of their everyday vocabulary. Understanding dopamine seems to be a necessity in the field of technology. In Digital Behavioural Design, authors Dalton Combs and Ramsay Brown explain the purpose of the pleasure chemical:

The dopamine molecule is responsible for two things: putting a smile on the user’s face and inducing them to be more likely to do that behaviour again.

Dopamine isn’t just a turn of phrase. Many in Silicon Valley are starting to voice concern that BUMMER has a real, physical impact on users. The BUMMER algorithms keep users hooked by heightening emotions, and of course, the easiest emotions to stimulate are negative ones: paranoia, anxiety, and self-doubt. In 2017 Facebook’s first president, Sean Parker, told Axios: “We need to sort of give you a little dopamine hit every once in a while… you’re exploiting a vulnerability in human psychology… [and] God only knows what it’s doing to our children’s brains.”

Yet dopamine isn’t evil; it’s natural. And seeking pleasure is a key principle of survival. If you’re thirsty, you want to drink. Hungry? You want to eat. Aroused? You want to have sex. The body knows we need to drink, eat, and procreate, so it associates pleasure with all these activities, rewarding us for performing the necessities of survival. In essence, we’re already addicted to the actions that permit our existence, and this survival mechanism is exactly the susceptibility BUMMER companies tap into.

In terms of technology, the pleasure induced by dopamine is the feeling of receiving a reward such as a ‘like’, a ‘match’, or even an unexpected email. That it is unexpected is key. Companies can’t hand out pleasure left, right, and centre. To keep users hooked, rewards must surprise them. It’s the lure of pleasure and the possibility of reward that retains people’s attention. This is how BUMMER addiction begins, helped along by something we see, hear, and feel each day: notifications. Users start to wonder how to boost their rewards. Many become obsessed with finding the secret to reaping more bouts of pleasure. It’s the same way a slot machine works: you win a little, then spend the rest of your time trying to find the formula to win big. In both cases, you’ll probably never find one. Addiction isn’t the desire for satisfaction; it is the need for it. You search for pleasure, find it, and crave more of it. It’s a feedback loop. On BUMMER sites you might feel a short burst of pleasure, but don’t kid yourself. Dopamine might be natural, but that doesn’t mean it’s good for you, and BUMMER’s rewards aren’t healthy at all. Addiction and anxiety come hand in hand. Each emboldens the other, again and again, over and over. The effects they have on our lives are more than serious. Their implications are grave.
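If you want to see how little code that slot-machine logic takes, here’s a minimal sketch of intermittent reinforcement – the unpredictable drip-feed of rewards described above. It’s a generic illustration, not any platform’s actual notification system, and the probabilities and messages are made up.

```python
# Intermittent reinforcement in miniature: hold back the 'likes' and release
# them at random, so each check of the phone might - or might not - pay off.
import random

def maybe_notify(pending_likes, reward_probability=0.3):
    """Release the held-back rewards only some of the time."""
    if pending_likes and random.random() < reward_probability:
        released = list(pending_likes)
        pending_likes.clear()
        return f"You have {len(released)} new likes!"  # the dopamine hit
    return None                                        # nothing... check again later

pending = ["like_from_alex", "like_from_sam"]
for check in range(5):                 # the user keeps pulling the lever
    print(f"check {check}: {maybe_notify(pending)}")
```

Hold the rewards back, release them unpredictably, and every glance at your phone becomes a pull of the lever.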

Sometimes I wonder if I should stop using Facebook and Google. Then I wake up. Get real. I need Facebook. I need Google. It’s too hard to stop using these BUMMER sites. It would mean no Google search, Google Maps, Gmail, Google Docs, YouTube, or Android. (That one is for all you people using Android phones over Apple.) And no Facebook? That’s goodbye to Messenger, events organised on Facebook, remembering people’s birthdays, as well as WhatsApp and Instagram. If these were separate companies with separate functions it would be easier to quit. I could drop one without dropping them all. But Facebook and Google are too sneaky for that. It’s nearly impossible to stop using their sites because you’ll always need at least one of the services they provide. These companies aren’t necessary for survival, but we are addicted to them. They are the tools we need to continue our preferred way of life.

BUMMER companies bring out the worst in people. It’s likely you’re already addicted thanks to BUMMER. I know I am. There are great aspects to the Internet, but wherever you go online you’ll often find some extreme views. Between thirsting after more rewards, feeling guilty for wasting time online, and wondering if you’re good enough compared to a photoshopped icon, you might stumble on groups of people with frightening opinions. What makes the Internet a scary place is how these people came to hold such outlandish views. “Paranoia [is]… an efficient way of corralling attention,” writes Lanier. (That’s my headline sorted, then.) For the BUMMER algorithms it’s simple: encourage extreme opinions and paranoia to keep users hooked. It doesn’t matter whether the source of this paranoia is true or false. If a method works, the algorithms will use it.

There is little doubt that social media, and more specifically the BUMMER model used by Facebook and Google, has a negative effect on our mental wellbeing. A spokesperson for UEA told Concrete that “social media can add to the pressure students feel at university”. The CEO of Anxiety UK, Nicky Lidbetter, agrees. After a study into the relationship between technology and anxiety, she said technology is “a tipping point, making people feel more insecure and more overwhelmed”. Time spent online has an adverse effect on mental wellbeing, and we’re using the Internet more and more.

Adults who use the Internet spend an average of 24 hours a week online, according to a 2018 Ofcom report. What’s worse is that 25 to 34-year-olds spend 29 hours a week online, and 16 to 24-year-olds an average of 34.3 hours – almost a day and a half of screen time. In March this year 18 to 24-year-olds spent the most time online per day of any age group, but what really matters is which sites people visit. In the same month, the UK population spent most of their online time on sites owned by two companies: Google and Facebook, the BUMMER companies. Being on these sites means BUMMER algorithms are manipulating you constantly, to the detriment of your mental wellbeing.

Mental health problems at universities are a growing concern. While it’s true that in the UK 45 to 49-year-olds have the highest suicide rate, three-quarters of adult mental health problems begin before the age of 24. What ties mental health problems to students even more closely is that students aged 20 to 24 have a lower level of mental wellbeing than non-students of the same age, and it’s deteriorating year on year.

More people are becoming conscious of mental health problems, and rightly so, because everyone has a varying level of mental wellbeing. UEA explain in their current Mental Health and Wellbeing Strategy that there is “a distinction between ‘mental wellbeing’, which we all have, and a ‘mental health problem’, which only some of us [have]”. Sadly, the number of UEA students with suicidal thoughts per year increased from 29 in 2012/13 to 48 in 2016/17. The developing concern across the UK has prompted a mental health charter this year, led by the charity Student Minds and Universities Minister Sam Gyimah. Thankfully the number of student suicides at UEA over the past few years is very low, but across the UK the number is rising.

Although mental health problems and suicide often stem from a variety of factors, it’s shocking to find out how 18 to 34-year-olds feel when they don’t have Internet access. The same Ofcom report states that 15% consider themselves more productive without the Internet and 19% say they’re less distracted, but more concerning is that 20% feel stressed without it and 39% feel cut off from the rest of the world. It’s not good press for Facebook and Google, but I doubt they’ll care too much. This summer Facebook rolled out a large advertising campaign in the UK to convince users that it values them for the friendships they make above anything else. I suppose it’s true: the more people on Facebook, the more friends you can make. It also means more data for Facebook to harvest and more people for it to manipulate. I must admit they didn’t say it quite like that in the ads.

It’s almost impossible to stop using Facebook and Google. To anyone managing it, tell me your secret. The BUMMER companies have a hold over billions of people, and we’re the first generation with little memory of a time without them. Some people call us the Snowflake Generation because, apparently, we’re delicate, or because we ask for help. But we live in a world centred on manipulation. It’s more than okay to ask for help; realistically, it’s the normal thing to do. A spokesperson for UEA told Concrete that “the ‘snowflake’ generation isn’t a term we use because it dismisses students’ concerns without due consideration and care.” But you know what? I’ll allow you to call me a snowflake, because maybe I am one. I’d rather be a snowflake comfortable pushing back against BUMMER’s manipulation than someone who takes it on the chin, stays silent, and faces the real-life consequences.

Snowflake or guinea pig, it makes no difference. One of the many causes of mental health problems within the student population is the BUMMER model. Companies using the model, such as Facebook and Google, may find ways to bypass any future rules and regulations. It wouldn’t be such a surprise. So the solution isn’t more red tape. The solution is something much more difficult to sidestep. The best bet is public knowledge. In the same way we all know an email from a far-flung prince asking for money is probably a scam, society has to step in and educate future generations about technology. Short-term actions just won’t do. You can try to live without social media. You can try to live without Google. You can move away to a place without any advertising screens, a place bereft of the Internet, where you can live free from manipulation. But if you’re a less impulsive sort of person, think long term. If you can educate yourself, you can educate others. Then everyone can hold an informed opinion. Do with it what you want; at least you’ll have a choice. And anyway, even if our generation is condemned, future students deserve protection from the subversive manipulation used every day by companies like Facebook and Google.

 
