If you want to make big bucks on the Internet, there are three things you need: data, advertisements, and purpose. We, the current generation of students, are the first to grow up never knowing a world without the Internet. We’re still learning what the impact is on our mental wellbeing. We’re guinea pigs, otherwise known as The Snowflake Generation. This is a story of anxiety, addiction, and manipulation – of a world without choice.

It all boils down to money. Some technology companies take the traditional route of selling products. Apple sells computers and phones. Microsoft sells software and phones. Amazon just sells everything. Yet Tim Berners-Lee, who invented the World Wide Web, didn’t create it for money. He wanted a platform for sharing ideas – free for anyone to use. Even today many tech gurus hold the belief that you shouldn’t have to pay for services on the Internet. That said, these people also want to make a living. The big question is this: how can tech companies make money while keeping their services free? The answer is a bummer – literally.

BUMMER is a humorous acronym coined by computer scientist and philosopher Jaron Lanier. It stands for “Behaviours of Users Modified, and Made into an Empire for Rent”. It’s a business model: how companies like Facebook and Google generate income while keeping their services free. It’s simple. People like us use Facebook and Google and leave behind a trail of data. Then algorithms collect, or rather harvest, our data and assign us to a group. Perhaps you’re in a group that tends to message friends after watching a compilation of old vines. (Useful for an advertiser trying to spread a message.) Or maybe your group is more likely to click on ads after watching videos of animals. It’s discreet manipulation, and it works. The algorithms work out how people act, but not why they act that way. Once the system knows how you behave, these BUMMER companies, Facebook and Google, can say to advertisers, ‘Hey, we can make this user more likely to click your link just by showing them x’, and they have the data to show it works.
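
To picture what that grouping looks like, here is a minimal sketch in Python. It is purely illustrative, not any company’s real system: the behavioural signals and numbers are invented, but the point stands – an algorithm only needs to see how users behave, never why, to sort them into targetable groups.

```python
# A minimal, hypothetical sketch of behavioural grouping.
# The signals below (videos watched, messages sent after a video,
# ads clicked after an animal video) are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans

user_signals = np.array([
    [12, 9, 0],   # user 0
    [15, 11, 1],  # user 1
    [3, 0, 7],    # user 2
    [2, 1, 8],    # user 3
    [14, 10, 0],  # user 4
    [4, 0, 6],    # user 5
])

# Cluster users purely on *how* they act; no one asks why.
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(user_signals)

for user_id, group in enumerate(groups):
    print(f"user {user_id} -> behavioural group {group}")
```

The advertiser never needs to know who you are; it only rents access to a group that behaves a certain way.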

For Apple, Amazon, and Microsoft, the product is what they sell, and the customer is you. But companies using BUMMER sell you. You are what they’re flogging to external parties and advertisers you might never have heard of. This is how some of the largest tech companies run – economics drives manipulation. There isn’t a way to escape it. So our generation? It’s doomed.

Sometimes I like to pretend Mark Zuckerberg is evil. I imagine him sitting on a swivel chair, cat on lap, in front of a bank of monitors all beeping and flashing. The tip of a cigar glows in the low-lit room and Mark sighs, leaning back in his chair. On the wall is a dartboard with photos of stereotypical millennials pasted over the top. The problem is, Zuckerberg probably isn’t evil. I doubt he or any other tech superstars have malicious motives. Zuckerberg simply has a knack with computers and we all fall for it.

As you probably guessed from what BUMMER stands for, Jaron Lanier isn’t so keen on the business model. In his latest book, he points out that it modifies your behaviour not through obvious, up-front methods, but through tiny changes in users’ online experience that alter their behaviour for profit. The reason it’s behaviour manipulation rather than modification is that the companies never ask to alter your behaviour – they just do it. Some people get quite scared about it. They even read the terms and conditions before they install an app or sign up to a site. But that won’t solve the problem. Here’s why.

I wonder if you’ve walked through Piccadilly Circus since October 2017. The month is important because that’s when a particularly special advertising screen was switched on after a groundbreaking makeover. It’s called Piccadilly Lights, and for Landsec, the company that owns it, it represents a “landmark” in their history. It’s the largest screen of its type in Europe, but what makes it so incredible is that it’s watching you. Two cameras within the screen determine your age, gender, and even your mood. It also has WiFi, so any passerby can interact with the screen (and give up some of their data). As soon as you turn your face away it forgets what you specifically look like, but it remembers your data. With this, it can tailor what’s on the screen to whoever is below. And it doesn’t stop there. It also recognises the brands of passing cars, and so adapts adverts to drivers too. Landsec attempts to reassure us with their Piccadilly Lights privacy policy. They write, “no images of your face are stored by the system”, as if what people should be afraid of is the hoarding of their data. The thing is, a person’s raw data isn’t worth much alone.

The NUS’ new app TOTUM uses a similar method to Piccadilly Lights. Ali Milani, the Vice-President of Union Development at NUS, told Concrete the app “won’t harness individual data” but will store “collective data”. Of course, the NUS isn’t making money from this data, and they’re developing an Ethical Digital Charter to regulate their new app, which Milani believes will be “part of the solution” in the ongoing data protection debate.
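
Both systems rest on the same principle: what gets kept is not your face but the attributes derived from it. A minimal, hypothetical sketch of that kind of attribute-based ad selection might look like the following – the rules, attribute names, and adverts are invented for illustration, not taken from Landsec or the NUS.

```python
# A hypothetical sketch of attribute-based ad selection.
# Only derived attributes (age bracket, mood, car brand) are used;
# no image of a face is stored anywhere in this example.

def pick_advert(estimated: dict) -> str:
    """Choose an advert from estimated attributes of a passerby or passing car."""
    if estimated.get("car_brand") == "luxury":
        return "premium watch advert"
    if estimated.get("mood") == "happy" and estimated.get("age_bracket") == "18-24":
        return "festival ticket advert"
    return "default brand advert"

# Example: a cheerful young passerby gets a tailored advert.
print(pick_advert({"age_bracket": "18-24", "mood": "happy"}))
```

Which is exactly the point: the raw image is worth little; the data extracted from it is what sells.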

Many sites store your data, whether they’re social media websites, streaming sites, or search engines. (If you want to see what Facebook has stored on you, go to this link: www.facebook.com/help/1701730696756992, or for Google, this one: google.com/takeout.) In truth, unless you’re doing something illegal it doesn’t matter how much of your data sites store. It’s what they do with it that you should be afraid of.

In 2014 Facebook bought WhatsApp for $19bn. Users don’t pay for the app, there are no adverts, and every message is encrypted, right? Well, sort of. Only in 2016 did the UK’s Information Commissioner’s Office compel WhatsApp to stop sharing users’ data with Facebook. The EU followed suit, fining Facebook €110m in 2017. Again Jaron Lanier points out that BUMMER companies’ “wealth is made entirely of the data you gave them”. Some people may assume Facebook paid all that money, $19bn, to gain access to users’ personal messages. In reality it was paying for the data users leave behind. But user data isn’t only fuel for advertisers. It also reinforces addiction.

Dopamine is a neurotransmitter – in popular culture, the pleasure chemical. For tech gurus this pleasure chemical is part of their everyday vocabulary, and understanding it has become a necessity in the field of technology. In Digital Behavioural Design, authors T. Dalton Combs and Ramsay Brown explain the purpose of the pleasure chemical:

“The dopamine molecule is responsible for two things: putting a smile on the user’s face and inducing them to be more likely to do that behaviour again.”

Many in Silicon Valley are starting to voice concern that BUMMER has a real and physical impact on users. The BUMMER algorithms keep users hooked by heightening emotions. Of course, the easiest emotions to stimulate are negative ones: paranoia, anxiety, and self-doubt. In 2017 Facebook’s founding president Sean Parker told Axios, “We need to sort of give you a little dopamine hit every once in a while… you’re exploiting a vulnerability in human psychology… [and] God only knows what it’s doing to our children’s brains.”

Seeking pleasure is a key principle of survival. If you’re hungry, you want to eat. Thirsty? You want to drink. Aroused? You want to have sex. The body knows we need to eat, drink, and procreate, so it associates pleasure with all these activities. In essence we’re already addicted to the actions that keep us alive. This survival mechanism is a susceptibility BUMMER companies tap into.

In terms of technology, the pleasure induced by dopamine is the feeling of receiving a reward such as a ‘like’, a ‘match’, or even an unexpected email. That it is ‘unexpected’ is key. To keep users hooked, rewards must surprise them. It’s the lure of pleasure and the possibility of reward that retain people’s attention. This is how BUMMER addiction begins, helped along by something we see, hear, and feel each day: notifications. Users start to wonder how to boost their rewards. Addiction isn’t the desire for satisfaction; it is the need for it. You search for pleasure, find it, and crave more of it. It’s a feedback loop. On BUMMER sites you might feel a short burst of pleasure, but don’t kid yourself. Wait a moment: dopamine might be natural, but that doesn’t mean it’s good for you. BUMMER’s rewards aren’t healthy at all.
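
The pattern behind those surprise rewards is a variable-reward schedule. Here is a minimal, purely illustrative sketch in Python – not any platform’s actual code, and the probability and messages are invented – of how unpredictable notifications keep someone checking:

```python
# A minimal sketch of a variable-reward schedule: some checks pay off,
# most don't, and the user can never predict which is which.
import random

random.seed(42)  # fixed seed so the example is reproducible

def maybe_notify(check_number: int, reward_probability: float = 0.3) -> bool:
    """Reward a check at random, mimicking an unpredictable notification."""
    rewarded = random.random() < reward_probability
    if rewarded:
        print(f"check {check_number}: 'You have a new like!'")
    else:
        print(f"check {check_number}: nothing new")
    return rewarded

# Ten checks of the phone; only a few are rewarded, which is the hook.
for check in range(1, 11):
    maybe_notify(check)
```

It is the unpredictability, not the reward itself, that keeps people coming back.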

I wonder if I should stop using Facebook and Google. Then I wake up. Get real. I need Facebook. I need Google. It’s too hard to stop using these BUMMER sites. It would mean no Google search, Google Maps, Gmail, Google Docs, YouTube, or Android. (That’s for all you people using Android phones over Apple.) And no Facebook? That’s goodbye to Messenger, events organised on Facebook, and remembering people’s birthdays, as well as WhatsApp and Instagram. If they were separate companies with separate functions it would be easier to quit. I could drop one without dropping them all. But Facebook and Google are too sneaky for that. It’s nearly impossible to stop using their sites because you’ll always need at least one of the services they provide.

BUMMER companies bring out the worst in people. Maybe you’re already addicted to social media and technology owing to BUMMER. I know I am. There are great aspects to the Internet, but what makes it a scary place is how some people come to hold such outlandish views. “Paranoia [is]… an efficient way of corralling attention”, writes Lanier. For the BUMMER algorithms it’s simple: encourage extreme opinions and paranoia to keep users hooked. It doesn’t matter whether the source of this paranoia is true or false. If a method works, the algorithms use it.

There’s little doubt that social media, and more specifically the BUMMER model used by Facebook and Google, has a negative effect on mental wellbeing. A spokesperson for UEA told Concrete “social media can add to the pressure students feel at university”. The CEO of Anxiety UK, Nicky Lidbetter, agrees. After a study into the relationship between technology and anxiety, she said technology is “a tipping point”. Time spent online has an adverse effect on mental wellbeing, and we’re using the Internet more and more.

Adults who use the Internet spend an average of 24 hours a week online, according to a 2018 Ofcom report. What’s worse is that 25 to 34-year-olds spend 29 hours a week online, and 16 to 24-year-olds an average of 34.3 hours. That’s almost a day and a half of screen time. In March 2018, 18 to 24-year-olds spent the most time online per day of any age group. Importantly, in the same month, the UK population spent most of their online time on sites owned by two companies: Google and Facebook, the BUMMER companies.

Mental health problems at universities are growing. While it’s true that in the UK 45 to 49-year-olds have the highest suicide rate, three-quarters of adult mental health problems begin before the age of 24. What ties mental health problems to students even more closely is that students aged 20 to 24 have a lower level of mental wellbeing than non-students of the same age, and it’s deteriorating year on year.

More people are becoming conscious of mental health problems, and rightly so, because everyone has a varying level of mental wellbeing. UEA explain in their current Mental Health and Wellbeing Strategy that there is “a distinction between ‘mental wellbeing’, which we all have, and a ‘mental health problem’, which only some of us [have]”. Sadly the number of UEA students with suicidal thoughts per year increased from 29 in 2012/13 to 48 in 2016/17. The developing concern across the UK has prompted a mental health charter this year, led by the charity Student Minds and Universities Minister Sam Gyimah. Thankfully the number of student suicides at UEA over the past few years is very low, but across the UK the number is rising.

Although mental health problems and suicide often stem from a variety of factors, it’s shocking to find out how 18 to 34-year-olds feel when they don’t have Internet access. That same Ofcom report states 15% consider themselves more productive without the Internet, and 19% say they’re less distracted. More concerning is that 20% feel stressed without it, and 39% feel cut off from the rest of the world. It’s not good press for Facebook and Google, but I doubt they’ll care too much. This summer Facebook rolled out a large advertising campaign in the UK to convince users that it values them for the friendships they make above anything else. I suppose it’s true: the more people on Facebook, the more friends you can make. It also means the more data Facebook can harvest and the more people it can manipulate.

A spokesperson for UEA told Concrete “the ‘snowflake’ generation isn’t a term we use because it dismisses students’ concerns without due consideration and care.” But you know what? Call me a snowflake, because maybe I am one. I’d rather be a snowflake comfortable pushing back against BUMMER’s manipulation than someone who takes it on the chin, stays silent, and faces the real-life consequences.

The solution isn’t more red tape. Companies using BUMMER may well bypass any future rules and regulations; it wouldn’t be a surprise. The solution is something much more difficult to sidestep – public knowledge. In the same way we all know an email from a far-flung prince asking for money is probably a scam, society must educate future generations about technology. Short-term action just won’t do. If you can educate yourself, you can educate others. Even if our generation is condemned, future students deserve protection from the subversive manipulation used every day by companies like Facebook and Google.
