There is growing concern about the impact of social media on young people: from parents and teachers, and from young people themselves. But it can be hard to understand how social media affects someone, because young people can have very different experiences. Two young people using the same App can see completely different content, and it can be hard to know why.

We’ve created these blogs to help you understand how the technology itself works in ways that can create risks, and how those risks could lead to harms such as mental health issues.

We hope this helps create a better understanding of how the technology works, and better conversations about what might be happening online, which can build resilience.

In this blog, our focus is on algorithms.


For Parents, Carers and Professionals

Understanding How Technology has Changed

It may seem obvious, but it is worth thinking about how technology has changed in recent years. Take the example of TV: when you were younger, you may have had only a few channels to watch, very little before the evening, and nothing overnight. As TV expanded, it became possible not only to watch something 24/7 but also to choose from hundreds of channels. The ability to record programmes and watch them later has evolved into streaming Apps, such as Netflix, where everything is now ‘on demand’ and immediately available at the push of a button. So instead of a broadcaster deciding which programmes to send you over the airwaves, through the Internet you can effectively be your own TV channel, choosing what you watch at every stage.

Compare that with the development of digital technologies. In the early days of the Internet, you would search for what you wanted, which felt magical. Companies such as Google developed search engines to help you find what you were looking for more easily. Similarly, you might find old friends, or people who shared your interests, on social media, and in many ways you were in control of your online activities.

But something changed: money was needed to fund the development and use of these technologies. So, bit by bit, advertising became how companies paid for providing and developing their technology, with the user only needing to pay for a device and access to the Internet.

But how do you keep people engaged and on screen, so that companies can show advertisers that their adverts are being seen? And how do the different technology companies compete with each other for those advertising payments?

Getting Your Attention; Keeping You On Screens

“the nature of the internet has itself changed. In the early days, it was considered a ‘pull’ (or ‘lean forward’) technology – in contrast with television, the classic ‘push’ or ‘sit back’ technology. The early approach to digital skills therefore emphasised skills such as search and evaluation, because users had to deliberately seek each piece of content they wanted. Today, the situation is reversing: social media is becoming as much a push as a pull technology (while television has become a pull technology, with a multiplicity of broadcast and streaming channels). New skills are now required, including that of managing ‘news feed’ algorithmically driven by platforms financially incentivised to provide a never-ending stream of personalised content to the user.”

ySKILLS: Young people’s online engagement and mental health: the role of digital skills (2022)

Technology companies have used many different techniques to keep us engaged with our devices, and looking at screens, so that advertising can work. ‘Persuasive design’ is a term often used to describe how the ‘attention economy’ works.

One technique was to shift us from searching for content, whether for information or an entertaining video, to having that content (such as news, articles or videos) inserted into a stream within an App: your ‘news feed’. This can work well, almost like a personal shopper who finds the interesting or unexpected items you may like. But technology companies also found that risky or extreme content, which may be sexual in nature or hateful, can engage us even more and keep us looking at our screens, and so it may be promoted within Apps.

Content is identified as ‘For you’ through the use of algorithms, which use different pieces of information and artificial intelligence to try to work out what you will find engaging. How you engage with the content sent to you (by watching or reading it, clicking on links, or sharing it) may then ‘train’ the artificial intelligence further (a process also called ‘machine learning’), in theory making better estimates about what you like to see. These processes are not always possible to understand, especially for young people. Machine learning, in particular, can be like a ‘black box’, making it seem impossible to work out what it prioritises. A successful algorithm aims to keep you looking and scrolling, seeing adverts, and then doing something. This may not just be buying something; it may be sharing a video about a controversial issue.
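For readers who want to see the mechanism, the feedback loop described above can be sketched in a few lines of code. This is a deliberately tiny illustration, not any real platform’s algorithm: the post names, actions and boost values here are all invented, and real systems use large machine-learning models with far more signals. But the basic loop is the same: engagement raises a score, and the highest scores fill the feed.

```python
# A simplified sketch of a feed's feedback loop (invented values, not real code).
posts = {
    "cooking_tips":   {"topic": "food",  "score": 1.0},
    "extreme_diet":   {"topic": "food",  "score": 1.0},
    "band_interview": {"topic": "music", "score": 1.0},
}

# Watching, clicking or sharing nudges a post's score upwards,
# so similar content is more likely to be recommended next time.
ENGAGEMENT_BOOST = {"watched": 0.2, "clicked": 0.5, "shared": 1.0}

def record_engagement(post_id, action):
    posts[post_id]["score"] += ENGAGEMENT_BOOST[action]

def recommend(n=2):
    # The highest-scoring posts fill the 'For You' feed.
    ranked = sorted(posts, key=lambda p: posts[p]["score"], reverse=True)
    return ranked[:n]

# Merely lingering on one extreme post, then clicking it once,
# is enough to push it to the top of the feed.
record_engagement("extreme_diet", "watched")
record_engagement("extreme_diet", "clicked")
print(recommend())  # 'extreme_diet' now ranks first
```

Notice that nothing in the sketch asks whether the content is good for the user; the score only measures engagement, which is exactly the gap the blog describes.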

There are two key messages from this:

  1. No two people using an App, such as TikTok or Instagram, are likely to see exactly the same content, even though all users are subject to its features. Of course, they may share content such as videos with each other, or search for something, but beyond that, it is largely the algorithms that decide what they see.
  2. For some people, it may be impossible to work out why they are getting certain types of content, which may become more extreme and which they cannot get rid of, and this can affect their mental health. But it is not their fault. Many young people stay alert to this when online, just in case the algorithms send them something extreme or distressing.

Part of the difficulty is that content can carry hidden messages: what appears to be an innocent or positive video may have other messages hidden in the comments, or as a hashtag (e.g. #selfharm), which allows people to identify or search for an issue. This can be particularly important for issues such as eating disorders or self-harm, but may also include extreme beliefs, conspiracy theories or fake news.

But often it is impossible to work out why you might be seeing certain types of content.

“Platform algorithms are often ‘out of sync’ with and insensitive to the young person’s state of mind or ability to cope, leading to experiences of ‘triggering’ (when particular online content proves upsetting because of prior mental health difficulties), unwanted re-exposure to such content, and setbacks in their mental health. Algorithms can act as a distorting mirror, magnifying problematic content and pushing young people with mental health vulnerabilities down a spiral of ever-more overwhelming, upsetting or extreme content that they find hard to break away from.”

ySKILLS: Young people’s online engagement and mental health: the role of digital skills

The Impact of Algorithms

Why does this matter? Well, imagine being a young teenager again, and you keep getting messages that there is a certain way you have to look. This may be thinner, or perhaps more muscular for a young man. Or that hateful ideas about a religious community are normal. Or that it is okay to self-harm, because lots of people are doing it.

In a recent stem4 survey, 55% of young people aged 12-21 years believed that algorithms pushed specific content to them on social media because of their online activities, and a further 21% were not sure. 23% of young people felt that the algorithms had a negative impact on their mood, causing them stress, anxiety or depression, and that this ‘impacts upon me a lot’. The impact of this upon their self-esteem and body image can be profound and worrying.

Only 28% stated that algorithms didn’t have a negative impact on their mood, which may have been the hope when they were introduced into Apps. But because extreme content can be more engaging (think of outrageous newspaper headlines), an algorithm may conclude that even young people want extreme content, even though it is distressing.

There are four critical ways in which algorithms can make content more extreme and have a negative impact on mental health:

  1. Content can get more extreme over time
  2. It does not match how the young person feels in the present, as recommendations are partly based on past behaviour
  3. Trending can normalise extreme content or make it hard for young people to avoid it
  4. You may be able to ‘clean’ the algorithms by engaging with positive content, but not always

Algorithms can feel unforgiving because of their use of past activity, from a time when a young person may not have fully understood what they were engaging with. But the picture is not always so clear.
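The ‘unforgiving’ use of past activity can also be sketched in code. Again, this is an invented illustration rather than any real platform’s implementation: it simply shows how, if recommendations weigh your whole engagement history (with older activity discounted but never erased), a long run of past clicks can outweigh more recent, more positive ones. The topic names and decay rate are made up for the example.

```python
# Invented illustration of why feeds can feel 'unforgiving':
# recommendations here are a weighted sum of the whole engagement history,
# so old clicks keep counting long after someone's interests change.
history = []  # (topic, weight) pairs, oldest first

def engage(topic, weight=1.0):
    history.append((topic, weight))

def interest_scores(decay=0.9):
    # Older engagements are discounted, but never vanish entirely.
    scores = {}
    for age, (topic, weight) in enumerate(reversed(history)):
        scores[topic] = scores.get(topic, 0.0) + weight * (decay ** age)
    return scores

# A long stretch of dieting content, followed by an attempt to 'clean' the feed:
for _ in range(10):
    engage("dieting")
for _ in range(3):
    engage("positive_wellbeing")

scores = interest_scores()
# Even after clicking only positive content, 'dieting' still scores higher,
# so it can keep dominating the recommendations.
```

This is why ‘cleaning’ the algorithm by clicking on positive content may help, but not always: the accumulated past can take a long time to decay away.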

“In the case of a 14-year-old girl, the algorithms picked up her interest in baking when she was 12 years old and started recommending dieting and fitness content, which gradually became more extreme, and she developed an obsession with counting calories:

“I’m like 12, thinking about it. I eat three meals, and then these people are having a strawberry for brunch, and that was a big thing. I don’t know why, maybe due to my age, but I was like, maybe I’m doing something wrong. And then I was focusing more on food and calorie counting. I didn’t even know what MyFitnessPal is, that calorie tracking app. I found out from TikTok, and I thought, that’s great, I can track how well I’m doing. And then if I do good, that’s a good thing. I’m like these people I’m seeing on my feed.”

She had noticed the change but was not sure why it had happened, and it took her a long time to realise the effect the extreme diets being recommended to her had had on her own relationship with food.”

ySKILLS: Young people’s online engagement and mental health: the role of digital skills

What you can do

Here are some tips to help you talk to a young person about their online experiences, and the impact of algorithms:

Don’t – Sit down with them, face to face, and ask what extreme content they may have seen. That may be awkward for both of you, and may make them want to hide what they found difficult.

Do – Find an opportunity to talk where you can be side by side, such as on a walk, or by giving them a lift in the car. Try to get across a genuine interest in how they are feeling, and not what they have been doing.

Don’t – Focus too much on the impact of extreme content, and what they may have seen.

Do – Ask them to help you understand what algorithms are, and how they determine what happens in a news feed or ‘For You’ page.

Don’t – Be negative about algorithms, and appear judgemental. Try to appear open minded about the good and the bad.

Do – Ask if they find algorithms positive, or whether sometimes they see things they wished they’d never seen. Acknowledge that you think you yourself might find some content hard to cope with, but you would try to stay calm and not judge.

Don’t – Suggest that what is happening is because of what they have done, and that the situation could have been easily avoided.

Do – Check if they feel they can ‘clean’ the algorithms by, for example, clicking on more positive content, or if they feel trapped, and unable to avoid content that is ‘triggering’ for them.

Don’t – Leave them feeling hopeless about overcoming any difficult experiences, or appear too shocked (which can be difficult).

Do – Ask them if they would like to plan a way forward together with you, if they have been distressed by some of what has happened online. Let them know help is always available, and that they may want to check out stem4’s ‘Asking for Help’ booklet, or the Worth Warrior App, which can help them build positive self-esteem and a positive body image.

By showing interest in how they feel, and on the impacts of social media, it may help them feel a bit more at ease about talking with someone they trust about what has been pushed to them. That could help them process what has happened, and overcome anything difficult.

Of course, as our survey shows, not everyone will feel the impact of algorithms is negative, and for them, that may well be true. But for others, who feel trapped in a downwards spiral, exposed to a distorted view of the world, through no fault of their own, an understanding adult can lessen the loneliness of fighting the technology alone.

It is easy to make mistakes online with the simple push of a button. But even if some of what happens follows a moment of curiosity or questionable judgement, that may not be the reason risky content is being pushed their way.

If, in the last century, you had been told that what young people saw on television was decided by robots that we didn’t completely understand, you might have had reservations. Yet that is close to what young people are exposed to today, and every day.


For Young People

Don’t Let Algorithms Change You

Growing up with social media can be exciting, especially in your teens, when you are trying to work out who you are, what you like, and who you would want to spend time with.

Most young people are highly skilled in their use of technology, such as social media, but those skills may not be enough in relation to the algorithms that work out what content to send you.

For some this is not a problem, as they like their news feed, but for others, the way the algorithms act can feel distressing, punishing and unforgiving, and it can be hard to know how to change what you receive.

Here are some tips to help you feel more resilient if faced with a stream of challenging content.

1. Check-in with yourself

Try to find ways of reflecting on how you feel about many areas of your life, and include your news feed or ‘For You’ recommendations. A diary can be a good way of doing this. It doesn’t have to be intensive, just a way of staying connected with how you are when so much is happening, so fast. If something made you feel bad, make sure you register that somewhere.

2. Hold on to your values and who you are

If you love something, such as a performer or band, or something as positive as good food, don’t feel under pressure to change that, especially if you are hearing that others don’t share your tastes in music, or think it is better to eat less. This is harder if the person expressing those views is an influencer with millions of followers. But don’t lose your unique views and values; they will sustain you.

3. Some digital skills can help

Lots of young people find ways of managing their news feed by being careful about what they click on. Some will click only on positive content, to try to train the algorithm not to send negative or extreme content. But that may not work, and sometimes you may want to understand more about difficult issues. So don’t stifle your curiosity.

But looking at any comments left, or at unusual hashtags, may give you some idea of how the algorithms will process a particular post, and then send you more like it.

4. Plan ahead

Chances are, with the best digital skills in the world, you will sometimes see something extreme and upsetting, perhaps because it is trending, or someone shared it. Some young people even go looking for extreme videos, to try and strengthen themselves, and feel more resilient. But they can forget what it feels like to see it for the first time, and so later share it.

It is important to think about who you might turn to for support if upset, as it can really help to talk about what you’ve seen or what has happened, with someone you trust. This may be a teacher at school, or the friend of an older brother or sister, or a cousin or aunt. You may feel comfortable talking with your parents.

The key thing is not to have something happen and then not know who you could talk with; being on your own with it is not fair to you.

5. It’s not your fault

Technology is now so complicated, and is used in so many different ways, that it is impossible for anyone to get everything right. This is especially true for young people, who are still developing and have so much to learn on their journey to becoming an adult.

When it comes to algorithms, don’t lose time trying to work out what you may or may not have done to get your news feed. Even if you can trace a change in your news feed to a certain time, and something you did, you can never know how much was down to you, and how much was just down to how algorithms work and change.

So, if it gets too much, report what upsets you, talk to people you trust, and break away from the news feed to get some perspective. That might mean having a break from social media, or switching Apps. This can feel like a huge step for many young people, as they feel very loyal to the Apps they’ve been into. But if you think of an App like you would a friend, be clear what you are prepared to put up with, and if it is not good, call it out, and make some changes.

stem4 would like to thank Dr Mariya Stoilova, Research Officer, ySKILLS Project Team, Department of Media and Communications, LSE (The London School of Economics and Political Science) for helpful suggestions on this blog.

You can read more about the ySKILLS Research here.
