Almost half (46%) of teenagers aged 13-17 who use social media have seen posts that they believe should not be allowed, new research published by the Chartered Institute of Marketing (CIM) has revealed.
The survey of over 2,500 adults and teenagers, published ahead of the close of the Government’s consultation on online harms, shows that 95% of young people aged 13-17 have a social media account, with the most popular being YouTube (79%), followed by Instagram (73%), Snapchat (66%) and Facebook (45%).
Despite many children coming across potentially harmful posts on social media platforms, very few are doing anything about them. Almost two thirds (62%) of teenagers who have seen content they believe should not be allowed say they either rarely or never report these posts; only 7% say they always do.
Seeing these posts does seem to be discouraging some children from engaging on social media; close to half (44%) agree that they would be put off engaging in discussions and conversations online. But very few are prepared to give up their accounts: two in three (66%) said that seeing posts on social media that should not be allowed would not make them want to delete their account, while more than half (52%) said it would not put them off signing up for an account in the first place.
The survey put similar questions to adults and found that almost half (44%) of those who had seen harmful content on social media say they rarely or never report it, while only one in five (20%) say they always report it.
Across Wales, 27% of adults say they have seen harmful content online in the last six months, with 40% never or rarely reporting it.
Who is responsible?
When it comes to who should be protecting children under the age of 18 from harmful or inappropriate content on social media, the public place responsibility on parents and social media companies.
Three quarters of people over 18 say it is the responsibility of parents/guardians (76%) and social media companies (74%) to protect children on social media.
However, most people believe strongly that social media companies should be removing harmful content from social media.
- Who’s responsible? Eight in ten (83%) said that social media companies have a responsibility to monitor for harmful content on social media. Many people also felt there was a role for government (49%), and individuals themselves (57%).
- Who pays? When it came to paying for dealing with harmful content on social media, the vast majority of the public felt this was the responsibility of the social media companies. 67% of adults said the cost of monitoring and regulating harmful content on social media should be borne by the social media companies themselves, compared with only 14% who said government was responsible.
Revenue from marketing and advertising is the main source of income for most social media companies, and the Chartered Institute of Marketing believes more must be done to protect users if UK businesses are to continue spending their marketing budgets reaching customers through social media platforms.
Sameer Rahman, Wales Chair at CIM, said:
“At present, only one in five adults routinely reports harmful content on social media, yet our research shows we could make a huge difference if we all took the simple action of hitting the report button when we see something we shouldn’t.
The government must do more to educate people, especially children, on how to report inappropriate posts, and to highlight the importance of doing so whenever they see such content. The number of children who have not only seen harmful content on social media but failed to report it is alarming.
New regulations will hold social media companies to account for this kind of content, but we don’t believe we should have to wait for the regulations; this is something that can happen now.”
The research also demonstrates the prevalence and impact of harmful content being seen by adults on social media:
- Harmful content: Three in ten (29%) adults said that, in the last six months, they had seen content that could be damaging if seen by children, could encourage illegal activity, or could be considered abusive or offensive. Only one in five (21%) said that they had not seen harmful content, while a third (32%) were not sure or couldn’t recall.
- Who’s seeing it? Younger adults are much more likely to recall seeing harmful content than older generations: 46% of 18-24-year-olds said they had seen it in the last six months, compared with only 16% of those aged 55 and over. The people most active on social media are the most likely to have seen harmful content; among those active on all three of the most popular platforms (Facebook, Instagram and Twitter), 44% said they had seen harmful content in the past six months.
- Stifling debate: Three quarters of people who use social media (74%) say that the presence of abusive or offensive content can put them off engaging in discussions on social media, while more than half (52%) agree that it would make them consider deleting their account.