News

Social Media Sites Could Face A UK Ban If They Don't Restrict These Harmful Images

by Emily Dixon

Health secretary Matt Hancock has said that the UK could legislate against social media platforms if they don't remove images relating to self-harm and suicide. During an appearance on The Andrew Marr Show, as the BBC reports, Hancock said Parliament "must act" if various social media sites don't control the appearance of potentially harmful material. His comments follow the death of 14-year-old Molly Russell, who died by suicide in November 2017. Russell's father, Ian, believes her death was influenced by graphic images encouraging self-harm which she viewed on platforms including Pinterest and Instagram. "There’s no doubt that Instagram played a part in Molly’s death," he told the Times.

He further criticised social media algorithms which may have exposed her to an increased amount of troubling imagery, saying, "In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide."

Speaking to Marr about legislating against such images, Hancock said, "I think that lots of people feel powerless in this situation, but of course we can act." He continued, "It would be far better to do it in concert with the social media companies, but if we think they need to do something that they’re refusing to do, then we can and we must legislate."


"We must act to make sure that this amazing technology is used for good, not leading to young girls taking their own lives," he said.

When Marr asked if that could include banning social media sites, or raising their taxes, Hancock said, "Parliament does have that sanction." He added, "It’s not where I’d like to end up in terms of banning them, because of course there’s a great positive to social media too. But we run our country through Parliament, and we will and we must act if we have to."

In a letter to social media platforms including Twitter, Pinterest, Snapchat, and Facebook, which owns Instagram, Hancock wrote, "We must act now, so technology is seen to improve lives, and stop it causing harm." Hancock, as quoted in the Times, continued, "This is a critical moment — as a supporter of digital technology, I don’t want the benefits of technology to be lost because of reasonable concerns about its risks. But most importantly, I don’t want another family to have to go through the agony of losing a child this way."


Facebook's regional director for Northern Europe, Steve Hatch, told the BBC's Amol Rajan that he was "deeply sorry" about Russell's death, calling the issue of removing graphic imagery "really complicated." Instagram does not automatically ban all images of self-harm, he said, explaining, "What the experts tell us in this is that when those images are posted by people who are in a really difficult situation, often they can be posted because they are seeking help or they’re seeking support."

"Those images are allowed on the platform," Hatch said. "But then we also directly follow up with the support that those individuals can see."

"What we don’t allow is anything that is sensationalising or glamorising," he said. "We’ve always got to work harder to take down the wrong kind of images."

If you’re in the UK and you or someone you know is experiencing thoughts of suicide or self-harm, call the Samaritans on 116 123 or email jo@samaritans.org. You can also call mental health charity Mind on 0300 123 3393. If you’re in the US, you can call the National Suicide Prevention Lifeline on 1-800-273-8255 or chat for support at www.988lifeline.org. You can find other international helplines at www.befrienders.org.