They knew what you cared about the most because the algorithm could work it out. Thanks to your likes, saves, dwell time, as well as the accounts and hashtags you were following, Instagram could find more and more content to satisfy your interests. Your feed would be carefully and algorithmically curated, and that content shown to you before anything else.
The app wanted to keep us all scrolling and capture our attention, and it did this by serving up our preferred content. Falling down rabbit holes is a feature, not a bug.
What if, though, that rabbit hole and what you ‘cared about most’ was less than ideal? Worse still, what if you were being shown content that had no business being on social media platforms because the safety features weren’t fit for purpose? What if the age recommendations were meaningless, based on little more than guesswork and a law that dated back to 1998 – and was never intended to apply to social media accounts?
Molly Russell took her own life in November 2017 after viewing ‘life-sucking content’. She died, aged just 14. The content she was engaging with was so disturbing that the adults involved in her inquest found it difficult to view. Molly was depressed, and in the months before she died she had viewed thousands of suicide and self-harm images. A trick Pinterest uses to make sure you get back on the app is to email updates on your favourite topic. Molly got an email titled ‘10 depression pins you might like’ just before she died.
The lawyer for Molly’s family, Merry Varney, cried recounting the content she had seen, saying it had invaded her thoughts. The independent psychiatrist Dr Venugopal said he had trouble sleeping for weeks. The head of health and wellbeing for Meta (the parent company of Instagram and Facebook), Elizabeth Lagone, said she believed that the content viewed by Molly before she died was safe.
The coroner decided otherwise, concluding that “Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content”. He went on to say that the content she viewed “contributed to her death in a more than minimal way”.
Where does this leave parents? There are multiple organisations that advise parents on how to protect their children and help them use social media safely. They send webinars to parents, via schools, telling us how to help children ‘stay safe’. Their websites are full of tips and ideas on how to manage screen time and how to talk to our children about online harms.
All very commendable, but I think their advice is bogus and I ignore it. Why? Because all their advice stems from the idea that 13 is a legally acceptable age to have a social media account. This is a fallacy. It is based on a 1998 US law, the Children’s Online Privacy Protection Act (COPPA), which was not created with social media in mind. It was intended to prevent online platforms from collecting the personal data of children under the age of 13 for ad targeting and tracking. It urgently needs reviewing.
I have no doubt change is coming. The Children’s Commissioner said recently: “This generation hasn’t known a world without social media, smartphones and 24-hour communication. I am not satisfied enough is being done to keep children safe online.”
However, until governments get to grips with the Online Safety Bill, parents need to stop placing their trust in Mark Zuckerberg, TikTok and Twitter. Far better, I believe, is to look at the advice about gambling. Children cannot consent to being manipulated by the highs and lows of gambling. Social media apps give users a similar dopamine hit. That, plus the infinite scroll, keeps us hooked on the content.
We’ve given our children an addictive substance, deliberately designed to be that way, and expect that, with the right bits of advice, they should be able to control themselves. Molly was old enough to have an account and was using the platform in the way the algorithm intended.
I hope the findings of the coroner and inquest will prove to be a landmark moment. Our children should not be guinea pigs. As Oliver Sanders KC said to Meta’s Elizabeth Lagone about allowing this content into “the bedrooms of depressed children… You have no right to. You are not their parent. You are just a business in America.”
Read more Parenting Truths from Emilie Silverwood-Cope every month in the Cambridge Independent.