Instagram plans to extend its ban on images depicting self-harm and suicide to include fictional depictions such as drawings, cartoons and memes, a move prompted by the death of 14-year-old Molly Russell.
It goes without saying that it shouldn’t have taken Molly’s tragic death, which was partially caused by such violent imagery on Instagram, to bring about change in the way social media networks regulate themselves – and arguably this still isn’t enough. What about, for example, a non-physical ‘depiction’ of self-harm, such as text expressing a desire to self-injure or take one’s own life? It’s clear that the wider issue is not just the posts themselves, but the sort of environment that can be cultivated on such networks, in which posts of this kind are normalised or even encouraged. The Mental Health Foundation, a UK charity, said it “would also like to see them try to support distressed users who are posting such content in the first instance.”
Though Instagram has taken this positive step towards creating safer online environments, other big platforms like Facebook and Twitter have not been quick to follow suit. Troublingly, the existence of such violent content suggests that the internet can be used as a medium to inflict harm on a potentially unlimited audience, creating a vicious cycle of poor mental health that manifests in posting violent content online. How, then, do we reconcile this with the ever-growing prevalence of social media in our daily lives? Is it irresponsible to allow current social media practices to continue, given that it’s essentially only a matter of time before we come across troubling content?
Perhaps one reason why regulations have been so piecemeal and slow to come about is that the relationship between social media and mental health is not fully understood. Norman Lamb MP, Chair of the Science and Technology Committee, said “worryingly, social media companies—who have a responsibility towards young users—seem to be in no rush to share data that could help tackle the very real harms,” adding that “self-regulation will no longer suffice.” Additionally, our generation is in a unique position historically, where it’s difficult to disentangle which pressures come from these new technologies and which come from external sources: climate change, political developments that will shape our future such as Brexit, and struggles to access the job market or get on the housing ladder. In 2014, a study found that 6% of people had self-harmed, a 2% increase from 2000 – and the majority of those were aged 16-24. However, the study also found that around half the people who self-harmed were not getting the support they needed, demonstrating the need to improve our mental health services.
It may also be argued that there is no clear cause and effect here – we must question the social context that leads people to seek out such content, such as pre-existing mental health issues. Instagram, however, can be potent in exacerbating those issues. Molly Russell’s father, Ian Russell, has said that she was “self-sufficient” and looked to the internet for support with her depression, which may then have led her to content that pushed her towards suicide. This suggests that more needs to be done to support those struggling with their mental health before they reach the point of turning to informal remedies, such as the posts of Instagram users who are unqualified to provide such advice. Regulations are crucial, as the Molly Russell case shows, but they are not yet sufficient, nor do they remove the need for us to collectively examine our relationship with social media; they should be the beginning, not the end, of the discussion.
Rose Molyneux