The Digital Minister, Margot James, will today launch a code of conduct for social media companies, which they will be legally required to sign, to protect young and vulnerable users. In a speech planned for today's Safer Internet Day conference she will accuse the online giants of:
"Behaving as if they are above the law and creating an environment for bullying and abuse... voluntary codes have failed and firms should not be allowed to blame users for the damaging content posted on social media sites. The tragic death of Molly Russell is the latest consequence of a social media world that behaves as if it is above the law. In America and Europe these companies have legal protection from liability for user-generated content. Too many of these companies have milked this privilege for all its worth. There is far too much bullying, abuse, misinformation as well as serious and organised crime online. For too long the response from many of the large platforms has fallen short."
These new measures include long-overdue steps to force social media companies to shield users from images of suicide, self-harm and bullying, and to remove references to suicide and self-harm methods; to sign up to new corporate social responsibility standards that will require them to remove extreme material quickly; and to report how many complaints of online bullying and trolling they receive.
This follows the recent deaths of Molly Russell, 14, and Zoe Watts, 19, who took their own lives after looking at self-harm posts on social media. Sadly these are not isolated cases, as the statistics show: suicide rates for under-15s rose 82% between 2013 and 2017, and 22% among 15-to-19-year-olds, with social media a recognised factor. The Minister for Suicide Prevention has stated that harmful online content is at risk of becoming 'normalised'.
“There is a huge rise in unhappiness among our children, and when I talk to counsellors up and down the country and ask them why, they always say 'social media'. Young people are looking for comfort on the internet, where there are people who feel the way they do. And tragically they are finding them, and the solutions they are offering are coping mechanisms of self-harm, even to the point of suicide.” Childline.
This comes at a time when the fallout from the Cambridge Analytica scandal, along with a series of data breaches and other malpractice, has placed enormous pressure on tech and social media companies to get their houses in order. Yet despite their frequent protestations that they have been unfairly maligned and are putting things right, not to mention some 15 voluntary codes of conduct since 2008, dark practices remain commonplace. The tragic stories of teenagers Zoe and Molly show that social media channels played a significant role in serving up the content that sent them spiralling into self-harm and, ultimately, to taking their own lives.
There was another story this weekend, less well covered but symptomatic of the culture and apparently arrogant attitudes that pervade Google and Facebook, especially given that they are supposed to be trying very, very hard to be good. Apple caught Facebook harvesting user data via a 'side-loaded' app installed on iPhones through a special programme open only to corporates, in breach of Apple's terms. Google was caught with a similar app, so Apple, which has taken the moral high ground on user data, revoked the permissions of both businesses and barred the apps from iPhones, causing all kinds of mayhem as various internal Facebook and Google apps went offline. Whilst this access has since been reinstated, after a suitable period on the naughty step, Facebook had no excuse for its actions, which it took despite a similar app having been forcibly removed by Apple last year:
“Facebook ignored the rules, and did so with gusto. It is not hard to understand why brands are intoxicated by the granularity that its ad system provides, and that comes from knowing users down to the minutest detail. How Facebook acquires that knowledge seems to be of little concern. The company revealed that in the last three months of a year of unrelenting scandal... profits soared 61%, hitting a record $6.9 billion.” Danny Fortson.
It’s a shame that while the social media companies are so obsessed with, and brilliant at, collecting detailed data about users' buying habits and interests - data they sell to advertisers for huge rewards - they don’t seem as adept at acting on it when it is desperately unhappy young girls like Zoe and Molly searching terms like 'suicide' and 'self harm'. It seems to me that, as well as moderating and removing dubious content (on which they have spent fortunes setting up teams, albeit with mixed results, since malevolent content stays up far too long), they could turn their powerful algorithms towards serving up good advice and directing users to counselling that might actually help them and ultimately save lives, rather than showing more and more harmful content.
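To make that concrete, here is a deliberately naive sketch, in Python, of what routing at-risk searches towards help rather than harm could look like. Every name here is invented for illustration; real platforms would use far more sophisticated classifiers than keyword matching:

```python
# Purely illustrative: intercept searches that suggest a user may be at risk
# and surface support resources ahead of the ordinary ranked results.
# Term list, result shape and function names are all assumptions.

SELF_HARM_TERMS = {"suicide", "self harm", "self-harm"}

HELPLINE_RESULTS = [
    {"title": "Samaritans - call 116 123 (UK)", "url": "https://www.samaritans.org"},
    {"title": "Childline - call 0800 1111 (UK)", "url": "https://www.childline.org.uk"},
]

def handle_search(query: str, ranked_results: list) -> list:
    """Put helpline resources first when a query suggests the user is at risk."""
    normalised = query.lower()
    if any(term in normalised for term in SELF_HARM_TERMS):
        # Surface support first rather than amplifying harmful content.
        return HELPLINE_RESULTS + ranked_results
    return ranked_results
```

The point of the sketch is how small the technical step is relative to the targeting machinery these companies already run for advertisers.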
“It’s amazing what these social media companies can do to target you with adverts based on your online behaviour. I want that genius to be used for social ends. Social media companies have a moral duty to do more to remove content that promotes suicide and self-harm, and should use their technical genius to do good.” Damian Hinds, Education Secretary.
You can't help feeling, from the way the social media giants operate and their consistent failure to self-regulate, that there's only one thing they really care about - and it's profit, not people.
It's disconcerting, too, to read how hard it is for parents to get hold of their children's browser histories to try to piece together, retrospectively, their descent into darkness. This begs the question: given that these are logged-in, underage users - so the platforms know who they are and what they're looking at - how much better would it be if, as well as using their algorithms to serve up positive content, they used their expertise to develop some mechanism for alerting parents, so that parents could be made aware of what their kids are looking at and try to find help? As a father of three, I'd much rather know that my child had issues, so that we could try to address them, than find out when it's too late.
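Again purely as illustration, and with every name, field and threshold an assumption rather than any platform's actual API, such a guardian-alert mechanism for logged-in, underage accounts could be conceptually this simple:

```python
# Hypothetical sketch: count at-risk searches on a minor's account and notify
# a linked guardian once a threshold is crossed. Nothing here reflects a real
# platform's data model; it only shows that the building blocks already exist.

from dataclasses import dataclass
from typing import Optional

ALERT_THRESHOLD = 3  # flagged searches before a guardian is notified (assumed)

@dataclass
class Account:
    age: int
    guardian_email: Optional[str]
    flagged_searches: int = 0

def notify(email: str, message: str) -> None:
    # Stub for illustration; a real system would send an email or push alert.
    print(f"To {email}: {message}")

def record_flagged_search(account: Account) -> None:
    """Track at-risk searches and alert a linked guardian for under-18s."""
    account.flagged_searches += 1
    if (account.age < 18
            and account.guardian_email
            and account.flagged_searches >= ALERT_THRESHOLD):
        notify(account.guardian_email,
               "Your child has repeatedly searched for content flagged as "
               "potentially harmful. Support: https://www.samaritans.org")
```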