
Social media companies to owe duty of care to users?

Articles | Wed 6th Feb, 2019

The report

On 31st January 2019, the House of Commons Science and Technology Committee published a report entitled “Impact of social media and screen-use on young people’s health”.


The issue

“Social media and screens have become ubiquitous in the lives of children. Figures produced by Ofcom indicated that 70% of 12–15 year olds have a profile on social media, while the OECD reported in 2015 that 94.8% of 15 year olds in the UK used social media sites before or after school.”


The harm

“While we heard about a variety of instances where social media could be a force for good, we also received evidence about some of the potential negative impacts of social media on the health and emotional wellbeing of children. These ranged from detrimental effects on sleep patterns and body image through to cyberbullying, grooming and ‘sexting’. Generally, social media was not the root cause of the risk but helped to facilitate it, while also providing the opportunity for a large degree of amplification. This was particularly apparent in the case of the abuse of children online, via social media.”


The solution (one part of it)

“This principle—to protect children from harm when on social media sites—must be enshrined in legislation as social media companies having a ‘duty of care’ towards its users who are under 18.”

“A duty of care, applying to both a person and to companies, has been defined as a requirement to:
take care in relation to a particular activity as it affects particular people or things. If that person does not take care, and someone comes to a harm identified in the relevant regime as a result, there are legal consequences, primarily through a regulatory scheme but also with the option of personal legal redress.”


Thoughts

The Committee should be commended for its efforts in tackling what is undeniably a significant problem, and one which will only get worse in years to come if ignored. This particular suggestion, though, to impose a duty of care on social media companies in respect of individual users allowing for “personal legal redress”, is perhaps less laudable. Its brief explanation in the report alone raises a number of questions: What would the scope of the duty be? Would psychiatric injury suffered by a user be foreseeable? And is it right that the operator of a digital platform (essentially what social media is), which is largely passive in the user process (apart from designing the platform and implementing the algorithms that partly determine the content the user sees), should be liable for the harmful effects of a user’s autonomous choices to communicate with other users or to access content uploaded or shared by other users?

Statutorily imposing a duty of care on social media companies with a view to allowing private law claims seems a messy and dubiously effective way of addressing the problem. Funding an effective and robust regulator is probably a better option for the Government, and one which does not require a ‘duty of care’ to be recognised in statute. Such a regulator should be given ‘teeth’ to hold companies properly to account, for example the power to impose meaningfully large fines and to suspend activity.

The Government is set to legislate on ‘Online Harms’ in the next parliamentary session; I’m intrigued to see what, if anything, happens.
