The Online Safety Bill had its third reading in the House of Commons on 17 January 2023 and now continues its Parliamentary passage in the House of Lords.
The UK Government seeks to make the UK the safest place in the world for children and adults online, while defending the right to freedom of expression.
The Bill introduces new rules for many companies which host user-generated content (eg social media platforms which allow users to post their own images, videos and comments online, or to interact with each other through messaging and forums). There are some exceptions, such as for news agencies.
Illegal content and harmful content
The concepts of illegal content and harmful content are both significant for the purposes of the Bill.
The illegal content category is reasonably clear and includes child sexual abuse, incitement to violence, revenge porn and the promotion of self-harm.
The notion of harmful (but legal) content has been more controversial. Examples might include online abuse, cyberbullying or harassment, or content that falls short of the criminal threshold but promotes or glorifies self-harm or eating disorders.
Some free speech campaigners have expressed concern that the concept of content which is legal yet harmful is not sufficiently clearly defined.
How will the Bill seek to protect children?
The Bill will make social media companies legally responsible for keeping children and young people safe online.
It will protect children by requiring social media platforms to:
• prevent children from accessing harmful and age-inappropriate content
• enforce age limits and implement age-checking tools
• ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
• provide parents and children with clear and accessible ways to report problems online when they do arise
How will the Bill seek to protect adults?
Platforms will need to:
• remove all illegal content
• empower adult internet users with tools so that they can tailor the type of content they see and can avoid potentially harmful content if they do not wish to see it on their feeds
• offer adult users the option to verify their identity
The biggest platforms will also be obliged to set out their terms and conditions with respect to the types of legal content that adults can post on their sites. The information must be provided in a clear and accessible format so that adults can make more informed decisions and exercise greater control over the potentially harmful content they see.
Sanctions for posting illegal and harmful content
Ofcom will be responsible for ensuring that platforms effectively protect their users, with powers to take appropriate action against companies which do not adhere to the new framework, no matter where they are based (as long as their services are accessible to UK users).
Companies may be fined up to £18 million or 10 per cent of their annual global turnover, whichever is greater. Ofcom will be able to apply to a court to block non-compliant services.
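For illustration, a non-compliant company with an annual global turnover of, say, £1 billion could face a fine of up to £100 million, since 10 per cent of its turnover exceeds the £18 million figure.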
Criminal action may be taken against companies’ senior managers for failure to comply with Ofcom’s information requests.
Striking the balance
Views differ as to whether the Government has struck the right balance between protecting vulnerable people from exposure to potentially harmful and illegal content and not impinging disproportionately on freedom of speech.
The Bill is likely to receive careful scrutiny in the House of Lords, including from former judges and current and former senior lawyers. It is important that legal obligations are expressed in clear statutory language in order that those subject to the law can take effective measures and be confident that they are in compliance.