The Online Safety Bill (the Bill) aims to make the internet a safer place for all users by prohibiting providers of user-to-user services (i.e. social media platforms) from hosting illegal or harmful content. When passed, the legislation will mark a milestone in the effort to hold the biggest and most popular social media platforms to account, alongside providers of search engines and, more generally, companies whose services host user-generated content such as images, videos, and comments.
Currently, there are a number of pieces of legislation on which users may be able to rely to obtain justice, but these typically require victims of abuse to bring an action against the individual posting the online abuse. While that legislation remains in force, the Bill provides additional measures that can be brought against the providers of the service being used to carry out the online abuse.
The draft Bill anticipates that the legislation will be extraterritorial. However, some services will be exempt, including news websites, some retail services, some services used internally by businesses, and email services.
Content that platforms will be required to remove includes:
- child sexual abuse,
- controlling or coercive behaviour,
- cyber bullying,
- extreme sexual violence,
- extreme violence against animals or people,
- hate crime and speech,
- inciting violence,
- illegal immigration and people smuggling,
- promoting or facilitating suicide,
- promoting self-harm,
- revenge porn,
- selling illegal drugs or weapons,
- sexual exploitation.
Protecting Vulnerable Users
One of the key benefits of the Bill is its ability to protect vulnerable users from online harm, including children who may be at particular risk of cyberbullying or other forms of online abuse. Some content, while not illegal, may be harmful or age-inappropriate for children. Harmful content that platforms will need to protect children from accessing includes pornographic content, online abuse, cyberbullying, and online harassment, as well as content that does not meet a criminal level, but which promotes or glorifies topics such as suicide, self-harm, or eating disorders.
The Bill primarily protects children by imposing a duty on user-to-user service providers to:
- remove harmful content or ensure that it does not appear in the first place,
- enforce age limits and age-checking measures,
- ensure that risks and dangers to children’s safety are more transparent, including publishing risk assessments, and
- assist parents by making it easier for them to monitor their children's online activities and providing them with clear and accessible ways to report problems when they do arise.
Many social media platforms only allow users over the age of 13 on their platforms. However, according to Ofcom’s ‘Children and parents: media use and attitudes’ report published in 2022, 33% of 5–7 year-olds and a staggering 60% of 8-11 year-olds said they had a social media profile. Social media companies are therefore expected to see a significant reduction in user numbers with the imposed duty of age verification.
Social media platforms will need to demonstrate that their age verification processes are robust enough to assess whether users are the appropriate age for the content.
Vulnerable adults are also protected under the Bill, including individuals who may lack the capacity to make decisions about their own online safety, such as those with dementia or learning disabilities. These individuals may need additional support and protection to ensure that they are able to use the internet safely and independently.
The Bill intends to protect adults by way of a “triple shield” to:
- remove all illegal content,
- remove content that is banned by platforms’ own terms and conditions, and
- allow adults to tailor the type of content they see via toggles (i.e. the ability to switch between two options), giving them the means to avoid harmful content should they not wish to see it. Children will have these protective settings applied by default.
The Bill, if passed, raises concerns regarding freedom of speech and privacy. While it is important to safeguard users from harm, there is a risk that the legislation could be used to limit access to content that is widely considered controversial or offensive even if it is not illegal. There is also a risk of over-blocking or censorship as companies and service providers err on the side of caution to avoid falling foul of the law.
The Bill also proposes measures to end anonymous browsing by requiring some online service providers to implement age-verification checks for users. This means that individuals will have to provide proof of their age and identity before being able to access certain content or services online. While this will help to protect children and young people from accessing inappropriate or harmful content, it could also have negative impacts on privacy and freedom of expression. By ending anonymous browsing, people could be discouraged from sharing their opinions or accessing certain types of content for fear of being identified.
Implementation and Enforcement
All organisations in scope will need to tackle illegal content on their services and assess whether their services are likely to be accessed by children.
In addition, the biggest and highest risk platforms will also have to set out in their terms and conditions what types of legal content adults can post on their sites and will be required to transparently enforce their terms and conditions. These platforms will also be required to offer adult users the option to verify their identity, as well as tools to control who they interact with and the content they see online.
There are questions about how the Bill would be enforced, and whether it would be effective in achieving its goals, given the vastness of the internet. Ofcom is being put in charge as regulator and is to be granted enforcement powers, including the power to impose substantial fines for a breach of:
- up to £18 million, or
- 10% of annual global turnover,
whichever is greater.
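The "whichever is greater" rule means the fine ceiling scales with a company's size. As a purely illustrative sketch (the figures below are hypothetical, not drawn from the Bill), the calculation works like this:

```python
def maximum_fine_gbp(annual_global_turnover_gbp: float) -> float:
    """Maximum fine Ofcom could impose for a breach under the Bill:
    the greater of a fixed GBP 18 million cap or 10% of annual
    global turnover."""
    FIXED_CAP_GBP = 18_000_000
    return max(FIXED_CAP_GBP, 0.10 * annual_global_turnover_gbp)

# Hypothetical platform with GBP 500m global turnover:
# 10% (GBP 50m) exceeds the GBP 18m floor, so GBP 50m applies.
print(maximum_fine_gbp(500_000_000))  # 50000000.0

# Hypothetical smaller platform with GBP 100m turnover:
# 10% (GBP 10m) is below the floor, so GBP 18m applies.
print(maximum_fine_gbp(100_000_000))  # 18000000.0
```

For the largest platforms, the turnover-based limb will almost always be the operative ceiling.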
It is also anticipated that Ofcom may, in certain circumstances, request an order from the courts to restrict access to the relevant service being provided by the platform.
The Bill is a complex and controversial piece of legislation that raises numerous significant issues. While it is essential to ensure that internet users are protected from harm and abuse, it is also vital to ensure that rights of users are respected. As the Bill progresses through the legislative process, it will be crucial for policymakers to thoroughly consider the pros and cons of the legislation and to work towards achieving a balance between these competing interests.