7 key issues from the Online Safety Bill report

A new parliamentary legislative committee report has proposed changes to the draft Online Safety Bill.

After gathering feedback on the initial draft – including Parent Zone’s own response – the legislative committee made up of MPs and peers published their recommendations on how the Bill should be amended before progressing into legislation. 

Notable inclusions are moving online pornography further into the Bill’s scope, making internet service providers more responsible for risks of harm on their platforms, and greater enforcement powers for Ofcom, the Bill’s regulator. 

Damian Collins MP, the committee’s chair, said their objective was achieving “parity between the real world and the online world”, and encouraging tech companies to put safety – especially children’s safety – over profit. 

The changes are positive, but there are still some significant omissions. Here are the key issues we picked out...

1. A focus on functionality – and more power to the regulator

The legislative committee’s recommendation to restructure the Bill is a positive step – particularly the focus on platforms’ underlying business models and functionality, rather than only on individual pieces of content. A ‘Safety by Design’ Code of Practice would set out the steps providers will need to take to sufficiently consider and mitigate risks of harm.

The report suggests the Bill should require service providers to conduct ‘risk assessments’ around ‘reasonable foreseeable threats to user safety’. These would include the potential harm caused by functionalities – for example, in-stream payments on platforms such as OnlyFans, or algorithms that lead users down a ‘rabbit hole’ to ‘maximise user engagement and attention’.

We agree with the recommendation for Ofcom to have the powers to set minimum standards of these risk assessments, with service providers required to ‘undertake independent audits of their systems, processes and algorithms’. Proposals include financial penalties – or even criminal sanctions for executives (or designated ‘safety controllers’) who are 'grossly non-compliant'.

2. The Duty of Care should be more explicit

We have previously queried whether the draft Bill’s proposed ‘Duty of Care’ went far enough – and are pleased to see more explicit standards being recommended by the legislative committee. 

While the draft Bill required all tech companies to have a duty of care for children (and, for Category 1 service providers, those over 18), it needed to be more responsive to existing and emerging harms. 

The report acknowledges this – suggesting Ofcom draft mandatory Codes of Practice on risk areas such as child exploitation and terrorism. Importantly, Ofcom would also be able to introduce new Codes as new platforms, innovations and harms emerge – helping to future-proof the Bill as technology develops. Establishing a permanent committee to ‘amass expertise’ on such a fast-moving environment would also support this.

3. Pornography should be included

This was a key part of our response to the draft Bill. In the draft, it would have been harder for children to access some social media platforms than to view extreme pornography. Porn sites would also have been able to remove user-generated content in order to take themselves out of the draft Bill’s scope. 

Under the new recommendations, all pornography sites would have a legal duty to prevent children accessing them, which is likely to require age-assurance procedures. Given the groundwork the BBFC laid when it was lined up to be the regulator of online pornography – before those plans were shelved – Ofcom should have the tools at its disposal to take on the role.

4. Gaming is still not mentioned...

We have called for gaming to be included within the scope of the Bill. 

Ofcom research found 70% of five to 15-year-olds play online games, rising to 86% of 12 to 15-year-olds. Gaming IS the internet for many children – so it would be impossible to claim the Bill will make the UK the “safest place in the world to go online” if it excludes a key online playground. 

Areas of risk include loot boxes and ‘dark nudge’ techniques that encourage in-game purchases – which, as we have demonstrated, come straight out of the gambling industry. 

We were disappointed to see virtually no mention of gaming within the committee’s report. 

The amended focus on platforms’ functionalities and business models may yet allow gaming to be covered in its scope. But the report’s single gaming recommendation is that ‘some’ gaming platforms could ‘potentially’ fall into Category 1 online service provider thresholds. This is an area we will continue to watch with interest.

5. … and neither are parents

Parents are absolutely integral to ensuring our younger generations grow up with the ability to use technology confidently and safely. We had hoped that this new report would make parents part of the solution, or acknowledge their prominence and responsibility within children’s online lives. Despite the committee’s emphasis on child safety being the heart of the report, there is no mention of parents when it comes to protecting children from online harms. 

We had previously suggested parents should be given the opportunity to consent to online services or transactions for their child, as well as the ability to lodge a complaint* on their child’s behalf. 

(* While this is a particularly disappointing omission, the report does recommend the introduction of a ‘digital ombudsman’ to deal with individual complaints against platforms. This may support parents submitting complaints for their children.)

6. End-to-end encryption will not be banned

End-to-end encryption means a message cannot be viewed by anyone except the sender and receiver. It is a popular feature of messaging platform WhatsApp – but also makes the sharing of illegal content and criminal activities such as child sexual abuse much harder to stop. The debate is a complex one, with some arguing that it is an important privacy feature.

While this kind of encryption can be valuable for security, it is important that tech firms ensure it doesn’t create an increased risk for children. To this end, the report does not suggest banning end-to-end encryption, but does recommend that it is considered a ‘risk factor’ during any platform’s risk assessment processes.

7. New criminal offences will be created

The report suggests creating a wide range of new criminal offences punishable under the Online Safety Bill. This includes making ‘cyberflashing’ – the sending of unwanted nude or explicit images – a crime, and making content promoting self-harm illegal. 

This is something the committee hopes will increase transparency and accountability. Self-harm content sits within the complex area of ‘legal but harmful’, making this new offence an interesting development. We will be listening with interest to the debates about how enforceable it will be in practice. 

What happens next?

Damian Collins MP told Parent Zone in a briefing that the Online Safety Bill is “not anti-tech”. Rather, it aims to give tech companies a clear framework to work to, helping all users – especially younger and more vulnerable users – stay safe online. 

Collins also suggests that the report’s recommendations be taken wholesale, as a package rather than “a menu to cherry-pick from”. He is clearly keen to avoid too much unpicking of the Bill when it arrives in parliament for debate and any pushback from freedom of speech advocates. 

While questions have been raised around Ofcom’s readiness to act as regulator, it has hired around 300 extra staff in the last few months in preparation – suggesting it is gearing up to invest significant resources in this new role. 

The government must respond to the committee’s report by February 2022. This will include an indication of its next steps in relation to the legislative committee’s recommendations.