
Three years since the publication of a Green Paper, and over 18 months since a White Paper, the Government has announced its final plans for a new Online Safety Bill. The Children’s Commissioner welcomes this groundbreaking legislation, which is a significant step forward in combatting online harms. There is now much work to be done to ensure that the new regulatory regime will be as effective as possible in upholding children’s rights online.

We urge Parliamentarians to press for the Bill to be introduced at the earliest opportunity in 2021, and to scrutinise the following three aspects of the proposals closely throughout the legislative process:

  1. Enforcement action

The legislation will introduce a statutory duty of care, overseen and enforced by Ofcom. Ofcom will have a range of powers to use against companies which breach their duty of care, including fines of up to £18 million or 10% of annual global turnover.[1]

The White Paper also proposed holding senior managers personally accountable for breaches of the duty of care. Although the Bill will make provision to impose criminal sanctions on senior managers, the power will not be immediately available to Ofcom – instead it would need to be introduced by Parliament via secondary legislation, and this would not happen until at least two years after the regulation has come into effect. Furthermore, the power could only be used as a sanction where platforms fail to share information with Ofcom.

Ofcom will be guided by a principle of proportionality when enforcing the duty of care. This means the sanctions imposed on a company will depend on the level of harm, as well as the size and resources of the platform.

Our response:

  2. Private communications

In recognition of the high proportion of child sexual exploitation and abuse which is perpetrated across private channels, the regulatory framework will apply to private messaging services, including those which are end-to-end encrypted.

The Bill will grant the regulator the power to compel companies to tackle illegal child sexual exploitation and abuse (CSEA) content and activity on their services. Where alternative measures are unavailable, the regulator will have the power to require a company to implement automated, “highly accurate” technology to identify illegal CSEA. This power is limited to instances where Ofcom has evidence of “persistent and prevalent” CSEA on a platform, where no alternative approaches to tackling this content exist, and where ministerial permission has been granted. Failure to comply with Ofcom’s request to use automated scanning will result in enforcement action against the company.

The Government suggests that the power to compel the use of automated technology is likely to be more “proportionate” on public platforms than on private ones.

Our response:

  3. Age verification/assurance

Some services and online content are not illegal, but are inappropriate for children (e.g. pornography).

All companies will be required to assess whether their platforms are “likely to be accessed” by children. Crucially, a service need not be aimed at children for it to be “likely to be accessed” by children. For example, if a site is targeted at users aged 13 or over, but under 13s make up part of the user base, then the “likely to be accessed” test could be met. This replicates the approach taken in the ICO’s Age Appropriate Design Code, which protects children’s data privacy and will come into effect in September 2021.

Services which are “likely to be accessed” by children will be required to conduct a child safety risk assessment and implement systems to protect children from experiencing harm. This might include using age verification and age assurance technology to restrict children’s access, or offering a differentiated, child-friendly service (e.g. restricting certain functions, such as end-to-end encrypted messaging, to adults).

Our response:

[1] Note that this goes further than GDPR fines, which are set at 20 million Euros (around £18 million) or 4% (not 10%) of annual global turnover.