
No time to waste in safeguarding children online

28 October 2022 / Emily Carter
Issue: 8000 / Categories: Features, Privacy, Technology, Child law, Health & safety
As the government tweaks the Online Safety Bill, Emily Carter highlights the importance of making progress
  • Looks at the current data protection law safeguarding children online.
  • Highlights the importance of continuing with the Online Safety Bill despite its inherent challenges.

It was back in 2019 that the government published the white paper for the Online Safety Bill, hailed as world-leading legislation intended to make the UK ‘the safest place in the world’ to be online. In early September, after three long years of consultation, scrutiny and debate, the then prime minister Liz Truss paused the Bill’s progress through the final stages of the legislative process so it could be ‘tweaked’. The recent conclusion of the inquest into the death of 14-year-old Molly Russell, however, has highlighted that there is no time for delay. Given the coroner found that Molly died ‘whilst suffering from depression and the negative effects of online content’, the government should remain focused on the task in hand.

Regulating online services was never going to be straightforward, and a diverse range of stakeholders is invested in getting the legislation right, especially concerning the regulation of ‘legal but harmful’ content accessed by adults. However, further delaying or entirely derailing the Online Safety Bill carries its own risks, with the greatest impact falling on children’s safety. Bearing in mind there is no such thing as a perfect compromise, is the proposed scheme of online safety regulation even workable? And what can be learned from the existing regulation of children’s privacy rights online?

Children’s online privacy rights

Data protection law has already made headway in protecting children online. Children are explicitly recognised as having all the same rights as adults, underpinned by Art 3 of the UN Convention on the Rights of the Child, which enshrines ‘the best interests of the child’. Although online platforms are entitled to rely upon the consent of a child aged 13 or over, all children under 18 require special protection. Specific obligations exist around lawful processing and transparency: for example, ensuring that privacy information is age appropriate and appears at the point of disclosure rather than being hidden within unintelligible small print. The UK General Data Protection Regulation (GDPR) also imposes specific duties concerning profiling and automated decision-making, both of which depend on data gathered about users.

The processing of children’s data has been a regulatory priority for the Information Commissioner’s Office (ICO) for a number of years. Children’s rights were advanced further a year ago when the ICO introduced the Age Appropriate Design Code (known as the Children’s Code), a statutory code of practice for online services ‘likely to be accessed’ by children. It applies to any company, whether or not based in the UK, which processes the data of UK children.

The Children’s Code includes a number of protections for children online, including risk assessment, age verification, measures to protect children from any harmful effects of profiling, and switching off geolocation services by default. The code embodies the principles of ‘privacy by design’ and ‘privacy by default’, requiring services to ‘build in’ online protections for children. In practice, the code has already led a number of platforms to change their default settings. It was used as the template for the California Age-Appropriate Design Code Act (due to take effect in July 2024), and various other jurisdictions are considering similar codes.

Although the Children’s Code does not have the force of law, it is used as a yardstick to measure compliance with the principles and obligations of data protection legislation. The ICO is assessing the compliance of more than 50 online services, has conducted nine audits and has four investigations ongoing. On 26 September, the ICO issued a notice of intent to impose a potential fine of £27m on TikTok concerning alleged failings around consent, transparency and the processing of ‘special category’ data.

The ICO has laid a sound foundation for further regulation which extends the obligations upon online services to put systems and processes in place to protect children from harmful material. Importantly, data protection regulation demonstrates how online safety regulation may operate in the real world.

How will the regulation work?

Regulating online content is inherently challenging due to the volume of content, the number of users, and the complexity and speed of technical development. When the white paper was published, there was no precedent for regulating user-generated content online. The stakeholders and their interests are diverse, spanning the technology industry, regulators, law enforcement, civil society and charities on the front line of child protection. The Bill must provide certainty and clarity for all of them, as well as the flexibility to evolve in future. It is a tall order.

Principles-based regulation of systems and processes allows the regulatory scheme to evolve over time as theory is applied to practice. As demonstrated by the implementation of the GDPR, detailed codes of practice and guidance will be developed in consultation with relevant stakeholders and adjusted over time in response to the issues facing the industry.

The Online Safety Bill applies to user-to-user services and search services with links to the UK, meaning any service with a significant number of UK users or which targets UK users. Some 25,000 online services will be within scope, with the weightiest obligations falling upon the 30–40 services presenting the greatest risk of harm. The Bill includes measures to protect all users, but those relevant to protecting children include:

  • requiring risk assessments to be undertaken;
  • regulating the availability of illegal content, especially child sexual abuse material;
  • imposing specific duties upon services likely to be accessed by children to put in place systems and processes to protect them from age-inappropriate content; and
  • requiring transparency reports setting out steps taken by services to tackle online harm.

Rather than moderating specific content, which would be inflexible and unworkable, the Bill focuses upon ensuring services have appropriate systems and processes with reference to the risks identified. This will include assessing the impact of algorithms on the content available to children. ‘Safety by design’ will need to be incorporated within service development, alongside existing requirements of ‘privacy by design’. Age verification will be an essential element of ensuring that children only access age-appropriate material, with specific provision for access to online pornography.

Ofcom has been given responsibility for regulating online safety, with a range of investigatory and enforcement powers, underpinned by criminal sanctions and individual senior management responsibility. Ofcom may impose fines of up to £18m or 10% of global annual turnover, whichever is higher (so a service with a global turnover of £1bn could face a fine of up to £100m), and it may also seek court orders restricting the provision of non-compliant services. Ofcom is also focused upon improving media literacy in this area, and in March published research on children’s and parents’ media use and attitudes to inform this work (see bit.ly/3FbnvRZ).

Ofcom has been actively planning for implementation of the new law from next spring and plans to recruit more than 300 additional staff. It will be standing shoulder to shoulder with the ICO: the two regulators have liaised closely on online safety, including as key members of the Digital Regulation Cooperation Forum established two years ago. With such a long lead time, it is no surprise that Ofcom is more than ready to press ahead with the critical work required to implement the legislation once passed.

No time to waste

Given our online world is defined by its dynamism, online safety regulation will necessarily be an iterative process. The government should not allow the inherently fraught debate concerning ‘legal but harmful’ adult content to delay the child protection provisions of the Bill from becoming law. Reviews of the online safety legislation are anticipated two and five years after enactment, providing the opportunity to adjust the legislation.

The organisations instrumental to the success of the legislation need certainty sooner rather than later in order to plan ahead. They will need to develop their internal systems and processes in line with the anticipated legislative obligations, especially the bigger social media platforms with existing self-regulation schemes. The safety technology industry needs clarity, especially around the age verification technology which will be critical to the legislation’s success. Those organisations on the front line of identifying and removing harmful content from the internet, such as the Internet Watch Foundation, need to understand their role within the new regulatory scheme.

The Online Safety Bill cannot pin down all the necessary detail required by the industry over coming years. The regulator must be trusted to get on with the important work of filling in the gaps of the legislation, including vital issues such as identification of priority harms to children. Ofcom has committed to listening to all stakeholders, and is subject to both parliamentary scrutiny and judicial oversight in the event of unreasonable or unlawful decision making.

Ofcom has scheduled full consultation with all stakeholders commencing in spring 2023 and envisages finalising guidance and codes concerning illegal harms and child protection in 2024. There is an enormous amount of work to be done as soon as the Online Safety Bill becomes law. Although our government has a bulging in-tray, there is no time to waste in ensuring those ‘tweaks’ are published and the legislation finalised.

Emily Carter is a partner in the Public Law team at Kingsley Napley, specialising in information law (www.kingsleynapley.co.uk).
