Social media companies make significant child privacy and safety changes as a result of legislation: New report

This innovative report documents the flurry of improvements that tech companies have made to protect children’s safety and privacy.
- Professor Sonia Livingstone

New legislation and regulations are driving social media companies such as Meta and TikTok to make major child safety and privacy changes, according to a new report, launched in the House of Lords today (Monday 20 May 2024).

The report, Impact of regulation on children’s digital lives, by researchers at the Digital Futures for Children centre at the London School of Economics and Political Science (LSE) and the 5Rights Foundation, found that 128 changes related to child safety and privacy were made between 2017 and 2024 by Meta, Google, TikTok and Snap.

A peak of 42 changes was logged across the four companies in 2021, the year the Age Appropriate Design Code (AADC) came into effect in the UK. Also known as the Children’s Code, the AADC is one of several key regulatory measures focused on protecting children from online harms, alongside the UK Online Safety Act (OSA) and, in the wider context of children’s rights, the United Nations General comment No. 25.

A significant proportion of the changes (63 of the 128) made by the four companies between 2017 and 2024 fell into the ‘by default’ category, where the design of a service is changed to provide default protections. For example, in July 2021 Instagram changed its default settings so that everyone under 16 (or under 18 in certain countries) is given a private account when they join.

The introduction of privacy and safety tools was the second most common area of change (37 of the 128 changes). Tools provide new mechanisms for users or parents to control how certain platform features work. For example, in 2021 TikTok introduced a ‘filter all comments’ feature, and in 2022 Instagram announced a tool allowing users to see their feeds in chronological order.

However, despite these positive steps, the report reveals that companies rely heavily on tools such as parental controls in response to legislation and regulation, even though evidence indicates these measures have low levels of use and efficacy. There is a risk of over-reliance on such tools to the exclusion of other changes.

Going forwards, the researchers have outlined eleven recommendations to improve child safety legislation and regulation and ensure children are better protected online. These include requiring companies to work across industry to introduce best practice rather than working separately; calling on regulators to publish their expectations of good practice; and introducing mandatory access to data for child safety research, with child safety changes recorded and logged transparently.

Commenting on the report, Professor Sonia Livingstone from the Department of Media and Communications at LSE and the Director of Digital Futures for Children (DFC) said: “No longer need the public wait for businesses to regulate themselves! This innovative report documents the flurry of improvements that tech companies have made to protect children’s safety and privacy following the introduction of legislation and regulation. Particularly welcome are the many ‘by default’ design changes, as these benefit everyone without relying on users turning them on.”

Steve Wood, founder of PrivacyX Consulting, former ICO Deputy Commissioner and author of the report, added: “This report illustrates the effective impact that regulation is having in protecting children’s safety and privacy online. The research highlights a shift towards substantive design changes that build in safeguards by default - from private account settings to restrictions on targeted advertising. We will repeat the research to assess what progress has been made in 2025 and have set our expectations about how the companies can be more transparent about the design changes they make.”

Behind the article

For more information, interviews or a copy of the report please contact media.relations@lse.ac.uk