A sweeping set of regulations governing how online services should treat children’s data has been welcomed by campaigners as it comes into effect.
The Age Appropriate Design Code – written into law as part of the Data Protection Act 2018, the legislation that also implemented GDPR in the UK – requires websites and apps, from Thursday, to take the “best interests” of their child users into account or face fines of up to 4% of annual global turnover.
Unless they can prove their service is not likely to be used by children at all, companies now face a choice: make their entire offering compatible with the code, or attempt to identify younger users and treat them with greater care. The code prohibits “nudge” techniques designed to encourage children to give up more of their privacy than they would otherwise choose, calls on companies to minimise the data they collect about children, and requires them to offer children privacy settings that default to the highest level of protection.
“This shows tech companies are not exempt,” said Beeban Kidron, the baroness and campaigner who introduced the legislation that created the code. “This exceptionalism that has defined the last decade, that they are different, just disappears in a puff of smoke when you say, ‘actually, this is business. And business has to be safe, equitable, run along rules that at a minimum protect vulnerable users.’”
“This code will lead to changes that will help empower both adults and children,” said Elizabeth Denham, the information commissioner. “One in five UK internet users are children, but they are using an internet that was not designed for them. In our own research conducted to inform the direction of the code, we heard children describing data practices as ‘nosy’, ‘rude’ and a ‘bit freaky’.
“When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second nature as the need to ensure they eat healthily, get a good education or buckle up in the back of a car.”
In the weeks leading up to the code coming into force, a number of major tech platforms introduced significant changes to how they treat child users. TikTok restricted the sharing options available to younger users and disabled notifications from the app after bedtime for those under 18. Google introduced a policy letting anyone under 18, or their parents, request the removal of images from search results, and disabled its “location history” service, which keeps a record of users’ movements, entirely for children.
YouTube also updated its default privacy settings and turned off autoplay by default for all users aged 13-17, while a raft of changes at Facebook sees users under 18 exempted entirely from targeted advertising, given tighter default sharing settings, and protected from “potentially suspicious accounts” – adults who have previously been blocked by large numbers of young people on the site.
Many of the companies insisted, however, that the changes were not solely motivated by the code. A Google spokesperson said its updates extended beyond any single current or upcoming regulation, while a Facebook spokesperson said its update “wasn’t based on any specific regulation”.