New code aims to force tech giants to protect kids online

The UK’s data regulator has published a set of standards which it believes will force tech companies to take protecting children online seriously.

The Age Appropriate Design Code, drawn up by the Information Commissioner’s Office (ICO), covers everything from apps to connected toys, social media platforms to online games, and even educational websites and streaming services.

It makes clear that firms will be expected to make the data protection of young people a priority from the design stage up.

It is hoped the code will come into effect by the autumn of 2021 pending approval from parliament.

The provisions include setting privacy settings to high by default and ending nudge techniques that encourage users to lower them.

Location settings that allow the world to see where a child is should also be switched off by default. Data collection and sharing should be minimised, and profiling that can allow children to be served targeted content should be switched off by default too.

The Information Commissioner Elizabeth Denham described the code as “transformational” and told the PA news agency: “I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online.

“I think it will be as ordinary as keeping children safe by putting on a seat belt.”

Andy Burrows, head of child safety online policy at the NSPCC, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.

He said: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.

“It is now key that these measures are enforced in a proportionate and targeted way.”

Facebook, which has been under the spotlight for its approach to the safety of its users, said: “We welcome the considerations raised by the UK Government and Information Commissioner on how to protect young people online.

“The safety of young people is central to our decision-making, and we’ve spent over a decade introducing new features and tools to help everyone have a positive and safe experience on our platforms, including recent updates such as increased Direct Message privacy settings on Instagram.

“We are actively working on developing more features in this space and are committed to working with governments and the tech industry on appropriate solutions around topics such as preventing underage use of our platforms.”

The full 15 provisions in the Age Appropriate Design Code are:

1. The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child.

2. Firms should “assess and mitigate risks to the rights and freedoms of children” who are likely to access an online service, which arise from data processing. They should take into account differing ages, capacities and development needs.

3. A “risk-based approach to recognising the age of individual users” should be taken. This should either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from data processing, or apply the standards in this code to all users instead.

4. Privacy information provided to users “must be concise, prominent and in clear language suited to the age of the child”.

5. Children’s personal data must not be used in ways that have been “shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice”.

6. Uphold published terms, policies and community standards.

7. Settings must be set to “high privacy” by default.

8. Collect and retain “only the minimum amount of personal data” needed to provide the elements of the service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.

9. Children’s data must not be disclosed, unless a compelling reason to do so can be shown.

10. Geolocation tracking features should be switched off by default. Provide an “obvious sign for children when location tracking is active”. Options which make a child’s location visible to others must default back to off at the end of each session.

11. Children should be provided with age-appropriate information about parental controls. If an online service allows a parent or carer to monitor their child’s online activity or track their location, provide an “obvious sign to the child when they are being monitored”.

12. Switch options which use profiling off by default. Profiling should only be allowed if there are “appropriate measures” in place to protect the child from any harmful effects, such as content that is detrimental to their health or wellbeing.

13. Do not use nudge techniques to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.

14. Connected toys and devices should include effective tools to ensure they conform to the code.

15. Children should be provided with prominent and accessible tools to exercise their data protection rights and report concerns.
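For developers, the “by default” provisions (7, 10 and 12) amount to choosing safe initial values and resetting session-scoped visibility. A minimal sketch in Python of how those defaults could look, assuming a hypothetical `ChildAccountSettings` model (all names here are illustrative and not taken from the code itself):

```python
from dataclasses import dataclass


@dataclass
class ChildAccountSettings:
    # Provision 7: settings must be "high privacy" by default.
    privacy_level: str = "high"
    # Provision 10: geolocation tracking off by default.
    geolocation_enabled: bool = False
    location_visible_to_others: bool = False
    # Provision 12: profiling off by default.
    profiling_enabled: bool = False

    def end_session(self) -> None:
        # Provision 10: options making a child's location visible to
        # others must default back to off at the end of each session.
        self.location_visible_to_others = False


settings = ChildAccountSettings()
settings.location_visible_to_others = True  # child opts in during a session
settings.end_session()
assert settings.location_visible_to_others is False
```

The key design point is that the safe value is the zero-effort value: a child who never opens a settings screen gets high privacy, no location sharing and no profiling, and any per-session opt-in expires rather than persisting.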
