
Keeping UK children safe online: have your say

In April 2019, the UK government published three policy and practice initiatives relating to child exploitation and online safety:

  • Online Harms White Paper (Department for Digital, Culture, Media and Sport and Home Office), published 8 April 2019;
  • Age appropriate design: a code of practice (Information Commissioner's Office), published 15 April 2019; and
  • Child Exploitation Disruption Toolkit (Home Office), published 15 April 2019.

Public consultations are now open for the Online Harms White Paper and Age appropriate design, giving you the chance to have your say on government standards and policy which will affect us all and our organisations. You can also provide feedback on the Child Exploitation Disruption Toolkit.

Your insights are invaluable to ensuring that the best possible safeguards and policies are in place to protect and support the children you work with.

Below, you can read a brief outline of each document and why they are significant for professionals who work with children.

 

Online Harms White Paper


What is it?

The Online Harms White Paper sets out the government's plan to tackle both legal and illegal online harms, to keep UK internet users safer online.

It puts forward plans for a new system of accountability and oversight for tech companies, moving beyond self-regulation. A new regulatory framework for online safety will make clear companies' responsibilities to keep users safer online. An independent regulator will implement, oversee and enforce the new regulatory framework.

Why is it important?

The UK will be the first country to establish a regulatory framework that tackles this range of online harms, helping to make the internet a safer place for children.

For the first time, social media platforms and online services will be held accountable if they fail to tackle a range of online harms. The framework seeks to make the online world safer for children by introducing standards that social media platforms and online services must adhere to. Among other things, these will require companies to:

  • Make sure that terms and conditions are clear and accessible to children and vulnerable users;
  • produce annual transparency reports that outline the harmful content present on their platforms and the countermeasures they are taking to address it. (These reports will be published online by the regulator, so users and parents can read them and make informed decisions about internet use);
  • be honest about any design practices used on their platforms that encourage prolonged screen time e.g. 'infinite scroll' that loads content continuously as the user scrolls down the page; and
  • invest in the development of safety technologies, such as effective age verification, to reduce the burden on younger users to stay safe online.

All tech companies will have to follow the new regulatory framework. The independent regulator will have the power to take action against companies that breach their statutory duty of care. This may include substantial fines or the imposition of liability on members of senior management.

Have your say

The government has invited both individuals and organisations to provide their views by responding to the consultation questions set out throughout the Online Harms White Paper. You can respond to the consultation here.

The consultation is open until 1 July 2019.

 

Age appropriate design: a code of practice


What is it?

Age appropriate design: a code of practice sets out the design standards expected of providers of online services and apps likely to be used by children. This document has been developed by the Information Commissioner's Office (ICO) – the UK's independent body established to uphold information rights. This design code is a requirement of the Data Protection Act 2018.

The Age Appropriate Design Code sets out 16 standards of age appropriate design for online services such as apps, connected toys, social media platforms, online games, educational websites and streaming services.

Why is it important?

The standards set out in the Age Appropriate Design Code aim to make the internet a safer place for children and young people, placing responsibility on online services to ensure that the 'best interests of the child' are their primary consideration when designing and developing online services. The Age Appropriate Design Code will direct online services to:

  • Make sure that their design is transparent, and that privacy information is clear and accessible to children;
  • keep children's personal data secure, and not use or share it in ways shown to be detrimental to a child's wellbeing;
  • set the settings of a site or app as 'high privacy' by default;
  • switch geolocation off by default; and
  • stop using 'nudge techniques' to lead or encourage children to provide unnecessary personal data, turn off privacy protections, or extend use.

These standards aim to make sure that children's personal information is more secure online, and minimise their exposure to online harms.

The ICO explains that it has a statutory duty to take the provisions of the code into account when enforcing the GDPR and PECR. Penalties can be imposed on online services that fail to follow the code, including fines of up to 20 million euros or 4% of the company's annual worldwide turnover (whichever is higher).

Where it sees harm or potential harm to children, the ICO is likely to take more severe action against a company that has placed children at this level of risk.

Have your say

The Information Commissioner is calling for evidence and views on the Age Appropriate Design Code. In particular, the Commissioner is interested in evidence-based submissions from: bodies representing the views of children or parents; child development experts; providers of online services likely to be accessed by children; and trade associations representing such providers. You can submit your views here.

The consultation is open until 31 May 2019.

 

Child Exploitation Disruption Toolkit


What is it?

The Child Exploitation Disruption Toolkit has been created for front-line staff working to safeguard children under the age of 18 from sexual and criminal exploitation, both offline and online. This includes professionals working in law enforcement, social care, education, housing, the voluntary sector, and related partner organisations.

This toolkit is intended to help all safeguarding partners to understand and use the legislative options at their disposal to target specific risks, ranging from warning notices to offence charges and care orders.

It is split into 6 areas of law enforcement and other agency activity:

  1. Abduction and trafficking;
  2. sexual offences;
  3. victim care;
  4. behaviour;
  5. location; and
  6. 'other options'.

In addition to the 6 areas, it includes best practice guidance for professionals on information sharing and multi-agency working.

Certain sections of the toolkit will be more relevant to those working in law enforcement than those working in other sectors.

Why is it important?

It's important to be aware of the legislative options available across different agencies, to make sure that the best possible safeguards are in place for the children you work with. You can also use the toolkit to inform your own organisation's safeguarding policies and procedures, and share it amongst colleagues.

It also sets out other agencies' responsibilities, so that when safeguarding a child it is clear who to involve and what information may be appropriate to share across agencies.

Have your say

If you would like to share your feedback about the document, or receive notifications about future iterations of the toolkit, you can email CEtoolkit@homeoffice.gov.uk.

For up-to-date information on the latest legislation, policy, research and guidance relating to children and young people online, refer to the Thinkuknow guidance area for professionals.

Related Resources


Online Harms White Paper

Read the government’s plans to tackle online harms and make companies more responsible for their users’ safety online, especially children and other vulnerable groups.