
Compliance Briefing — September 2, 2020

Implementation guide for the UK ICO's Age Appropriate Design Code transition year, covering applicability assessments, age assurance, default privacy engineering, and accountability evidence.

Executive briefing: The UK Information Commissioner’s Office (ICO) brought the Age Appropriate Design Code (Children’s Code) into force on 2 September 2020, opening a 12-month transition period before enforcement begins. Online services likely to be accessed by children must implement high-privacy defaults, data minimisation, and safeguards by 2 September 2021. Product, legal, security, and engineering leaders must orchestrate cross-functional programmes to classify child users, redesign data flows, and demonstrate accountability under the UK GDPR.

Enforcement timeline and regulatory expectations

The transition year that started on 2 September 2020 gave organisations until 2 September 2021 to implement the Children’s Code. During this period the ICO expected visible progress: applicability assessments, DPIAs, roadmap publication, and early rollouts of age-appropriate defaults. Post-grace-period enforcement relies on the ICO’s UK GDPR toolkit—information notices, assessment notices, enforcement notices (including orders to stop processing), and fines of up to £17.5 million or 4% of global annual turnover, whichever is higher. The ICO has signalled active oversight through investigations and actions against platforms whose age assurance, transparency, or data sharing fell short of the code.

Teams should work toward quarterly milestones ahead of any new product launch: (1) scope and risk segmentation, (2) DPIA and design approvals, (3) implementation of privacy safeguards, (4) validation and evidence gathering. Coordinate with legal to ensure board-level sign-off for high-risk processing affecting children and to prepare for ICO engagement, including potential participation in the ICO sandbox.

Compliance requirements and technical implementation

The Children’s Code contains 15 standards: best interests of the child, data protection impact assessments, age appropriate application, transparency, detrimental use of data, policies and community standards, default settings, data minimisation, data sharing, geolocation, parental controls, profiling, nudge techniques, connected toys and devices, and online tools. Treat the standards as cumulative: falling short on any material standard can undermine compliance.

Conduct service mapping to determine where children are likely to access the product. Indicators include analytics showing significant under-18 usage, app store categories, marketing materials, and comparisons with similar services. Document applicability decisions in Article 30 records and keep evidence (e.g., survey data, regulator guidance) for audit trails. Where services are mixed-audience, design differentiated experiences with gating and content controls to prevent inadvertent exposure of children to adult-facing features.
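
As a minimal sketch, an applicability decision can be captured as a structured record alongside the Article 30 register; the field names and evidence references below are illustrative assumptions rather than an ICO-prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ApplicabilityAssessment:
    """Hypothetical record of a Children's Code applicability decision for audit trails."""
    service_name: str
    assessed_on: date
    likely_accessed_by_children: bool
    rationale: str                                             # analytics, app store category, marketing evidence
    evidence_refs: List[str] = field(default_factory=list)     # survey exports, regulator guidance
    mixed_audience: bool = False
    gating_controls: List[str] = field(default_factory=list)   # age gates, content controls
    reviewer: str = ""
    next_review: Optional[date] = None

# Example entry for a mixed-audience service with significant under-18 usage.
assessment = ApplicabilityAssessment(
    service_name="example-video-service",
    assessed_on=date(2020, 9, 2),
    likely_accessed_by_children=True,
    rationale="Analytics indicate a significant share of weekly active users are under 18",
    evidence_refs=["analytics/2020-q3-age-breakdown.csv"],
    mixed_audience=True,
    gating_controls=["age gate at registration", "adult-facing features hidden for under-18 profiles"],
    reviewer="dpo@example.com",
    next_review=date(2021, 3, 1),
)
```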

Operationalise compliance through technical controls. Build configuration flags that enable child-specific defaults: disable precise geolocation, set profiles to private, restrict contact discovery to verified users, and block third-party sharing unless strictly necessary. Implement consent and transparency flows that are concise, age-appropriate, and tested for comprehension. Ensure all data collection events include lawful basis annotations and retention rules, and block telemetry that is not essential for delivering the service.
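
One way to express those child-specific defaults is as a single configuration object consumed at account creation. A minimal sketch follows; the flag names are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyDefaults:
    precise_geolocation: bool
    public_profile: bool
    contact_discovery: str      # "off", "verified_only", or "everyone"
    third_party_sharing: bool
    behavioural_ads: bool

# High-privacy bundle applied whenever an account is classified as a child account.
CHILD_DEFAULTS = PrivacyDefaults(
    precise_geolocation=False,   # coarse region only
    public_profile=False,        # profile private by default
    contact_discovery="verified_only",
    third_party_sharing=False,   # blocked unless strictly necessary for the service
    behavioural_ads=False,
)

def defaults_for(is_child_account: bool, adult_defaults: PrivacyDefaults) -> PrivacyDefaults:
    """Select the settings bundle used when provisioning a new account."""
    return CHILD_DEFAULTS if is_child_account else adult_defaults
```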

Age assurance and user journey redesign

Implement proportionate age assurance to distinguish users under and over 18. Combine self-declaration with contextual signals (device settings, language, time-of-day use patterns) while minimising intrusion. For higher-risk use cases—social networking, livestreaming, in-app purchases—layer additional verification, such as credit reference checks or third-party age estimation, that aligns with ICO guidance and avoids excessive data retention.
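
A sketch of that proportionality logic, assuming three assurance tiers and a hypothetical list of higher-risk features; the tiers and feature names are illustrative, not ICO-defined.

```python
from enum import Enum

class AssuranceLevel(Enum):
    SELF_DECLARED = 1         # age stated at signup, minimal friction
    SIGNAL_CORROBORATED = 2   # declaration consistent with contextual signals
    VERIFIED = 3              # third-party age estimation or credit reference check

HIGH_RISK_FEATURES = {"livestreaming", "social_messaging", "in_app_purchases"}

def required_assurance(feature: str) -> AssuranceLevel:
    """Only higher-risk features demand stronger, more intrusive checks."""
    if feature in HIGH_RISK_FEATURES:
        return AssuranceLevel.VERIFIED
    return AssuranceLevel.SIGNAL_CORROBORATED

def can_access(feature: str, user_level: AssuranceLevel) -> bool:
    return user_level.value >= required_assurance(feature).value
```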

Redesign registration, onboarding, and parental dashboards to provide layered notices with plain language, icons, and just-in-time prompts. Avoid dark patterns that encourage children to weaken privacy protections or overshare data. Offer clear parental controls with audit trails that show when privacy settings were changed and by whom. Ensure exit paths from prompts are equally prominent to avoid nudging toward riskier options.
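
The audit trail behind a parental dashboard can be as simple as an append-only log of settings changes; the sketch below assumes JSON log lines and illustrative field names.

```python
import json
from datetime import datetime, timezone

def log_setting_change(account_id: str, setting: str, old_value, new_value, changed_by: str) -> str:
    """Build an append-only audit entry recording who changed a privacy setting and when."""
    entry = {
        "account_id": account_id,
        "setting": setting,
        "old_value": old_value,
        "new_value": new_value,
        "changed_by": changed_by,   # e.g. "child", "parent", "support_agent"
        "changed_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(entry)

# Example: a parent relaxing contact discovery is recorded for the dashboard and for audit.
print(log_setting_change("acct-123", "contact_discovery", "off", "verified_only", "parent"))
```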

Data minimisation, default privacy, and profiling controls

Inventory every data element collected from child users and challenge necessity. Remove non-essential fields, truncate precise location to coarse regions, and implement automated deletion schedules tied to inactivity or service completion. Configure defaults to maximum privacy: private profiles, disabled public searchability, deactivated ad identifiers, and off-by-default data sharing with third parties. When functionality requires sharing (e.g., multiplayer matchmaking), expose clear opt-in controls at the moment of need and log user choices for accountability.
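
Two of those controls lend themselves to small, testable functions: coarsening precise location before storage and applying an inactivity-based deletion rule. The grid size and retention window below are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def coarsen_location(lat: float, lon: float, grid_degrees: float = 0.5):
    """Snap precise coordinates to a coarse grid (roughly 55 km at 0.5 degrees) before storage."""
    snap = lambda value: round(value / grid_degrees) * grid_degrees
    return snap(lat), snap(lon)

def due_for_deletion(last_active: datetime, retention_days: int = 365) -> bool:
    """Inactivity-based deletion rule; last_active is expected to be a timezone-aware UTC timestamp."""
    return datetime.now(timezone.utc) - last_active > timedelta(days=retention_days)
```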

Disable behavioural advertising for child accounts and restrict recommendation systems to content curation that demonstrably serves the best interests of the child (e.g., educational progression). Provide accessible toggles for profiling features and record legitimate interest assessments. Where profiling is required for safety (fraud or abuse detection), document necessity and proportionality, and maintain human oversight to reduce adverse impacts.
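
A minimal policy gate for profiling might look like the sketch below, which hard-disables behavioural advertising for child accounts, carves out documented safety profiling, and requires a logged opt-in for everything else; the purpose names are assumptions.

```python
SAFETY_PURPOSES = {"fraud_detection", "abuse_detection"}

def profiling_allowed(purpose: str, is_child_account: bool, user_opted_in: bool) -> bool:
    """Per-purpose gate for profiling features on an account."""
    if is_child_account and purpose == "behavioural_advertising":
        return False          # never permitted for child accounts
    if purpose in SAFETY_PURPOSES:
        return True           # necessity and proportionality documented separately, with human oversight
    return user_opted_in      # all other profiling requires an explicit, logged opt-in
```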

Security, incident response, and vendor management

Children’s data warrants heightened protection. Enforce multi-factor authentication for administrative access, apply role-based access controls, encrypt data in transit and at rest, and monitor for anomalous access patterns. Incorporate child-specific scenarios into incident response plans, including parental notification templates, rapid ICO reporting procedures, and data minimisation steps during containment.
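
As an illustration of the access-control and monitoring point, a role-based gate over child records can log every attempt for anomaly detection; the roles and log fields below are assumptions.

```python
import logging

logger = logging.getLogger("child_data_access")

ALLOWED_ROLES = {"trust_and_safety", "dpo", "support_tier2"}

def may_read_child_record(record_id: str, actor_id: str, actor_role: str, mfa_verified: bool) -> bool:
    """Role-based gate requiring MFA, emitting an audit log line for every access attempt."""
    granted = actor_role in ALLOWED_ROLES and mfa_verified
    logger.info("child_record_access record=%s actor=%s role=%s granted=%s",
                record_id, actor_id, actor_role, granted)
    return granted
```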

Vet third-party processors, SDKs, and advertising partners to ensure they honour privacy settings and do not transmit child data without lawful basis. Update contracts with child data protection clauses, audit rights, and incident notification timelines aligned to the code. Maintain a vendor register with data flow diagrams, access scopes, and review cadences, and remove unused SDK permissions that could expose children to tracking or profiling.
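
The vendor register itself can be kept as structured data so reviews and data-flow diagrams stay linked to each processor; the fields below are an illustrative sketch, not a required format.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class VendorEntry:
    """Hypothetical register entry for an SDK or processor that may touch child data."""
    name: str
    purpose: str
    data_categories: List[str]            # e.g. ["device_id", "coarse_location"]
    honours_child_defaults: bool
    contract_has_child_clauses: bool
    incident_notification_hours: int      # contractual notification window
    data_flow_diagram: str                # link to the current diagram
    next_review: date

def overdue_reviews(register: List[VendorEntry], today: date) -> List[VendorEntry]:
    """Return vendors whose scheduled review date has passed."""
    return [vendor for vendor in register if vendor.next_review < today]
```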

Compliance validation, testing, and continuous improvement

Run Data Protection Impact Assessments (DPIAs) tailored to child users. Identify high-risk processing—profiling, geolocation, user-generated content visibility—and score mitigations. Incorporate child rights impact assessments to balance commercial objectives with the best interests of the child, referencing UN Convention on the Rights of the Child principles.
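
DPIA scoring is often reduced to a likelihood-times-severity matrix with a residual score after mitigations; the sketch below uses 1–5 scales and an assumed mitigation-effectiveness factor purely for illustration.

```python
def residual_risk(likelihood: int, severity: int, mitigation_effectiveness: float) -> float:
    """Inherent risk (likelihood x severity on 1-5 scales) reduced by mitigation effectiveness (0.0-1.0)."""
    return likelihood * severity * (1 - mitigation_effectiveness)

# Example: public visibility of child-generated content, partially mitigated by default-private posts.
score = residual_risk(likelihood=4, severity=5, mitigation_effectiveness=0.6)
print(score)  # 8.0 on a 25-point scale
```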

Invest in participatory research with children and parents to evaluate comprehension of notices, clarity of parental controls, and usability of privacy settings. Conduct moderated usability studies across age cohorts (7–9, 10–12, 13–15, 16–17) and capture comprehension gaps. Supplement qualitative findings with A/B experiments comparing high-privacy defaults versus legacy experiences, measuring retention, complaint rates, and safety incidents. Feed results into release checklists and regression suites to ensure no privacy regressions are introduced during updates.
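
For the quantitative comparison, a two-proportion test on complaint rates between the high-privacy-default cohort and the legacy cohort is a reasonable starting point; the sketch below uses only the standard library and assumes simple randomised assignment.

```python
from math import sqrt
from statistics import NormalDist

def complaint_rate_delta(complaints_a: int, users_a: int, complaints_b: int, users_b: int):
    """Two-proportion z-test comparing complaint rates in cohort A (high-privacy defaults) vs B (legacy)."""
    rate_a, rate_b = complaints_a / users_a, complaints_b / users_b
    pooled = (complaints_a + complaints_b) / (users_a + users_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / users_a + 1 / users_b))
    z = (rate_a - rate_b) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_a - rate_b, p_value

# Example: fewer complaints per user in the high-privacy cohort.
print(complaint_rate_delta(complaints_a=30, users_a=10_000, complaints_b=55, users_b=10_000))
```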

Risk mitigation and accountability evidence

Establish governance that links senior accountability to delivery of the Children’s Code. Assign product, engineering, security, and legal owners for each standard. Track progress through dashboards that measure adoption of privacy settings, volume of child safety incidents, time-to-close for user complaints, and training completion rates. Provide regular board updates summarising residual risks and remediation timelines.

Maintain audit-ready documentation: DPIA outputs, design decision logs, policy updates, incident postmortems, and training records. For global platforms, reconcile UK Children’s Code controls with the EU Digital Services Act, forthcoming EU AI Act safeguards, and any local age verification mandates to prevent fragmented experiences. Prepare for ICO inquiries by keeping evidence packages that show how best-interests assessments were made and how child feedback informed design choices.

Communications and stakeholder engagement

Publish clear privacy information for children and parents, including FAQs, video explainers, and accessible complaint channels. Coordinate with customer support to provide scripts and escalation paths for privacy concerns. Engage with industry bodies (techUK, ISBA, UKIE) and child safety NGOs to benchmark practices and stay ahead of evolving guidance. Where appropriate, test new approaches in the ICO’s sandbox to validate proportionality of age assurance or transparency methods.

Follow-up: The enforcement grace period ended in September 2021; the ICO has since taken actions against platforms, including a £12.7 million fine against TikTok in April 2023 for unlawful processing of children’s data, and continues to refine design guidance for immersive and high-risk environments.

Sources

Timeline of source publication dates, sized by credibility (2 publication timestamps supporting this briefing).
Credibility scores for each source cited in this briefing.
