Compliance Briefing — September 2, 2020
The UK Information Commissioner’s Office’s Age Appropriate Design Code came into force on 2 September 2020, opening a 12-month transition period for online services to harden privacy defaults and profiling controls for children before enforcement begins.
Executive briefing: The UK Information Commissioner’s Office (ICO) brought the Age Appropriate Design Code (Children’s Code) into force on 2 September 2020, starting a 12-month transition period. Online services likely to be accessed by children must implement high-privacy defaults, data minimisation, and related safeguards by 2 September 2021, when the ICO can begin enforcement. Product, legal, and engineering leaders must orchestrate cross-functional programmes to classify child users, redesign data flows, and demonstrate accountability under the UK GDPR.
Determine applicability and risk segmentation
The Children’s Code applies to information society services (ISS) provided to, or likely to be accessed by, UK children under 18, including apps, social media platforms, online games, streaming services, connected toys, and educational technology. Assess whether the service is “likely to be accessed” using ICO indicators such as user analytics, market research, app store listings, and the demographics of comparable services. Even services targeting adults must consider incidental child access if analytics reveal significant usage by children.
Conduct a risk-based segmentation exercise. Categorise services into tiers—core child-directed, mixed audience, and adult-oriented with incidental access. Document the rationale, evidence sources, and thresholds. For multinational platforms, map regional variants (e.g., .co.uk domain, EU/US versions) and determine whether geolocation, age gating, or service segmentation is feasible. Engage data protection officers (DPOs) early to validate applicability assessments and record decisions in your Article 30 records.
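As an illustration of how such a segmentation exercise might be codified, the following Python sketch assigns a tier from hypothetical evidence inputs; the field names and thresholds are assumptions for illustration, not ICO-mandated criteria.

    from dataclasses import dataclass

    @dataclass
    class ServiceEvidence:
        # Evidence inputs for the "likely to be accessed" assessment.
        # All fields and thresholds below are illustrative assumptions.
        child_user_share: float       # share of users under 18, from analytics
        child_directed_content: bool  # content or marketing aimed at children
        store_age_rating: int         # app store age rating, e.g. 4, 9, 12, 17

    def segment_service(ev: ServiceEvidence) -> str:
        # Assign a risk tier; record the rationale and evidence alongside.
        if ev.child_directed_content or ev.store_age_rating <= 9:
            return "core child-directed"
        if ev.child_user_share >= 0.10:  # illustrative threshold, not ICO guidance
            return "mixed audience"
        return "adult-oriented, incidental access"

    print(segment_service(ServiceEvidence(0.04, False, 17)))
    # -> "adult-oriented, incidental access"; still document evidence sources

Whatever the implementation, the tier assignment should be reproducible from documented evidence so the DPO can validate it and the Article 30 record stays auditable.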
Build a compliance programme aligned to the 15 standards
The Children’s Code comprises 15 cumulative standards covering best interests, data minimisation, default privacy, data sharing, profiling, geolocation, parental controls, transparency, nudge techniques, connected toys, and online tools. Establish a programme office led by privacy, product, and engineering stakeholders. For each standard, assign owners, define maturity targets, and develop roadmaps.
Conduct Data Protection Impact Assessments (DPIAs) specific to child users. Identify processing operations presenting high risk—profiling, behavioural advertising, location tracking—and evaluate mitigations. Incorporate child rights impact assessments to balance commercial objectives with the best interests of the child, referencing UN Convention on the Rights of the Child principles. Document residual risks, escalation decisions, and sign-off by senior management.
Age assurance and user journey redesign
Implement proportionate age assurance mechanisms to distinguish users under 18 and apply appropriate protections. Combine self-declaration with contextual signals (device settings, usage patterns) and, where high-risk processing occurs, integrate third-party age verification or credit reference checks compliant with UK standards. Ensure friction is proportionate: high-risk services (social networking, live streaming) may require stronger verification, while low-risk educational resources might rely on self-declaration combined with parental notifications.
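A minimal sketch of how proportionate escalation could be expressed in code follows; the method names, risk levels, and escalation rule are assumptions for illustration rather than a prescribed mechanism.

    from enum import Enum

    class Risk(Enum):
        LOW = 1
        HIGH = 2

    def assurance_step(service_risk: Risk, declared_age: int,
                       signals_suggest_child: bool) -> str:
        # Escalate assurance strength with processing risk; the specific
        # method names and rules here are illustrative assumptions.
        if service_risk is Risk.HIGH:
            return "third-party age verification"
        if declared_age < 18 or signals_suggest_child:
            # Treat as a child account and apply high-privacy defaults.
            return "self-declaration plus parental notification"
        return "self-declaration"

    print(assurance_step(Risk.LOW, 15, False))
    # -> "self-declaration plus parental notification"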
Redesign registration and onboarding flows to provide clear, child-friendly transparency. Use layered notices with plain language, icons, and contextual prompts explaining data collection and usage. Provide accessible parental resources explaining controls, data practices, and complaint channels. Avoid dark patterns—nudges encouraging children to weaken privacy—by aligning UI/UX with the code’s prohibition on nudge techniques detrimental to best interests.
Default privacy, data minimisation, and profiling controls
Configure default privacy settings to the highest level for child accounts. Disable precise geolocation, public friend lists, and data sharing with third parties unless demonstrably necessary for the service. Where functionality requires sharing (e.g., multiplayer gaming), provide clear explanations and opt-in choices at the point of use. Log user selections and maintain audit trails for accountability.
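The sketch below shows one way to pair high-privacy defaults with an append-only audit record of point-of-use opt-ins; the setting names and record schema are hypothetical.

    import json
    from datetime import datetime, timezone

    # Illustrative high-privacy defaults for a child account; the setting
    # names are assumptions, not drawn from any particular platform.
    CHILD_DEFAULTS = {
        "precise_geolocation": False,
        "public_friend_list": False,
        "third_party_sharing": False,
        "behavioural_ads": False,
    }

    def record_opt_in(user_id: str, setting: str, value: bool, reason: str) -> str:
        # Serialise an audit entry; in practice, write to an append-only store.
        entry = {
            "user_id": user_id,
            "setting": setting,
            "value": value,
            "reason": reason,  # e.g. "multiplayer matchmaking requires sharing"
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        return json.dumps(entry)

    print(record_opt_in("u123", "third_party_sharing", True, "multiplayer matchmaking"))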
Review data minimisation: inventory personal data collected, evaluate necessity, and eliminate fields not essential for core service delivery. Implement retention policies with automatic deletion or anonymisation schedules. For profiling, disable behavioural advertising and limit recommendation algorithms unless they demonstrably support the child’s best interests (e.g., educational content curation). Provide accessible toggles for profiling features and document legitimate interest assessments.
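A retention schedule with automatic expiry checks might look like the following sketch; the data categories and durations are placeholders to be set by your own necessity assessment.

    from datetime import datetime, timedelta, timezone

    # Illustrative retention periods per data category; durations are
    # placeholders, not regulatory minima or maxima.
    RETENTION = {
        "chat_messages": timedelta(days=90),
        "gameplay_telemetry": timedelta(days=30),
        "support_tickets": timedelta(days=365),
    }

    def is_expired(category: str, collected_at: datetime) -> bool:
        # True when a record has outlived its retention period and should be
        # deleted or anonymised by the scheduled job.
        return datetime.now(timezone.utc) - collected_at > RETENTION[category]

    old = datetime.now(timezone.utc) - timedelta(days=120)
    print(is_expired("chat_messages", old))  # -> True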
Security, incident response, and vendor management
Enhance technical and organisational measures tailored to children’s data sensitivity. Enforce multi-factor authentication for administrative access, encrypt data in transit and at rest, and monitor for anomalous access patterns. Update incident response plans to include child-specific notification considerations, including parental communication templates and ICO reporting timelines.
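As one simple expression of such monitoring, the sketch below flags administrator accounts whose daily access count exceeds a multiple of their historical baseline; the 3x threshold and event shape are illustrative assumptions, not a production detector.

    from collections import Counter

    def flag_anomalous_access(events: list, baselines: dict) -> list:
        # events: [{"admin_id": ...}, ...] for one day;
        # baselines: {"admin_id": average daily access count}.
        # The 3x multiplier is an illustrative threshold.
        counts = Counter(e["admin_id"] for e in events)
        return [a for a, n in counts.items() if n > 3 * baselines.get(a, 1)]

    events = [{"admin_id": "ops1"}] * 10 + [{"admin_id": "ops2"}] * 2
    print(flag_anomalous_access(events, {"ops1": 2, "ops2": 2}))  # -> ["ops1"]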
Review third-party processors, SDKs, and advertising partners for compliance with the code. Update contracts to include child data safeguards, audit rights, and incident reporting obligations. Conduct technical due diligence on embedded software development kits (SDKs) to ensure they respect privacy settings and do not transmit data off-platform without lawful basis. Maintain a vendor register specifying child data access rights and review cadences.
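A vendor register entry could be modelled along the lines of the sketch below; the field names are hypothetical and should be aligned with your contract schedules.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class VendorRecord:
        # One row of a hypothetical vendor register for child-data access;
        # the field names are illustrative assumptions.
        name: str
        sdk_ids: list
        child_data_categories: list   # e.g. ["device_id", "usage_events"]
        contract_has_child_safeguards: bool
        audit_rights: bool
        next_review: date

    register = [VendorRecord("AdNet Ltd", ["adnet-ios-sdk"], ["device_id"],
                             True, True, date(2021, 3, 1))]
    overdue = [v.name for v in register if v.next_review < date.today()]
    print(overdue)  # vendors whose review cadence has lapsed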
Training, governance, and accountability documentation
Develop training modules for engineers, designers, product managers, and customer support teams covering child privacy principles, dark pattern avoidance, and escalation procedures. Include scenario-based exercises illustrating ethical dilemmas and regulatory expectations. Track completion rates and refresh training annually.
Update governance documentation: revise privacy policies, terms of service, DPIA templates, and standard operating procedures to reference the Children’s Code. Maintain audit-ready evidence, including design decision logs, testing results, and user research findings demonstrating child input. Establish metrics dashboards monitoring child safety incidents, privacy complaints, and control adoption rates.
Monitoring, enforcement horizon, and future developments
Set quarterly checkpoints leading to the enforcement deadline. Milestones should cover age assurance deployment, privacy setting rollouts, DPIA completion, vendor contract updates, and communication campaigns. Engage with the ICO’s regulatory sandbox, industry associations (Ukie, ISBA, techUK), and child safety NGOs to benchmark practices and share insights.
Monitor regulatory developments post-Brexit, including potential divergence between the UK Children’s Code and EU initiatives such as the Digital Services Act, ePrivacy reform, and the forthcoming EU AI Act, which may introduce additional safeguards for minors. Prepare for the ICO’s enforcement toolkit of information notices, assessment notices, and fines of up to £17.5 million or 4% of annual worldwide turnover, whichever is higher, and integrate compliance status into enterprise risk dashboards.
Key actions for leadership
Leadership should champion a child-centric design ethos that balances innovation with duty of care. Allocate budget for privacy-by-design tooling, user research with children and parents, and external advisory support. Tie executive objectives to successful code implementation, and report progress to the board and regulators. Transparent communication with parents and child users will build trust and differentiate the service in a market increasingly scrutinised for its treatment of young audiences.
User testing and child participation
Invest in participatory design research with children and parents to validate whether privacy notices, parental dashboards, and in-product controls are intuitive. Run moderated usability sessions covering different age cohorts (e.g., 7–9, 10–12, 13–15) and document comprehension gaps. Capture analytics on how often children adjust privacy settings post-launch and iterate based on behavioural data. Collaborate with child advocacy groups or academic experts to evaluate the psychological impact of nudges and reward mechanics, ensuring they align with the code’s best-interests principle.
Complement qualitative research with quantitative testing: deploy A/B experiments comparing high-privacy defaults versus legacy experiences, measuring retention, satisfaction, and support ticket volume. Use the findings to defend design choices in ICO audits and demonstrate continuous improvement.
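For instance, retention under the two variants can be compared with a two-proportion z-test, as in the sketch below; the metric, sample counts, and significance handling are assumptions and should follow a pre-registered analysis plan.

    from math import sqrt

    def retention_z(ret_a: int, n_a: int, ret_b: int, n_b: int) -> float:
        # Two-proportion z-statistic comparing, say, 30-day retention under
        # high-privacy defaults (A) against the legacy experience (B).
        p_a, p_b = ret_a / n_a, ret_b / n_b
        p = (ret_a + ret_b) / (n_a + n_b)             # pooled proportion
        se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
        return (p_a - p_b) / se

    # Hypothetical counts: 4,100/10,000 retained on defaults vs 4,250/10,000 legacy.
    print(round(retention_z(4100, 10_000, 4250, 10_000), 2))  # -> -2.15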
Follow-up: The transition period ended on 2 September 2021; the ICO has since taken enforcement action against TikTok, Snap, and other platforms, and in 2024 it opened consultations on tightening design guidance for immersive environments.
Sources
- Age Appropriate Design Code (Children's Code) — Information Commissioner's Office; statutory code setting out 15 standards for online services likely to be accessed by children.
- Age appropriate design: a code of practice for online services (full PDF) — Information Commissioner's Office; sets compliance expectations and the transition timeline.