Policy Briefing — UK Children’s Code Finalised
The UK Information Commissioner’s Office published the final Age Appropriate Design Code, setting statutory standards for online services accessed by children ahead of the September 2021 enforcement deadline.
Executive briefing: The UK Information Commissioner’s Office (ICO) confirmed on 12 August 2020 that the Age Appropriate Design Code (Children’s Code) would come into force on 2 September 2020, with a 12-month transition period before enforcement from 2 September 2021. The statutory code applies to information society services likely to be accessed by children, including apps, online games, educational platforms, and connected devices. Organizations must embed child-centric privacy by design across governance, product development, and data operations to avoid regulatory sanctions and protect young users.
Understand the code’s scope and legal interplay
The code elaborates on GDPR and UK Data Protection Act 2018 obligations for services that process personal data of children under 18. It applies regardless of whether children are the intended audience; likelihood of access triggers coverage. Key principles include the best interests of the child, data minimization, transparency, avoidance of detrimental uses of data, and robust geolocation and profiling controls. The ICO can take enforcement action—including fines aligned with GDPR thresholds—against non-compliant organizations.
Cross-border providers must coordinate compliance with EU guidance (such as the Article 29 Working Party’s opinion on children’s data), the U.S. Children’s Online Privacy Protection Act (COPPA), and upcoming global youth privacy laws. Establish a regulatory matrix that maps overlapping requirements to streamline control design.
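A regulatory matrix of this kind can start as a simple mapping from internal controls to the overlapping regimes each one helps satisfy. The sketch below is a minimal illustration: the control names and regime labels are assumptions for discussion, not ICO terminology, and a production matrix would track citations to specific provisions (Children’s Code standard numbers, COPPA sections, GDPR articles).

```python
from dataclasses import dataclass, field

# Illustrative regime labels (assumptions, not official identifiers).
REGIMES = {"UK_CHILDRENS_CODE", "EU_GDPR", "US_COPPA"}

@dataclass
class Control:
    """An internal privacy control and the regimes it helps satisfy."""
    name: str
    regimes: set = field(default_factory=set)

def coverage_gaps(controls, required=REGIMES):
    """Return regimes with no mapped control -- candidates for new work."""
    covered = set().union(*(c.regimes for c in controls)) if controls else set()
    return required - covered

controls = [
    Control("age-assurance-gate", {"UK_CHILDRENS_CODE", "US_COPPA"}),
]
print(sorted(coverage_gaps(controls)))  # -> ['EU_GDPR']
```

Keeping the matrix in code (or a spreadsheet exported to one) makes gap reports repeatable as new regimes are added.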
Conduct gap assessments against the 15 standards
The code sets 15 design standards ranging from data minimization to connected toy controls. Perform a structured gap analysis covering each standard:
- Best interests of the child: Document how product decisions prioritize safety and wellbeing, including ethics reviews and stakeholder consultations.
- Data protection impact assessments (DPIAs): Create child-specific DPIA templates that evaluate unintended consequences, profiling risks, and manipulative design patterns.
- Age-appropriate application: Implement age assurance methods that balance proportionality, such as self-declaration backed by behavioral signals or third-party verification for high-risk services.
- Transparency: Redesign privacy notices with layered, bite-sized explanations, using visuals and plain language tailored to developmental stages.
- Detrimental use of data: Inventory features that nudge excessive engagement or monetize vulnerabilities, and rework incentives to support healthy usage.
- Parental controls and profiling: Offer granular controls, clear indicators when monitoring occurs, and options for children to seek support.
Capture remediation tasks, owners, budgets, and milestones in a program management tool. Prioritize high-risk standards—such as profiling, geolocation, and nudge techniques—that the ICO has highlighted during enforcement briefings.
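The prioritization rule described above can be sketched as a small backlog sorter. The standard names, owners, and risk set below are illustrative assumptions; the only substantive input from the code itself is that profiling, geolocation, and nudge techniques are treated as high-risk.

```python
from dataclasses import dataclass

# High-risk standards the ICO has highlighted; they sort first here.
HIGH_RISK = {"profiling", "geolocation", "nudge techniques"}

@dataclass
class RemediationTask:
    standard: str      # one of the 15 Children's Code standards
    owner: str
    due_quarter: str   # e.g. "2021-Q2"
    done: bool = False

def prioritized_backlog(tasks):
    """Open tasks first by risk tier, then by due date."""
    open_tasks = [t for t in tasks if not t.done]
    return sorted(open_tasks,
                  key=lambda t: (t.standard not in HIGH_RISK, t.due_quarter))

tasks = [
    RemediationTask("transparency", "UX lead", "2021-Q1"),
    RemediationTask("profiling", "Data science lead", "2021-Q2"),
    RemediationTask("default settings", "Product lead", "2021-Q1", done=True),
]
print([t.standard for t in prioritized_backlog(tasks)])
# -> ['profiling', 'transparency']  (risk tier outranks due date)
```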
Embed governance and accountability
Establish a cross-functional Children’s Code steering committee comprising privacy, product, engineering, policy, customer support, and child-safety experts. Assign executive sponsorship to ensure resources and decision-making authority. Update accountability frameworks to include board reporting on child privacy metrics, audit trails for design decisions, and regulator engagement protocols.
Update policies—privacy, acceptable use, content moderation—to reference the code’s standards explicitly. Ensure your Data Protection Officer maintains visibility into product roadmaps and marketing campaigns targeting young audiences. Where the organization operates globally, align governance with local representatives or lead supervisory authorities to streamline cross-jurisdictional coordination.
Redesign product experiences with child-centric privacy
Audit user journeys to identify areas where default settings expose children to unnecessary data collection or sharing. Implement privacy by default: disable location tracking, profile-based recommendations, and targeted advertising unless robust safeguards exist. Provide contextual prompts that explain why data is requested and how it will be used, with easy opt-out mechanisms.
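Privacy by default can be made concrete in configuration: every data-hungry feature starts off and can only be enabled through an explicit opt-in. The field names below are assumptions, not a published schema; the pattern is what matters.

```python
# Illustrative high-privacy defaults for a child account (assumed names).
CHILD_DEFAULTS = {
    "location_tracking": False,
    "personalised_recommendations": False,
    "targeted_advertising": False,
    "profile_visibility": "private",
}

def apply_defaults(requested: dict) -> dict:
    """Merge requested settings over defaults, but never let a request
    silently enable a feature without a matching explicit opt-in flag."""
    settings = dict(CHILD_DEFAULTS)
    for key, value in requested.items():
        if key in settings and requested.get(f"{key}_opt_in"):
            settings[key] = value
    return settings

# A bare request is ignored; location tracking stays off.
print(apply_defaults({"location_tracking": True})["location_tracking"])
# -> False
```

Pairing the opt-in flag with a contextual prompt at the moment of the request satisfies both the default-off posture and the transparency expectation.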
Collaborate with UX researchers and child development specialists to test content comprehension. Use focus groups or co-design workshops with parents and young users to validate interfaces. Ensure dark patterns—such as deceptive consent flows, forced data sharing for access, or misleading gamification—are removed. Integrate well-being features, including usage dashboards, break reminders, and mental health resources, where appropriate.
Strengthen age assurance and parental engagement
Develop a risk-based age assurance strategy. Low-risk services may rely on self-declaration with routine back-end monitoring for anomalies (e.g., adult-like purchasing patterns). Higher-risk contexts, such as social platforms or monetized games, may require third-party verification, document checks, or school-issued credentials. Document the rationale for chosen methods, addressing privacy, accessibility, and discrimination considerations.
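A risk-tiered strategy like this can be expressed as a small decision function. The scoring rule, tier thresholds, and method names below are assumptions for illustration, not ICO-prescribed values; real programs would derive the tiers from DPIA findings.

```python
# Assumed mapping from risk tier to an age-assurance method.
ASSURANCE_BY_TIER = {
    "low": "self_declaration_with_monitoring",
    "medium": "email_or_account_signals",
    "high": "third_party_verification",
}

def risk_tier(has_social_features: bool, has_monetisation: bool,
              processes_geolocation: bool) -> str:
    """Crude scoring sketch: each high-risk trait bumps the tier."""
    score = sum([has_social_features, has_monetisation, processes_geolocation])
    return "high" if score >= 2 else "medium" if score == 1 else "low"

def assurance_method(**traits) -> str:
    return ASSURANCE_BY_TIER[risk_tier(**traits)]

print(assurance_method(has_social_features=True, has_monetisation=True,
                       processes_geolocation=False))
# -> third_party_verification
```

Codifying the rule also produces the documented rationale the text calls for: the inputs and thresholds are auditable.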
Design parental dashboards that provide oversight without undermining children’s rights to privacy and freedom of expression. Offer controls for content filters, communication settings, and purchase limits. Clearly explain when parental monitoring is active and provide children with education resources about online safety. Implement escalation channels for reporting abuse or seeking support, and integrate them with trust and safety operations.
Implement data minimization and retention controls
Catalog all personal data fields collected from or about children, tagging them by sensitivity and usage purpose. Remove non-essential fields, especially those used for personalization or marketing. Automate data deletion once purposes are fulfilled, with retention schedules validated by legal counsel. Where analytics are necessary, leverage aggregation, differential privacy, or on-device processing to reduce exposure.
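Automated deletion against a retention schedule can be sketched as below. The purposes and retention periods are hypothetical; as the text notes, real periods must be validated by legal counsel.

```python
from datetime import date, timedelta

# Hypothetical retention schedule keyed by processing purpose.
RETENTION_DAYS = {
    "account_operation": 365,
    "support_ticket": 90,
    "analytics": 30,
}

def due_for_deletion(records, today: date):
    """Yield record IDs whose retention period has lapsed.

    Each record is (record_id, purpose, collected_on)."""
    for record_id, purpose, collected_on in records:
        limit = RETENTION_DAYS.get(purpose, 0)  # unknown purpose: delete now
        if today - collected_on > timedelta(days=limit):
            yield record_id

records = [
    ("r1", "analytics", date(2021, 1, 1)),
    ("r2", "account_operation", date(2021, 1, 1)),
]
print(list(due_for_deletion(records, date(2021, 3, 1))))  # -> ['r1']
```

Defaulting unknown purposes to immediate deletion is a deliberately conservative choice: data without a documented purpose fails the minimization test anyway.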
Review data sharing arrangements with third parties—including ad networks, analytics providers, and content partners—to ensure they meet Children’s Code standards. Execute data processing agreements that restrict downstream use, prohibit profiling, and mandate prompt breach notification. Conduct vendor audits focusing on encryption practices, access controls, and incident response capabilities.
Upgrade security and incident response
The code emphasizes strong security tailored to children’s data sensitivity. Implement multi-layered defenses: encryption in transit and at rest, fine-grained access controls, and anomaly detection for account takeover attempts. Train security teams to identify grooming or exploitation patterns in behavioral data. Integrate child-safety considerations into incident response playbooks, ensuring coordination with law enforcement and child protection agencies when necessary.
Run tabletop exercises simulating data breaches involving children, unauthorized contact attempts, or content moderation failures. Document lessons learned and update controls. Provide transparency reports summarizing incidents, remediation actions, and safety improvements.
Communicate transparently and build trust
Create communication plans for children, parents, educators, and regulators. Publish plain-language summaries of privacy practices, safety features, and reporting channels. Offer in-product education modules that teach children about digital literacy, privacy controls, and responsible sharing. Engage with schools and community organizations to gather feedback and demonstrate commitment to safeguarding.
Establish a regulator liaison function to maintain ongoing dialogue with the ICO. Notify the ICO proactively about significant product changes affecting children, share DPIA outcomes when requested, and respond promptly to inquiries. Participation in industry codes of practice—such as the UK’s voluntary standards for video-sharing platforms—can reinforce credibility.
Monitor enforcement trends and global developments
Track ICO enforcement actions, guidance updates, and case studies released through the Children’s Code hub. Analyze penalties issued under GDPR for child privacy violations to benchmark risk exposure. Monitor international developments, including Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing, California’s Age-Appropriate Design Code Act (2022), and Australian eSafety initiatives. Harmonize controls across jurisdictions to minimize duplication.
Review legal challenges or legislative amendments that could adjust the code’s scope. Maintain a legal watchlist summarizing consultations, parliamentary debates, and court decisions influencing youth privacy norms. Update stakeholders quarterly on emerging trends, technology impacts, and required program adjustments.
Action checklist for the next 90 days
- Complete a gap assessment against all 15 Children’s Code standards, prioritizing remediation for high-risk features and data flows.
- Stand up a cross-functional steering committee with executive sponsorship to oversee implementation, budget allocation, and regulator engagement.
- Redesign privacy notices, consent flows, and default settings with child-tested language and opt-out pathways.
- Deploy age assurance controls proportionate to service risk, documenting DPIA findings and technical safeguards.
- Audit data sharing agreements and vendor controls to ensure downstream partners uphold Children’s Code requirements.
Zeph Tech supports digital product teams with child privacy assessments, design playbooks, and governance frameworks that operationalize the UK Age Appropriate Design Code while fostering safe, trusted experiences for young users.
Follow-up: The code became enforceable in September 2021 and underpinned 2023 ICO enforcement actions, including the £12.7 million penalty issued to TikTok for misusing children’s data and the preliminary enforcement notice served on Snap over the risk assessment of its “My AI” chatbot for UK teenage users.
Sources
- Age Appropriate Design: a code of practice for online services — Information Commissioner’s Office; The ICO set out 15 enforceable standards for online services that process children’s data under the UK Data Protection Act 2018.
- ICO publishes final version of the Age Appropriate Design Code — Information Commissioner’s Office; The ICO outlined the transition timetable and governance expectations for companies building services used by children.
- Children’s code now in force — Information Commissioner’s Office; The ICO marked the end of the 12-month transition and confirmed active supervision of the Children’s Code from 2 Sep 2021.