Policy Briefing — UK Children’s Code Finalised
Comprehensive implementation roadmap for the UK ICO’s Age Appropriate Design Code, outlining governance, product design, data minimization, and international alignment steps for digital services likely to be accessed by children.
Executive briefing: The UK Information Commissioner’s Office (ICO) issued the Age Appropriate Design Code (Children’s Code) on 12 August 2020; it came into force on 2 September 2020 with a 12-month transition period and became fully enforceable on 2 September 2021. The statutory code, mandated by Section 123 of the Data Protection Act 2018, applies to information society services likely to be accessed by children, including apps, online games, education technology, media streaming, search engines, and connected devices. Organizations must embed child-centric privacy by design across governance, product development, data operations, and supplier oversight to avoid regulatory sanctions and protect young users.
Applicability and legal interplay
The Children’s Code translates UK GDPR and Data Protection Act 2018 principles into 15 design standards for services that process children’s personal data. Coverage is triggered when children under 18 are likely to access a service, regardless of intended audience. The code applies to UK-based providers and to overseas services that monitor behaviour or offer goods or services to UK users. Key concepts include best interests of the child, data minimization, transparency, avoidance of detrimental use of data, and strong controls over geolocation, profiling, and nudge techniques.
Assess whether your service offers user-to-user interaction, personalization, advertising, or location-aware features that materially affect young users. High-risk functions—such as open messaging, algorithmic recommendations, live streaming, and behavioural ads—require heightened safeguards or deactivation for under-18 audiences. Map overlaps with EU guidance on children’s data, the Irish Data Protection Commission’s Fundamentals, the California Age-Appropriate Design Code Act, and sector-specific rules (e.g., Ofcom video-sharing platform requirements) to ensure harmonized obligations for multinational products.
Document applicability decisions, including evidence used to determine whether children are likely to access the service (traffic analytics, user surveys, marketing channels, app store ratings). Where adults form the primary audience but incidental child access is plausible, adopt proportional mitigations such as conservative defaults, limited data capture, and age-gating for higher-risk functions.
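To keep these determinations auditable, teams often capture them in a structured record. The following minimal Python sketch assumes a hypothetical `ApplicabilityAssessment` structure; the field names and example values are illustrative, not an ICO-prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ApplicabilityAssessment:
    """Audit record for a 'likely to be accessed by children' decision."""
    service_name: str
    assessed_on: date
    likely_child_access: bool
    evidence: list[str] = field(default_factory=list)      # analytics, surveys, ratings
    mitigations: list[str] = field(default_factory=list)   # proportional safeguards

# Hypothetical example: an adult-oriented service with plausible child access.
assessment = ApplicabilityAssessment(
    service_name="example-streaming-app",
    assessed_on=date(2021, 9, 2),
    likely_child_access=True,
    evidence=["app store rating 9+", "survey: 13% of respondents under 18"],
    mitigations=["geolocation off by default", "age gate on live chat"],
)
```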
Enforcement timeline and regulator expectations
The final code completed its parliamentary passage and was issued in August 2020, and the ICO commenced supervision after the transition period ended on 2 September 2021. Since the code came into force, the ICO has published compliance FAQs, industry blogs, and case studies, and it has already pursued enforcement where design choices endangered children’s privacy. The £12.7 million TikTok penalty in April 2023 cited unlawful processing of data belonging to users under 13 and inadequate age assurance, underscoring the regulator’s willingness to apply UK GDPR-level sanctions and mandatory remedial steps.
Expect active supervision of features the ICO has repeatedly flagged: profiling for targeted ads, detrimental nudge techniques, default public profiles, unchecked geolocation, and opaque privacy notices. Organizations should maintain a regulator engagement plan to provide DPIAs, design documentation, and testing evidence on request. Regularly review ICO guidance updates and public statements to align roadmaps with emerging enforcement priorities.
Default settings and privacy by design
Audit user journeys to identify where default configurations expose children to unnecessary data collection or sharing. Implement privacy by default: disable location tracking, profile-based recommendations, and targeted advertising unless robust safeguards exist and a compelling legitimate purpose is documented. Avoid friction-filled opt-outs; children and parents should be able to change settings quickly without manipulative design patterns.
Provide layered, developmentally appropriate explanations at the point of data collection. Use icons, plain language, and concise prompts to clarify why data is requested, how long it is kept, and whether it is shared with third parties. Offer dashboards for parents and teens that summarize active settings, recent consent decisions, and available controls. Where services blend adult and youth audiences, set the most protective defaults by age tier and avoid confusing or lengthy configuration flows.
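As an illustration of privacy by default, the sketch below maps age tiers to protective defaults. It is a simplified model under stated assumptions: the `PrivacyDefaults` structure and the single under-18 tier are hypothetical, and a production service would align tiers with the developmental age ranges in the code and its own risk assessment.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyDefaults:
    geolocation: bool
    personalised_recommendations: bool
    targeted_advertising: bool
    public_profile: bool

def defaults_for_age(age: int) -> PrivacyDefaults:
    """Return the most protective defaults for a user's age tier."""
    if age < 18:
        # High-risk features stay off for all under-18 tiers.
        return PrivacyDefaults(
            geolocation=False,
            personalised_recommendations=False,
            targeted_advertising=False,
            public_profile=False,
        )
    # Adults also start from protective defaults but can opt in
    # through explicit, non-nudged settings flows.
    return PrivacyDefaults(
        geolocation=False,
        personalised_recommendations=True,
        targeted_advertising=False,
        public_profile=True,
    )
```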
Conduct gap assessments against the 15 standards
Use the ICO’s 15 standards as the backbone of a structured gap analysis covering governance, data practices, and user experience:
- Best interests of the child: Document how product decisions prioritize safety and wellbeing, including ethics reviews and stakeholder consultations.
- Data protection impact assessments (DPIAs): Create child-specific DPIA templates that evaluate unintended consequences, profiling risks, and manipulative design patterns.
- Age-appropriate application: Implement age assurance methods that balance proportionality, such as self-declaration backed by behavioural signals or third-party verification for high-risk services.
- Transparency: Redesign privacy notices with layered, bite-sized explanations, using visuals and plain language tailored to developmental stages.
- Detrimental use of data: Inventory features that nudge excessive engagement or monetize vulnerabilities, and rework incentives to support healthy usage.
- Parental controls: Offer granular controls, give children age-appropriate information about any monitoring, display a clear indicator when monitoring is active, and provide options for children to seek support.
- Profiling: Switch profiling off by default unless there is a compelling reason, and put safeguards in place to protect children from its harmful effects.
Capture remediation tasks, owners, budgets, and milestones in a program management tool. Prioritize high-risk standards—such as profiling, geolocation, connected devices, and nudge techniques—that the ICO has highlighted during enforcement briefings.
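A lightweight way to capture that backlog is a structured task record sorted by risk. The sketch below is illustrative only: the `RemediationTask` fields, example tasks, and budgets are assumptions, not prescribed by the ICO.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Risk(Enum):
    HIGH = "high"      # e.g. profiling, geolocation, nudge techniques
    MEDIUM = "medium"
    LOW = "low"

@dataclass
class RemediationTask:
    standard: str      # which of the 15 standards the task addresses
    description: str
    owner: str
    budget_gbp: int
    milestone: date
    risk: Risk

backlog = [
    RemediationTask("Profiling", "Switch off profile-based feeds by default",
                    "Recommendations team", 40_000, date(2021, 6, 30), Risk.HIGH),
    RemediationTask("Transparency", "Layered privacy notice for the 10-12 age band",
                    "Design team", 15_000, date(2021, 7, 31), Risk.MEDIUM),
]

# Surface high-risk items first, then by nearest milestone.
backlog.sort(key=lambda t: (t.risk is not Risk.HIGH, t.milestone))
```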
Embed governance and accountability
Establish a cross-functional Children’s Code steering committee comprising privacy, product, engineering, policy, customer support, and child-safety experts. Assign executive sponsorship to ensure resources and decision-making authority. Update accountability frameworks to include board reporting on child privacy metrics, audit trails for design decisions, regulator engagement protocols, and escalation paths when products introduce new high-risk features.
Update policies—privacy, acceptable use, content moderation—to reference the code’s standards explicitly. Ensure your Data Protection Officer maintains visibility into product roadmaps and marketing campaigns targeting young audiences. Where the organization operates globally, align governance with local representatives or lead supervisory authorities to streamline cross-jurisdictional coordination and avoid inconsistent messaging.
Strengthen age assurance and parental engagement
Develop a risk-based age assurance strategy. Low-risk services may rely on self-declaration with routine back-end monitoring for anomalies (e.g., adult-like purchasing patterns). Higher-risk contexts—such as social platforms, monetized games, or services enabling contact between users—may require third-party verification, document checks, or school-issued credentials. Document the rationale for chosen methods, addressing privacy, accessibility, equality, and discrimination considerations.
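One way to make the risk-based strategy explicit is a mapping from service risk tier to proportionate assurance methods, as in this sketch. The tiers and method names are illustrative assumptions drawn from the examples above, not a regulatory taxonomy.

```python
from enum import Enum

class ServiceRisk(Enum):
    LOW = "low"        # no user-to-user contact, minimal data capture
    MEDIUM = "medium"  # personalisation or in-app purchases
    HIGH = "high"      # open messaging, live streaming, monetised games

def age_assurance_methods(risk: ServiceRisk) -> list[str]:
    """Map a service's risk tier to proportionate assurance methods."""
    if risk is ServiceRisk.LOW:
        return ["self-declaration", "back-end anomaly monitoring"]
    if risk is ServiceRisk.MEDIUM:
        return ["self-declaration", "behavioural age-estimation signals"]
    return ["third-party verification", "document checks",
            "school-issued credentials"]
```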
Design parental dashboards that provide oversight without undermining children’s rights to privacy and freedom of expression. Offer controls for content filters, communication settings, screen-time alerts, and purchase limits. Clearly explain when parental monitoring is active and provide children with education resources about online safety. Implement escalation channels for reporting abuse or seeking support, and integrate them with trust and safety operations.
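The sketch below models one such dashboard, including the child-facing indicators shown whenever oversight is active. The `ParentalControls` structure and indicator wording are hypothetical examples, not mandated formats.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParentalControls:
    content_filter_level: str            # "strict" | "moderate" | "off"
    screen_time_alert_minutes: Optional[int]
    purchase_limit_gbp: Optional[float]
    activity_monitoring: bool

def child_visible_indicators(controls: ParentalControls) -> list[str]:
    """Notices a child should see whenever oversight is active."""
    indicators = []
    if controls.activity_monitoring:
        indicators.append("A parent or guardian can see your activity.")
    if controls.content_filter_level != "off":
        indicators.append("Some content is filtered on this account.")
    return indicators
```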
Implement data minimization and retention controls
Catalog all personal data fields collected from or about children, tagging them by sensitivity and usage purpose. Remove non-essential fields, especially those used for personalization or marketing. Automate data deletion once purposes are fulfilled, with retention schedules validated by legal counsel. Where analytics are necessary, leverage aggregation, differential privacy, or on-device processing to reduce exposure.
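A retention sweep can be automated once every catalogued field carries a documented purpose and retention period. This minimal sketch assumes a hypothetical `DataField` catalog entry; a real deletion job would also handle backups, logs, and downstream copies.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataField:
    name: str
    sensitivity: str        # "high" | "medium" | "low"
    purpose: str
    retention: timedelta    # schedule validated by legal counsel
    collected_at: datetime  # stored as a UTC-aware timestamp

def retention_sweep(catalog: list[DataField]) -> list[DataField]:
    """Return catalog entries that have outlived their documented purpose."""
    now = datetime.now(timezone.utc)
    return [f for f in catalog if now - f.collected_at > f.retention]
```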
Review data sharing arrangements with third parties—including ad networks, analytics providers, and content partners—to ensure they meet Children’s Code standards. Execute data processing agreements that restrict downstream use, prohibit profiling, and mandate prompt breach notification. Conduct vendor audits focusing on encryption practices, access controls, and incident response capabilities.
Upgrade security and incident response
The code emphasizes strong security tailored to children’s data sensitivity. Implement multi-layered defenses: encryption in transit and at rest, fine-grained access controls, and anomaly detection for account takeover attempts. Train security teams to identify grooming or exploitation patterns in behavioural data. Integrate child-safety considerations into incident response playbooks, ensuring coordination with law enforcement and child protection agencies when necessary.
Run tabletop exercises simulating data breaches involving children, unauthorized contact attempts, or content moderation failures. Document lessons learned and update controls. Provide transparency reports summarizing incidents, remediation actions, and safety improvements. Track time-to-detect and time-to-remediate metrics specific to youth-related incidents.
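Those metrics are straightforward to derive once incidents carry consistent timestamps. A minimal sketch, assuming a hypothetical `YouthIncident` record with UTC timestamps:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class YouthIncident:
    incident_id: str
    occurred_at: datetime
    detected_at: datetime
    remediated_at: datetime

    @property
    def time_to_detect(self) -> timedelta:
        return self.detected_at - self.occurred_at

    @property
    def time_to_remediate(self) -> timedelta:
        return self.remediated_at - self.detected_at

def mean_duration(durations: list[timedelta]) -> timedelta:
    """Average a non-empty list of durations for quarterly reporting."""
    return sum(durations, timedelta()) / len(durations)
```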
Communicate transparently and build trust
Create communication plans for children, parents, educators, and regulators. Publish plain-language summaries of privacy practices, safety features, and reporting channels. Offer in-product education modules that teach children about digital literacy, privacy controls, and responsible sharing. Engage with schools and community organizations to gather feedback and demonstrate commitment to safeguarding.
Establish a regulator liaison function to maintain ongoing dialogue with the ICO. Notify the ICO proactively about significant product changes affecting children, share DPIA outcomes when requested, and respond promptly to inquiries. Participation in industry codes of practice—such as the UK’s voluntary standards for video-sharing platforms—can reinforce credibility.
Monitor enforcement trends and global developments
Track ICO enforcement actions, guidance updates, and case studies released through the Children’s Code hub. Analyze penalties issued under GDPR for child privacy violations to benchmark risk exposure. Monitor international developments, including Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing, California’s Age-Appropriate Design Code Act (2022), and Australian eSafety initiatives. Harmonize controls across jurisdictions to minimize duplication and reduce engineering overhead.
Review legal challenges or legislative amendments that could adjust the code’s scope. Maintain a legal watchlist summarizing consultations, parliamentary debates, and court decisions influencing youth privacy norms. Update stakeholders quarterly on emerging trends, technology impacts, and required program adjustments.
Action checklist for the next 90 days
- Complete a gap assessment against all 15 Children’s Code standards, prioritizing remediation for high-risk features and data flows.
- Stand up a cross-functional steering committee with executive sponsorship to oversee implementation, budget allocation, and regulator engagement.
- Redesign privacy notices, consent flows, and default settings with child-tested language, protective defaults, and opt-out pathways.
- Deploy age assurance controls proportionate to service risk, documenting DPIA findings and technical safeguards.
- Audit data sharing agreements and vendor controls to ensure downstream partners uphold Children’s Code requirements.
Zeph Tech supports digital product teams with child privacy assessments, design playbooks, and governance frameworks that operationalize the UK Age Appropriate Design Code while fostering safe, trusted experiences for young users.
Follow-up: The code became enforceable in September 2021 and underpinned 2023 ICO enforcement actions, including the £12.7 million TikTok penalty and a preliminary enforcement notice to Snap over the privacy risk assessment of its “My AI” chatbot for UK teenage users.
Sources
- Age Appropriate Design: a code of practice for online services — Information Commissioner’s Office; the statutory code issued under Section 123 of the Data Protection Act 2018 sets 15 enforceable standards for online services processing children’s data.
- Data Protection Act 2018, Section 123 — UK Parliament; establishes the duty on the ICO to produce the Age Appropriate Design Code and confirms the code’s legal status.
- ICO fines TikTok £12.7m — Information Commissioner’s Office; outlines enforcement action for processing children’s data without adequate consent or age assurance.