
UAE’s Child Digital Safety Law: What Every Digital Platform and ISP Should Know

January 9, 2026
The law has extraterritorial reach over digital platforms and internet service providers that operate in, or target users in, the UAE.

The United Arab Emirates (UAE) has enacted Federal Decree‑Law No. 26 of 2025 on Child Digital Safety (the CDS Federal Law), establishing a comprehensive framework to protect children online with extraterritorial reach over digital platforms and internet service providers that operate in, or target users in, the UAE. The CDS Federal Law took effect on January 1, 2026, and covered entities have up to one year from that date to comply, unless the Cabinet extends the deadline.

Since the CDS Federal Law delegates key operational details to administrative regulations, several elements, including platform classification and penalties, are intended to be finalized through Cabinet decisions. 

Scope

The CDS Federal Law applies to all digital platforms and internet service providers (ISPs) that operate within the UAE or target users in the UAE, whether in the public or private sector and regardless of any legal presence in the country. Covered “digital platforms” include websites, electronic search engines, smart applications and messaging apps, forums, electronic games platforms, social media platforms, live‑streaming platforms, podcast platforms, streaming services and on‑demand online video content platforms, and e‑commerce platforms. Because many obligations hinge on who qualifies as a child, the law defines a “child” as any person under 18. 

Platform Classification

A Cabinet-issued, risk-based classification will define digital platform obligations, including age restrictions, age verification methods, child protection measures, and compliance verification procedures. Non-compliance may lead to blocking, closure, or other administrative actions under the Administrative Penalties Regulation. The Cabinet will specify penalty-imposing entities, enforcement mechanisms, and appeal procedures.

Platform Obligations

All digital platforms must implement enhanced child protection measures according to their Cabinet classification. These include default high-privacy settings for children’s accounts, age restriction tools with appropriate verification, and activation of content-blocking and filtering tools. Platforms must also provide privacy features suitable for children’s age groups, robust parental controls, and measures to raise awareness of the risks of excessive use. User-friendly tools for reporting harmful content, along with AI systems for proactive detection and removal, are required. Platforms must immediately report child pornography or harmful content involving children to the relevant entities, execute removal orders, and submit periodic compliance reports under the CDS Federal Law.
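To make the defaults-based measures concrete, the requirement that children’s accounts start at high privacy can be sketched in code. This is a minimal illustrative sketch only, not drawn from the law or any real platform SDK; the `AccountSettings` model, its field names, and the specific defaults shown are assumptions for illustration (the law itself leaves the details to Cabinet classification).

```python
from dataclasses import dataclass

# The CDS Federal Law defines a "child" as any person under 18.
CHILD_AGE_LIMIT = 18


@dataclass
class AccountSettings:
    # Hypothetical settings model for illustration only
    profile_public: bool
    messages_from_strangers: bool
    content_filtering: bool
    parental_controls_enabled: bool


def default_settings(age: int) -> AccountSettings:
    """Return account defaults; children's accounts start at high privacy."""
    if age < CHILD_AGE_LIMIT:
        # High-privacy defaults for children's accounts: private profile,
        # no unsolicited contact, filtering and parental controls on.
        return AccountSettings(
            profile_public=False,
            messages_from_strangers=False,
            content_filtering=True,
            parental_controls_enabled=True,
        )
    # Adult accounts may start with more permissive defaults.
    return AccountSettings(
        profile_public=True,
        messages_from_strangers=True,
        content_filtering=False,
        parental_controls_enabled=False,
    )
```

The design point is that a child’s account never has to opt in to protection; protective settings are the starting state, and any relaxation would flow through parental controls.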

ISP Obligations

The Telecommunications and Digital Government Regulatory Authority (TDRA) will establish policies and standards for ISPs to ensure children’s digital safety. ISPs must activate network-level content filtering, ensure safe use for children by linking services to parental controls, and provide guidance tools for monitoring digital content. They must also report any child pornography or harmful content involving children to the relevant entities and provide necessary information. TDRA will review ISP policies and monitor compliance with these obligations. 

Online Commercial Games and Betting

Digital platforms must prevent children from accessing or participating in online commercial games, including gambling and wagering, whether directly or indirectly. This includes restrictions via advertising, promotion, or use of personal data. Platforms and ISPs must implement technical and administrative measures, such as age verification, parental controls, and content blocking, to enforce this ban. 

Data Protection

Platforms may not collect, process, publish, or share personal data of children under 13 without explicit, documented parental consent, a simple and always-available way to withdraw consent, and clear privacy policy disclosures to the child and caregiver. Internal access must be restricted to authorized personnel only, with no commercial use, targeted ads, or tracking beyond the authorized purpose. Additional controls may be determined by Cabinet decisions under the CDS Federal Law. The Cabinet will specify permissible data categories and consent verification methods. These obligations must align with the UAE’s personal data protection laws, including Federal Decree-Law No. 45 of 2021 (PDPL). While the PDPL’s Executive Regulations remain pending, limiting the PDPL’s enforceability, the CDS Federal Law provides immediate privacy protections for children under 13, to be implemented within the one-year compliance period that began on January 1, 2026.
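The under-13 consent regime combines three mechanics: documented consent before processing, always-available withdrawal, and a default of no processing. A minimal sketch of that gating logic follows; the `ConsentRegistry` class and its method names are hypothetical, and a real implementation would also have to satisfy whatever consent-verification methods the Cabinet specifies.

```python
from dataclasses import dataclass


# Under-13 processing requires explicit, documented parental consent.
PARENTAL_CONSENT_AGE = 13


@dataclass
class ConsentRecord:
    granted: bool = False
    withdrawn: bool = False


class ConsentRegistry:
    """Track documented parental consent, with always-available withdrawal."""

    def __init__(self) -> None:
        self._records: dict[str, ConsentRecord] = {}

    def grant(self, child_id: str) -> None:
        # Recording the grant is what makes the consent "documented".
        self._records[child_id] = ConsentRecord(granted=True)

    def withdraw(self, child_id: str) -> None:
        # Withdrawal must be simple and always available, so it works
        # even if no grant was ever recorded.
        rec = self._records.setdefault(child_id, ConsentRecord())
        rec.withdrawn = True

    def may_process(self, child_id: str, age: int) -> bool:
        """Processing is allowed only with an active, unwithdrawn consent."""
        if age >= PARENTAL_CONSENT_AGE:
            # The under-13 consent rule does not apply (other duties may).
            return True
        rec = self._records.get(child_id)
        return rec is not None and rec.granted and not rec.withdrawn
```

Note the default: with no consent record at all, `may_process` returns `False` for an under-13 user, mirroring the law’s prohibition on processing absent documented consent.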

Compliance

Compliance with the CDS Federal Law will be monitored by UAE authorities, including those responsible for child affairs, media, and cybersecurity. The CDS Federal Law also establishes the Child Digital Safety Council (the Council), chaired by the Minister of Family, to coordinate efforts among federal, local, and private entities. The Council is empowered to propose strategic policies and initiatives to enhance child safety in the digital environment and to ensure alignment with international standards and comprehensive protection from digital risks.

Next Steps for Platforms and ISPs

Companies with any UAE user base — especially those offering apps, social media, streaming, gaming, e-commerce, or live-streaming — should consider taking the following steps:

  • Map product surfaces: Identify and document all product features and services accessed by minors to assess compliance needs.
  • Assess jurisdictional reach: Determine whether operations “operate in” or “target users in” the UAE to understand the scope of obligations under the CDS Federal Law.
  • Develop compliance policies: Create comprehensive policies that address child digital safety, including privacy settings, age verification, and content monitoring.
  • Implement monitoring tools: Deploy tools and systems for proactive monitoring and management of harmful content, leveraging AI and machine learning where applicable.
  • Enhance privacy and safety features: Ensure privacy settings are set to high by default for children’s accounts and implement robust parental control tools.
  • Prepare for TDRA oversight: ISPs should ensure network-level content filtering and parental-control offerings are in place to meet regulatory standards.
  • Stay informed on regulatory updates: Monitor for further instructions and regulations issued by the Cabinet to ensure ongoing compliance.

Endnotes

This publication is produced by Latham &amp; Watkins as a news reporting service to clients and other friends. The information contained in this publication should not be construed as legal advice. Should further analysis or explanation of the subject matter be required, please contact the lawyer with whom you normally consult. The invitation to contact is not a solicitation for legal work under the laws of any jurisdiction in which Latham lawyers are not authorized to practice. See our Attorney Advertising and Terms of Use.