III. Enabling the exercise of children’s privacy rights

A. Scope of Application

The CBA Sections submit that Canada requires a modern, enforceable legislative framework for children’s privacy that applies consistently across jurisdictions and sectors. Such a framework must go beyond PIPEDA guidance; align with international standards such as the United Nations Convention on the Rights of the Child (UNCRC)1, the UK Age Appropriate Design Code2, and European Union (EU) guidelines; and ensure coordinated implementation through cross‑government, industry, and human rights oversight mechanisms to fully realize children’s rights in the digital environment.

In developing its initial guidance, pending more formal legislative amendments empowering enforcement action, the OPC should consider the following details relating to the scope of application:

  1. Advocate for a legislative framework for the protection of children’s privacy in Canada. Bill C-27 (the Digital Charter Implementation Act) and Bill C-63 (the Online Harms Act) died on the Order Paper with the prorogation of Parliament in January 2025, and it appears neither will be reintroduced. The legislative framework must go beyond addressing online harms to the broader scope of age-appropriate design and intentionally designing services with children in mind. For example:
    1. Article 16 of the UNCRC, which sets out the privacy rights of children globally, must be interpreted and applied alongside Article 17. Article 17 sets out children’s information rights, and particularly their right to access information, “aimed at the promotion of [the child’s] social, spiritual and moral wellbeing and physical and mental health.” It further directs State Parties to the Convention to “encourage the development of appropriate guidelines for the protection of the child from information and material injurious to his or her well-being, bearing in mind the provisions of articles 13 and 18.”
    2. The UK’s Age-Appropriate Design Code, also known as the Children’s Code, is a statutory code of practice created under Section 123 of the Data Protection Act 20183, as amended to implement UK General Data Protection Regulation (GDPR)4 provisions for children’s data privacy. It is widely heralded as a global best practice. The protections for children’s privacy must meet or exceed those required by the UK standard and must be supported by legislation that allows for robust and effective enforcement of these laws.
    3. A further relevant benchmark is the recently published guidelines of the European Commission on the protection of minors under the Digital Services Act.5 In addition to benefiting children’s well-being, aligning Canadian industry standards for children’s privacy with Europe’s could encourage trade and export opportunities for Canadian developers and service providers.
  2. The measures proposed for age-appropriate design to protect and promote children’s privacy interests should apply in both the public and private sectors, and in all Canadian jurisdictions, unless a substantially similar code is in place in a province. In such cases, coordination with that province is required so that coverage is consistent across Canada.
  3. OPC guidance under PIPEDA cannot adequately achieve the goal of consistency across Canada or meet Canada’s enforcement obligations under the UNCRC. Specific legislative provisions within new or existing laws concerning children’s privacy are the appropriate vehicle to begin this reform. The OPC’s welcome efforts to provide guidance within the current PIPEDA framework do not supplant the need for ongoing, more thorough law reform.
  4. The CBA Sections submit that robust implementation of children’s privacy rights could provide a powerful lens to protect not only children’s privacy but, through that lens, their right to solitude, their right to a private and supportive family life, the protection of their reputation, and their control over the use of their image and likeness. The Article 8 jurisprudence of the European Court of Human Rights (ECtHR) is an example of such a lens in action. Canadian Charter jurisprudence on children’s rights has not yet sufficiently developed. Children’s privacy rights must be interpreted and applied in keeping with the general principles of children’s rights and all their other rights guaranteed under the UNCRC. Meaningful implementation of children’s privacy rights in Canada requires not only dedicated efforts by industry and oversight bodies such as the OPC in relation to PIPEDA; it also requires a cross-sectoral approach and integrative collaboration with criminal justice systems, health systems, educational services and child protection services – to name a few – as well as general human rights oversight mechanisms, to respect the intersectional interdependence of children’s privacy interests with all of their other human rights. Structural change and formal mechanisms for joint implementation of children’s rights in relation to the digital environment are required for full implementation of the rights of the child as set out in General Comment 25.6 The OPC’s leadership in providing PIPEDA guidance is important and welcome. However, to effectively champion the interests of Canadian children, the OPC must also collaborate across government departments, with industry, and among different levels of government.
  5. The OPC’s guidance within the current PIPEDA framework should also be developed in conjunction with guidance in relation to public sector service offerings impacting children under the Privacy Act.7 Moreover, detailed guidance should be available as to how health technologies and information management practices are developed to protect children’s privacy interests in the health sector, both in its private and public dimensions.

Should a children’s privacy code apply differently to sites exclusively directed at children and those directed at a broad audience that includes children?

  1. Age-appropriate design guidance must apply to services and products designed specifically for children, as well as to services that children may reasonably be foreseen to access or where children are the subject of information being disseminated. The question should not be whether the intended audience contains a “significant number of children” but rather whether children’s use of and access to the services is reasonably foreseeable. Age verification and age assurance standards in the UK and in Europe are moving towards regulation based on whether children are likely to access the content, which addresses situations where children are users. There is, however, also a need to address situations where children are not the users but are the subject of the information being disseminated, as that also impacts their right to privacy. For example, an app used by a daycare provider to communicate with parents of young children, while not accessed by children themselves, may contain and disseminate significant amounts of information about a child. For adult-only services, the onus must be clearly placed on the service provider to monitor age-based access and to ensure both that the services are not available to children and that the content being made available does not violate the rights of children (for example, child sexual abuse material). The UK’s recent guidance to industry on highly effective age verification practices for adult sites is a helpful reference for Canadian policy and guidance on children’s access to online services.
  2. Drawing from Dr. Ignacio Cofone’s work in The Privacy Fallacy, OPC guidance should anticipate that, moving forward, the regulation of civil liability for harms to children from online play, work, learning and social environments will need to be approached on both tort and contract law bases. Moreover, issues such as the intellectual property rights of children (to their image and likeness, and to the content they create) must also be addressed in a way that allows them to realize their creative goals and enforce their rights. While some of this may be accomplished via privacy-related guidance and related legislative amendments, other areas of law will also need to be engaged.
  3. A children’s privacy code should not apply differently to sites exclusively directed at children and those directed at a broad audience that includes children. First, websites are rarely exclusively directed at children, so the distinction would find little application. For example, while Kids Help Phone provides national resources for youth, it also offers content to support parents; though directed at children, it is not exclusive to them. Applying different baseline obligations would introduce unnecessary complexity.
  4. Depending on the definition of “exclusive” and the additional obligations that arise, if the definition is too strict, other sites may introduce content targeted at adults to exempt themselves from additional obligations. Instead of encouraging adherence to norms, companies may be motivated to escape them. The alternative – an expansive definition of “exclusive” – becomes functionally equivalent to a likelihood-of-being-accessed-by-children test.
  5. International trends are moving away from a two-tiered approach. In the United States, the Children’s Online Privacy Protection Act (COPPA)8 historically applied to websites directed at children under 13 or to services with actual knowledge that a user is under 13. This meant many general-audience sites avoided COPPA unless they knowingly collected data from young children. However, new U.S. proposals and state laws are broadening the scope. For example, California’s Age-Appropriate Design Code (CAADC)9 (effective 2024) covers any online service “likely to be accessed” by minors (under 18) – including general-audience sites – not just services overtly aimed at children. In Europe, the approach is similarly broad. The UK’s Age-Appropriate Design Code applies to all digital services likely to be accessed by users under 18, even if the primary audience is adults.

Which factors should be considered when determining the likelihood of children accessing a service?

  1. The UK Information Commissioner’s Office (ICO) has enumerated, in its FAQ, several factors that are instructive. These include, among others: the nature and content of the service, and whether it has particular appeal for children; the number of child users of the service; whether advertisements on the service, including third-party advertisements, are directed at or likely to appeal to children; complaints received about children accessing the service; and, conversely, any measures put in place to prevent children from gaining access to content.
  2. As technology and business practices continue to evolve, so will the quantity and quality of available evidence. Any evaluation of children’s likelihood of accessing a service should follow a common-sense approach.

How can this assessment be done in a privacy-protective manner?

  1. In its October 2024 submission on Privacy and Age Assurance – Exploratory Consultation10, the CBA Privacy and Access Law Section emphasized a risk-based approach under which privacy requirements are proportional to the level of risk posed to children, based on the kind and quantity of information being collected about them. A similar rationale can apply to the requirements imposed on website developers: more intrusive age verification should be required only where children’s access to a website’s content poses a substantial risk.

Should a children’s privacy code only apply when certain risks or harms are possible due to access to or use of the site – and if so, which ones?

  1. A children’s privacy code should always apply, but through a proportional, risk-based approach. This would align with international trends, Canada’s historical approach, and the complex reality of how often children access certain websites and how that access impacts them.

When considering risks and harms, the UK’s recent guidance to industry on highly effective age verification practices for adult sites is again a helpful reference for Canadian policy and guidance on children’s access to online services (Quick guide to Protection of Children Codes).11 In addition to the harms identified there, we recommend the OPC consider the risk of discrimination against children at vulnerable intersections, including disabled, Indigenous, and 2SLGBTQ+ children. To fully understand these harms, intentional efforts should be made to seek comments from these communities.