
The Canadian Centre for Child Protection supports Bill C-63, otherwise known as the Online Harms Act. Overall, the Government of Canada has proposed a strong and thoughtful framework for addressing online child sexual abuse and exploitation. Over the coming weeks, we will publish a series of recommendations that we believe would further strengthen the protections for children/youth within this proposed Bill.

Legislated age assurance requirement needed to ensure regulated services fulfil their child-specific duties under proposed Online Harms Act


Key takeaways

  • The Online Harms Act would impose a duty to provide children with enhanced protections on social media services, including livestreaming and user-uploaded adult content services.
  • However, Bill C-63 does not legislate a requirement to verify the age of users.
  • It is not clear how online service providers would fulfil their duty to protect children without a concurrent age assurance obligation.
  • Through various public statements, the government has indicated it is open to further debate on the matter, while suggesting that age assurance is already “entrenched” in Bill C-63.
  • Recommendation: Amend the draft Online Harms Act to explicitly include requirements for online service providers subject to regulation to use effective age assurance measures.

Purpose

This briefing note sets out the Canadian Centre for Child Protection’s (C3P’s) view that the absence of any age assurance provision in Canada’s proposed Online Harms Act (Bill C-63) casts doubt on the ability of regulated online service providers to fulfil their child-user-specific duties, creating a significant gap in the effectiveness of the proposed online safety regime. C3P recommends amending the draft Online Harms Act to expressly include age assurance requirements, recognizing that the details of such requirements may need to be set out in regulation.

Issue

A significant pillar of the drafted Online Harms Act relates to child safety. Three of the seven types of harmful content targeted by the proposed law are specific to children:

Harmful content means
[...]
(b) content that sexually victimizes a child or revictimizes a survivor;
(c) content that induces a child to harm themselves;
(d) content used to bully a child;

In addition to this, one of the main legislated duties of care relates specifically to children:

Duty to protect children
64 An operator has a duty, in respect of a regulated service that it operates, to protect children by complying with section 65.
Design features
65 An operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age-appropriate design, that are provided for by regulations.

This duty effectively carves out an obligation for platforms to provide children with enhanced protections (that are not owed to adults). These safety measures, which will be detailed in regulation, might include elements such as age-appropriate design codes, safety-by-design standards, default account settings, parental controls, and privacy settings.

However, Bill C-63 currently contains no provisions that explicitly require regulated online services to use age assurance techniques to establish which users are children. In the absence of such provisions, it is not clear how the government expects online service providers to fulfil their child protection duty or to demonstrate compliance with the Act as part of a digital safety plan. There is also a concern that, without an age assurance obligation, regulated services could treat this gap as a loophole, plausibly denying responsibility for failures to fulfil their child protection duties under the Act.

It should be noted that nothing in Bill C-63 prevents the mandating of such a requirement in regulation.

Discussion

Domestic policy initiatives supporting age assurance

Over the last several years, a number of policy initiatives in Canada have driven toward or explored online age assurance as a means of limiting harms to children.

On April 11, 2019, Health Canada published an exploratory consultation document entitled, "Reducing Youth Access and Appeal of Vaping Products: Potential Regulatory Measures".1 The objective was to explore whether increased regulatory action beyond what is already in place is needed to protect children. Among the proposed regulatory options were new regulations designed to further restrict vaping product access by requiring online retailers to use enhanced age-verification techniques.

On November 24, 2021, a private member’s bill was introduced in the Senate by Sen. Julie Miville-Dechêne. Bill S-210 (“An Act to restrict young persons’ online access to sexually explicit material”) would make it an offence for online services to make sexually explicit material available to children on the internet, while also mandating age-assurance checks before Canadian internet users can access adult pornography online.2

On June 10, 2024, Canada’s Office of the Privacy Commissioner (OPC) launched an exploratory consultation on privacy and age assurance.3 The OPC’s preliminary position is that “it is possible to design and use age assurance in a privacy-protective manner.”

Global efforts supporting age assurance

As discussions surrounding age assurance continue in Canada, it should be noted that other jurisdictions including the U.K., Germany, France, and several U.S. states have already instituted age assurance laws and/or systems to limit children's exposure to adult pornography.

Looking at the U.K. specifically, the Office of Communications (“Ofcom”), the country’s communications regulator, recently released its draft guidance on age assurance, which includes what it deems to be “highly effective” methods.4 The draft guidance sets out several age assurance methods that companies may implement to meet their age assurance requirements under the U.K.’s new online safety regulations. These include digital wallets, credit card checks, facial age estimation, open banking, and more. Despite the long list, Ofcom states it will continue to “review [its] position over time as technologies evolve.”

In other regions, such as in Australia, Spain, and the EU, age assurance frameworks are being proposed or piloted following studies and formal public consultations.

The European Commission’s Directorate-General for Communications Networks, Content and Technology (Connect) hosted the first meeting of its Task Force on Age Verification under the Digital Services Act (DSA) in January 2024. Alongside this task force, the Commission launched a proof-of-concept pilot with six Member States on using the EU Digital Identity Wallet for age verification.5 Member States had the opportunity to present their own national approaches to age verification and to discuss the overall tasks of the task force.

Furthermore, the recent EU Commission report on age assurance studied both the legal and practical aspects of age assurance, detailing its necessity, methods, and associated challenges.6 The report concluded that the question is not whether age assurance is needed, but how it should be implemented.

In Australia, the Office of the eSafety Commissioner began research and consultations with stakeholders in 2021 to consider how a mandatory age verification mechanism could be achieved in Australia. In August 2023, eSafety made its roadmap toward age verification publicly available.7 Although the Australian government initially declined to pursue age assurance policies, on May 1, 2024, it announced $6.5 million in funding for an age assurance pilot to protect children from harmful content, including pornography.8 The pilot is to occur alongside other interventions aimed at curbing children’s and young people’s easy access to damaging material and tackling extreme misogyny online. As of July 2, 2024, reports indicate the Australian government is expanding its trial of age assurance technology to examine blocking underage children from accessing social media platforms (in addition to adult content sites).9

Addressing privacy concerns

A common issue raised in relation to age assurance technology is the concern that its implementation will end up violating the privacy rights of adults. Privacy-preserving options have been proposed in other jurisdictions, including the use of encrypted tokens administered by a trusted issuer such as the government or a regulated sector such as the financial industry. In our view, the solutions that work best require a whole-system, multi-stakeholder approach. A simplified version of the token model is sketched below.
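To make the token model concrete, the following minimal Python sketch shows how a trusted issuer could attest that a user is an adult without revealing the user’s identity to the website. It is an illustration only, not the mechanism contemplated by Bill C-63 or any particular jurisdiction: it uses signed (rather than encrypted) tokens for simplicity, the token format, field names, and five-minute expiry are all assumptions, and it relies on the third-party cryptography package for digital signatures.

import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# --- Trusted issuer side (e.g., a government service or bank; hypothetical) ---
issuer_key = Ed25519PrivateKey.generate()    # issuer's long-term signing key
issuer_public_key = issuer_key.public_key()  # published so services can verify

def issue_age_token() -> tuple[bytes, bytes]:
    # The token carries only an over-18 claim and a short expiry --
    # no name, no birthdate, no account identifier.
    claim = json.dumps({"over_18": True, "exp": int(time.time()) + 300}).encode()
    return claim, issuer_key.sign(claim)

# --- Publisher / regulated service side ---
def verify_age_token(claim: bytes, signature: bytes,
                     issuer_pub: Ed25519PublicKey) -> bool:
    # Accept the token only if the issuer's signature is valid and it is fresh.
    try:
        issuer_pub.verify(signature, claim)  # raises InvalidSignature if forged
    except InvalidSignature:
        return False
    payload = json.loads(claim)
    return payload.get("over_18") is True and payload["exp"] > time.time()

claim, sig = issue_age_token()
print(verify_age_token(claim, sig, issuer_public_key))  # True

Note that in a real deployment the token would also need to be bound to a session or device and made unlinkable across sites (for example, via blinded or one-time credentials) so that neither the issuer nor the publishers can build a browsing profile; the sketch above omits those pieces.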

Other partial solutions include publishers passing responsibility for age verification to device makers or app stores. While partial solutions such as this may serve as one layer in a multi-layered approach, they should not be solely relied upon. Publishers should bear some direct responsibility for restricting access to adult-only materials.

Recommendation

Amend Bill C-63 to include a legislated requirement for regulated online services to use effective age assurance measures to ensure they can fulfil their various age-specific duties under the proposed Online Harms Act.

Sources

  1. https://www.canada.ca/content/dam/hc-sc/documents/programs/consultation-reducing-youth-access-appeal-vaping-products-potential-regulatory-measures/consultation-reducing-youth-access-appeal-vaping-products-potential-regulatory-measures-eng.pdf
  2. https://www.parl.ca/legisinfo/en/bill/44-1/s-210
  3. https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/consultation-age/expl_gd_age/
  4. https://www.ofcom.org.uk/news-centre/2023/implementing-the-online-safety-act-protecting-children
  5. https://digital-strategy.ec.europa.eu/en/news/digital-services-act-task-force-age-verification
  6. https://digital-strategy.ec.europa.eu/en/library/research-report-mapping-age-assurance-typologies-and-requirements
  7. https://www.esafety.gov.au/sites/default/files/2023-08/Roadmap-for-age-verification_2.pdf?v=1719428417973
  8. https://www.pm.gov.au/media/press-conference-sydney-15
  9. https://www.theguardian.com/australia-news/article/2024/jul/02/social-media-porn-site-ban-australia-trial-age-assurance
