Legislative Efforts

    • On January 6, 2025, Prime Minister Trudeau prorogued Parliament. Any unfinished legislative business, including the Online Harms Act, died on the order paper.

    • The Online Harms Act (Bill C-63) was tabled on February 26, 2024.

    • On July 4, 2024, the Office of the Parliamentary Budget Officer released a Legislative Costing Note titled The Online Harms Act: Establishment of a Digital Safety Commission, Ombudsperson and Office.

    • On December 4, 2024, Minister Virani announced that the government would move to split the Bill into two parts, separating the proposed regulations for social media companies from the new proposals for prosecuting hateful acts.

    • On September 16, 2024, MP Michelle Rempel Garner introduced Bill C-412 - An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code.

    • This Private Member’s Bill aimed to: 

      • Safeguard Minors Online: Mandate that online platforms prioritize the safety and well-being of minors, ensuring their personal data is protected and not misused.

      • Amend the Criminal Code:

        • Prohibiting the creation or distribution of digitally altered images that falsely depict individuals in explicit content.

        • Establishing a specific offence for online harassment, with the use of anonymity or a false identity as aggravating factors.

        • Allowing courts to require individuals at risk of committing online harassment offences to enter into a recognizance, and facilitating the identification of anonymous offenders through production orders.

    • How did this differ from the proposed Online Harms Bill?

      • Bill C-412 focused specifically on enhancing legal protections for minors against online exploitation and abuse, whereas the Online Harms Act proposed a comprehensive regulatory approach to combat a wide range of online harms, with an emphasis on holding platforms accountable for harmful content.

    • The Youth Assembly on Digital Rights and Safety brought Canadian youth together from across the country to shape recommendations focused on improving online safety and enhancing online experiences for young people, resulting in a youth-led recommendations report.

    • The U.S. Surgeon General issues an advisory describing current evidence on the impacts of social media on the mental health of children and adolescents. It notes that we cannot conclude social media is sufficiently safe for children and adolescents and provides immediate steps we can take to mitigate the risk of harm.

    • Following the Government’s invitation for consultation and feedback on its proposed approach to address harmful content, the Department of Canadian Heritage releases an overview report of the feedback received.

      • Respondents oppose the government’s proposal, including its website-blocking provisions, which they argue would create a chilling effect on speech and pose a real threat to an open and safe internet: platforms would be inclined to take down all questionable content rather than risk being blocked.

      • Respondents identify several overarching concerns about freedom of expression, privacy rights, and the proposal’s impact on certain marginalized groups.

      • As a result, the government substantially reverses course on its initial proposal, which focused narrowly on content regulation, and shifts its attention to platform business models, product design, and transparency.

    • The Department of Canadian Heritage and its Digital Citizen Initiative convene an Expert Advisory Group on Online Safety to advise the Minister on best practices for designing legislative and regulatory frameworks to address harmful content online and to incorporate feedback from the 2021 national consultation.

    • The Canadian Commission on Democratic Expression (CCDE) releases a report focused on various policies debated worldwide to make online systems more transparent and accountable to the public interest.

    • The Citizens’ Assembly releases its second report on recommendations to strengthen Canada’s response to online disinformation.

    • The Citizens’ Assembly releases its final report on democratic expression with recommendations for reducing online harms and safeguarding human rights in Canada.

    • The Expert Advisory Group on Online Safety concludes its meetings by recommending a duty-of-care approach to regulating online harms.

    • The Minister of Canadian Heritage conducts 19 virtual and in-person roundtables nationwide on key elements of a legislative and regulatory framework for online safety.

    • The Government compiles participant feedback from the roundtables in the "What We Heard" report, outlining critical areas of online safety.

    • The Department of Canadian Heritage uses information collected during the roundtables to develop policy and legislation, collaborating with the Minister of Justice and Attorney General of Canada to table legislation protecting children, marginalized communities, and Canadians online as soon as possible.

    • The CCDE publishes the first of its three annual reports detailing a six-step program to reduce online hate and other harms.

    • The 2020 Citizens’ Assembly presents a public report, with recommendations to strengthen Canada’s response to new digital technology and reduce online harms, to the CCDE, the Federal Heritage Minister, and researchers.

    • The Canadian Government publishes its proposal to address harmful content online for consultation and feedback. Two documents are presented for consultation: a discussion guide and a technical paper.

    • Frances Haugen, a former Facebook product manager turned whistleblower, reveals internal documents exposing that the company is aware of the harmful and adverse impacts its products have on adolescent users’ mental health.

    • During the 2021 election cycle, the Liberals commit to introducing legislation addressing online safety within 100 days of their mandate if re-elected.

    • The Canadian Commission on Democratic Expression (CCDE), partly funded by the Canadian Government, is established with a three-year mandate focused on understanding, anticipating, and responding to the effects of new digital technologies on public life and Canadian democracy.

    • As part of the CCDE, a set of Citizens’ Assemblies composed of 42 randomly selected and representative residents from across Canada is convened to learn about the issues, generate potential solutions, and provide recommendations to the Commission, the federal government, and the Canadian public.

    • The Citizens’ Assembly meets between September and December 2020. Its recommendations aim to:

      1. Strengthen oversight and accountability for digital platforms;

      2. Enhance international regulatory cooperation and enforcement, reduce misinformation, and empower users;

      3. Establish new digital rights;

      4. Ensure user safety, accountability, and awareness; and

      5. Support independent journalism and Canadian content.

  • The purpose of the Online Harms Act is to promote the online safety of Canadians. It is particularly concerned with regulating the risk that seven forms of harmful content pose to Canadians:

    1. Intimate content communicated without consent

    2. Content that sexually victimizes a child or revictimizes a survivor

    3. Content that induces a child to harm themselves 

    4. Content used to bully a child

    5. Content that foments hatred 

    6. Content that incites violence

    7. Content that incites violent extremism or terrorism.

    New positions and offices:


    Digital Safety Commission of Canada

    Mandate to administer and enforce the Act, ensure that operators of applicable social media services are transparent and accountable, and contribute to the development of standards for online safety.

    Digital Safety Ombudsperson of Canada

    Mandate to provide support to users of applicable social media services and advocate for the public interest in relation to online safety.

    Digital Safety Office of Canada

    Mandate to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates.

    Responsibilities for Applicable Social Media Platforms

    Duty to Act Responsibly

    A duty to act responsibly in respect of the services that they operate, including by:

    • Implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services; and

    • Submitting digital safety plans to the Digital Safety Commission of Canada.

    Duty to Protect Children

    A duty to protect children in respect of the services that they operate by integrating into those services the design features provided for by regulation.

    Duty to Make Certain Content Inaccessible

    A duty, in certain circumstances, to make content that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent, inaccessible to persons in Canada.

    Duty to Keep Records of Compliance

    A duty to keep all records that are necessary to determine whether they are complying with their duties under the Act.

    Access to Inventories and Electronic Data (Transparency)

    The Digital Safety Commission of Canada may accredit certain persons who conduct research or engage in education, advocacy or awareness activities related to the Act, enabling them to access inventories of electronic data, and the electronic data itself, of the operators of social media services to which the Act applies.

    Other Provisions:

    Amendments to the Criminal Code to:

    • Define “hatred” for the purposes of the new offence and the hate propaganda offences;

    • Create a hate crime offence for committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;

    • Create a recognizance to keep the peace relating to hate propaganda and hate crime offences;

    • Increase the maximum sentences for hate propaganda offences.

    Amendments to the Canadian Human Rights Act to:

    • Provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination;

    • Reinstate an improved section 13, defining hate speech as “content that expresses detestation or vilification of an individual or group of individuals on the basis of race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability and conviction for an offence for which a pardon has been granted or in respect of which a record suspension has been ordered”;

    • Enhance the complaints process by allowing the Canadian Human Rights Commission to accept complaints alleging this discriminatory practice;

    • Add remedies to address communications of hate speech, including by authorizing the Canadian Human Rights Tribunal to inquire into such complaints.

    Amendments to an Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to:

    • Clarify the types of Internet services covered by that Act;

    • Simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;

    • Require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;

    • Extend the period of preservation of data related to an offence;

    • Extend the limitation period for the prosecution of an offence under that Act; and

    • Add certain regulation-making powers.

    Who was to be regulated?

    • The Bill aimed to regulate large social media platforms, which it defined as “websites or apps that help people communicate and share content online, including adult content sites and live streaming platforms.”

      • Private messaging features on platforms were excluded from the regulation.

    What did regulated services need to comply with?

    • The Bill created four duties and a data transparency requirement.

      1. Duty to Act Responsibly

      • Main goal: Minimize the risk of harmful content without eliminating it entirely, while protecting free speech.

      • Services must submit a Digital Safety Plan. 

      • Services must report on:

        • How they meet regulations;

        • Measures for protecting children;

        • The amount of harmful content moderated;

        • User complaints and feedback.

      • Platforms must provide tools for blocking or flagging harmful content, inform users about flagged content, and label automated content (i.e., bots).

      2. Duty to Protect Children

      • Main goal: Operators must ensure that age-appropriate design features are in place to protect children.

      3. Duty to Make Certain Content Inaccessible

      • Operators must remove:

        • Content that sexually exploits children or revictimizes survivors and 

        • Non-consensual intimate content.

      • Suspected harmful content must be made inaccessible within 24 hours and the user who posted it must be notified. Users can appeal the decision if content was wrongly removed.

      4. Duty to Keep Records

      • Operators must maintain the records and data needed to demonstrate compliance with their duties under the Act.

      Data Transparency Requirement

      • Regulated services must share data with qualified individuals for research purposes. 

      • Researchers can hold services accountable for the accuracy of their digital safety plans and content moderation practices.

    What powers would regulators have had?

      1. Investigative Powers:

      • Can compel people to appear and provide testimony or documents under oath.

      • Can hold public or private hearings to investigate issues.

      • Can appoint inspectors to verify compliance with the Act. Inspectors can enter places to gather relevant documents or information, even remotely, if the owner agrees.

      2. Power to Issue Orders:

      • The Commission can issue orders to operators to ensure compliance if it believes they are violating the Act.

      • Orders can be made enforceable in Federal Court, where they can be executed like any court order.

    What were the proposed punishments for non-compliance?

    • Punishments vary based on whether the non-compliance constitutes a violation or an offence; an illustrative calculation of the maximum penalty amounts appears at the end of this section.

      Violations

      • Penalties: Administrative monetary penalties for non-compliance with the Act, including:

        • Violating the Act, Commission orders, or inspector requirements, or making false statements.

        • Obstructing Commission or inspector actions.

      • Maximum Penalty: Up to 6% of gross global revenue or $10 million, whichever is greater.

      • Considerations include the nature of the violation, compliance history, benefits gained, ability to pay, and more.  

      Offences

      • Offences: Operators commit an offence if they:

        • Violate Commission orders, obstruct the Commission or inspectors, or make false statements.

      • Penalties:

        • On conviction: up to 8% of gross global revenue or $25 million on indictment, or 7% of gross global revenue or $20 million on summary conviction.

        • Personal liability: individuals and other persons can also be fined, with amounts based on gross revenue or fixed sums, depending on the type of conviction.

      • Operators may avoid liability by proving they exercised due diligence.

      • Individuals who commit an offence can also be held personally liable.
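
      For illustration only, the short Python sketch below works through the penalty ceilings listed above. It is not part of the Bill: the function names and the sample revenue figure are hypothetical, and the offence calculation assumes the same "whichever is greater" structure that is stated explicitly for violations. Actual amounts would also depend on factors such as the nature of the violation, compliance history, and ability to pay.

        # Illustrative sketch of the maximum penalty ceilings described above
        # (hypothetical helper names; not drawn from the Bill's text).

        def max_violation_penalty(gross_global_revenue: float) -> float:
            # Administrative monetary penalty cap: 6% of gross global revenue
            # or $10 million, whichever is greater.
            return max(0.06 * gross_global_revenue, 10_000_000)

        def max_offence_fine(gross_global_revenue: float, by_indictment: bool) -> float:
            # Offence fine cap: 8% or $25 million on indictment, 7% or $20 million
            # on summary conviction (assumes the same "whichever is greater" rule).
            if by_indictment:
                return max(0.08 * gross_global_revenue, 25_000_000)
            return max(0.07 * gross_global_revenue, 20_000_000)

        # Example: a hypothetical operator with $1 billion in gross global revenue.
        revenue = 1_000_000_000
        print(max_violation_penalty(revenue))    # 60000000.0 (the 6% share exceeds the $10M floor)
        print(max_offence_fine(revenue, True))   # 80000000.0
        print(max_offence_fine(revenue, False))  # 70000000.0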

    Could users have flagged content to the regulator if operators were not responsive?

    • Yes. A person in Canada can submit complaints to the Commission about harmful content on a regulated service or about the operator’s compliance with the Act.

      Complainants who work for the operator are also protected.

      Complaints about content that sexually victimizes a child or revictimizes a survivor, or about intimate content shared without consent, may be investigated. If a complaint is not dismissed, the Commission will:

      • Notify the operator and user involved.

      • Order the operator to make the content inaccessible in Canada until a decision is made.

      The Commission will then determine if the content falls under these categories and may order permanent removal.