Canada’s Bill C-63: Online Harms Act Targets Harmful Content on Social Media

February 29, 2024

On February 26, 2024, the federal government tabled Bill C-63 (Bill), which would enact the Online Harms Act (Act) and establish Canada’s first federal online content moderation regime. 

The purposes of the Act include mitigating the risks posed by harmful content online, protecting children’s physical and mental health, holding online platforms accountable and rendering certain egregious forms of harmful content inaccessible. To achieve these objectives, the Act would establish a new online-harms regulator, the Digital Safety Commission of Canada (Commission), with broad search and seizure powers and the ability to levy fines of up to 6% of global revenues for non-compliance.

To discharge their duties under the Act, social media platforms would be required to implement tools to detect harmful content, develop a digital safety plan and meet requirements relating to the removal of certain harmful content.

The Act has been conceived at a time when the balance between content moderation and free expression is in flux, and legislators around the world are grappling with the appropriate response to digital policy issues such as misinformation, online harassment, deepfakes and censorship.

At one end of the spectrum are the European Union’s Digital Services Act and the United Kingdom’s Online Safety Act, which impose comparable but broader obligations on social media services and search engines to implement measures to remove harmful content and protect children. At the other end, the U.S. Supreme Court is currently seized with challenges to Florida and Texas state legislation that prohibits platforms from censoring content or users, on the basis that these restrictions violate the platforms’ constitutionally protected right to free speech.

Against this backdrop, it will be important for social media services operating in the Canadian market to consider how they will align their Canadian compliance program with varying requirements in other jurisdictions. 

Scope of Application

The Act imposes duties on “regulated services” with respect to harmful content made accessible to Canadians. 

Regulated services are social media services that either meet the user threshold prescribed in forthcoming regulations or are designated by regulation. The Act introduces the first federal legislative definition of social media services, defining them as “a website or application accessible in Canada, with the primary purpose of facilitating interprovincial or international online communication among users by enabling them to access and share content.” The Act further provides that adult content services (those that allow users to share and access pornographic content) and live-streaming services are social media services. The Act does not apply to a private messaging feature of a regulated service.

The Act defines seven classes of harmful content that attract regulatory obligations:

  • Intimate content communicated without consent, meaning photos or video recordings (including deepfakes) that depict a person who is nude or engaged in sexual activity
  • Content that sexually victimizes a child or revictimizes a survivor
  • Content that induces a child to harm themselves, such as content that advocates self-harm, disordered eating or suicide
  • Content used to bully a child, including content that could cause serious harm to a child’s physical or mental health through intimidation, threats or humiliation
  • Content that foments hatred
  • Content that incites violence
  • Content that incites violent extremism or terrorism

Duties Imposed on Social Media Services

Regulated social media services are subject to three key duties regarding harmful content on their platform: 

  1. The Duty to Act Responsibly. This duty requires social media services to, among other things, (a) implement measures that are adequate to mitigate the risk that users will be exposed to harmful content; (b) implement tools that allow users to block other users and flag harmful content; (c) make user guidelines, which must include a standard of conduct and a description of the measures implemented to mitigate harmful content, publicly available on the service; (d) designate a resource person to receive complaints from users and make that person’s contact information easily accessible to users of the service; and (e) develop a digital safety plan, to be submitted to the Commission and made publicly available, that includes the extensive information about the service’s compliance program prescribed by the Act.
  2. The Duty to Protect Children. This duty requires social media services to implement any design features respecting the protection of children prescribed by regulation, such as age-appropriate design. 
  3. The Duty to Make Certain Types of Content Inaccessible. This duty requires social media services to comply with certain obligations concerning two categories of content: content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent. If such content is detected by the service, it must be removed within 24 hours of detection. If the content is flagged by another user, the service must review the content within 24 hours and block it within the following 24 hours if the flag is well-founded. In its review, the social media service must allow both the user who posted the content and the user who flagged the content to make representations about the nature of the content. The timing of these obligations is illustrated in the sketch following this list.
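
To make the timing mechanics concrete, the following is a minimal Python sketch of the 24-hour windows described above. The function names and workflow are illustrative assumptions, not terms drawn from the Bill.

from datetime import datetime, timedelta

# Illustrative sketch only: the deadlines below mirror the 24-hour windows
# described in the Bill; all names are hypothetical.
WINDOW = timedelta(hours=24)

def removal_deadline(detected_at: datetime) -> datetime:
    """Content detected by the service itself must be removed within 24 hours."""
    return detected_at + WINDOW

def review_and_block_deadlines(flagged_at: datetime) -> tuple[datetime, datetime]:
    """Content flagged by a user must be reviewed within 24 hours and, if the
    flag is well-founded, blocked within the following 24 hours."""
    review_by = flagged_at + WINDOW
    block_by = review_by + WINDOW
    return review_by, block_by

# Hypothetical example: content flagged on March 1 at 9:00 a.m.
flagged = datetime(2024, 3, 1, 9, 0)
review_by, block_by = review_and_block_deadlines(flagged)
print(f"Review by {review_by}; block by {block_by} if the flag is well-founded")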

New Online Harms Regulator

The Act establishes a new regulatory framework through the Commission, the Digital Safety Ombudsperson (Ombudsperson) and the Digital Safety Office (Office).

The Commission is a new regulator with a mandate to administer and enforce the Act, develop online safety standards and investigate complaints related to content that sexually victimizes a child, revictimizes a survivor or is intimate content distributed without consent. The Commission can hold hearings (which may be held in private under certain circumstances) in response to public complaints about content or any other matter relating to social media services’ compliance with the Act. 

The Commission has broad powers to compel information, search any place, access records by means of telecommunications and seize records to verify compliance with the Act. The Act also provides whistleblower protection for any employee of a social media service who makes submissions to the Commission. If a social media service contravenes the Act, for example by failing to take down content or by not implementing the measures set out in its digital safety plan, it may be required to pay administrative monetary penalties of up to 6% of its gross global revenue or C$10-million, whichever is greater. 
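
As a worked illustration of that penalty ceiling, the short Python sketch below computes the greater of 6% of gross global revenue and C$10-million. The revenue figure used is hypothetical.

# Illustrative only: the ceiling on an administrative monetary penalty is the
# greater of 6% of gross global revenue and C$10-million.
PENALTY_RATE = 0.06
PENALTY_FLOOR_CAD = 10_000_000

def maximum_penalty(gross_global_revenue_cad: float) -> float:
    """Return the penalty ceiling for the given gross global revenue (in CAD)."""
    return max(PENALTY_RATE * gross_global_revenue_cad, PENALTY_FLOOR_CAD)

# Hypothetical example: C$500-million in gross global revenue yields a ceiling
# of C$30-million, since 6% of revenue exceeds the C$10-million floor.
print(maximum_penalty(500_000_000))  # 30000000.0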

Through its powers to set standards, investigate complaints and hold public or private hearings, the Commission will have a significant impact on how social media platforms operate in Canada.

The Ombudsperson’s role is to assist users of social media services with navigating the new framework and advocate for the public interest on systemic issues related to online safety.

The Office is the government body set up to support the work of the Commission and Ombudsperson.

Next Steps

Before it becomes law, the Bill must complete two additional readings in the House of Commons and three in the Senate. If passed, the rights and obligations in the Act will come into force only upon an order from the Governor in Council. Furthermore, as noted above, many key aspects will be determined by forthcoming regulation from the Governor in Council, such as the types of social media services subject to the Act, the number of users required to trigger application of the Act and the methodology for determining user counts.

For more information, please contact:


or any other member of our Technology or Communications groups.