Open Letter

We are writing to express an urgent need for Canada to join a growing group of peer countries by introducing legislation that will address the harms posed by digital platforms, particularly as those harms relate to children.

During the last federal election campaign, the Liberal Party promised that, if re-elected, it would enact legislation to address online safety. The government has now engaged on this issue for the past two years and has conducted a wide range of consultations on how best to address the harms Canadians face online, hearing from two national commissions, four citizens' assemblies, roundtables across the country, and a panel of experts in law, public policy, and public health. It has listened to these consultations and reversed course on an initial proposal that focused narrowly on content regulation, including removal and blocking, shifting instead to an approach centred on platform business models, product design, and transparency.

It is time for the government to introduce legislation that details this new governance model. Those consultations produced a wide-ranging consensus on the appropriate regulatory model for Canada: legislation should hold platforms responsible for the risks and harms of their services, including how their platforms are designed, their advertising models, their content moderation systems, and their use of artificial intelligence. This legislation must protect and promote Charter rights, including freedom of expression, as central to democracy and discourse, while allowing reasonable and proportionate limitations to protect children and other vulnerable groups. To that end, the core components of a law could include:

  1. A duty on platforms to act responsibly, including by upholding fundamental rights, protecting users from harm, and conducting risk assessments on products used by Canadians.

  2. A special duty to protect children from harm.

  3. The creation of a regulator, with the power to investigate and audit platforms, mandate corrective action, and impose fines.

  4. Mandatory transparency by platforms, including data sharing with researchers and an avenue to audit and verify that they are meeting their legal obligations.

  5. A victim-centred forum for recourse for users impacted by platforms’ content moderation practices.

There is much to debate in the details of this type of model, and we should engage in that healthy discourse. But we do know what is broadly needed, and it is time to table the bill so that this debate can begin.

Other democracies, including the United Kingdom, the European Union, and Australia, have introduced and passed legislation aimed at safeguarding their citizens online, with special obligations to protect children. These regulations require platforms to enforce their own terms of service, be far more transparent, and protect their users from known risks in the design of their products. Some of these are second- or third-generation online safety laws, while Canada has yet to introduce its first. It is urgent that Canada act to protect the safety and fundamental rights of Canadians.

Unlike all other companies offering consumer-facing products and services to Canadians, digital platforms are not legally required to take any steps to mitigate the risks of harm posed by their products (except in narrow circumstances). This lack of regulatory oversight has contributed to mental health challenges, a crisis of reliable information, increased vulnerability to foreign interference in our politics, exacerbated division in our society, and a media ecosystem flooded with unreliable content in which users are more prone to radicalization.

Crucially, our lack of governance has put Canadian children at greater risk online than their counterparts in much of the democratic world. Canadian kids are increasingly subjected to violations of privacy, harassment, extortion, and cyberbullying from offenders within and outside of Canada, on platforms they use every day. The amount of child sexual abuse material online continues to rise, victimizing countless children and survivors and putting their safety at risk. Children are navigating online feeds that fuel eating disorders and self-harming behaviours, which at the worst end of the spectrum tragically propel youth to suicide. We also know from internal documents exposed by whistleblowers that social media platforms are fully aware of the harmful impact of their products, including that teens point to social media as a reason for feeling more anxious and depressed. Findings like these led the US Surgeon General to issue a rare public health advisory on social media and youth mental health.

While we appreciate the government taking the time to consult with a wide variety of experts and stakeholders before crafting legislation on such an important issue, the time to act is now. Canadians deserve the same protections and rights as our European, Australian, and British counterparts.

We will certainly not all agree on the specifics of the bill, but it is time to start the urgent public debate about it. That way, we can get to work to ensure that the bill maximizes the benefits of the digital ecosystem and the protection of fundamental rights, while limiting the clear and unacceptable harms.