
Blog: Dehumanizing tech response to New York Times investigation on suicide-themed website shows industry’s true colours

Written by the Executive Director of the Canadian Centre for Child Protection.

As the end of 2021 nears, U.S. lawmakers are yet again seeking answers from technology companies over another form of online harm festering on the web.

In this case, use of the word “harm”, a term that has become a euphemistic catch-all for the broad spectrum of terrible things that happen to people on the web, dramatically understates the issue at hand. This time we’re dealing with loss of life that has alleged links to a website providing detailed instructions and encouragement about suicide.

On Dec. 9, 2021, an investigation by the New York Times identified 45 people across several countries, including Canada, who died by suicide after having spent time on the suicide-themed website. This prompted the U.S. House Committee on Energy and Commerce to release a statement requesting details from electronic service providers (ESPs) whose services may have facilitated the existence of this website.

The New York Times reported in their investigation, among many other shocking details, that nearly half of all traffic to the website was driven by search engine results. These are the responses the reporters received from major tech players:

Google declined to comment.

Microsoft told the New York Times, “[We] have taken action in line with our policies” and “addressed the ranking associated with this website in our results.”

Cloudflare, a service that handles more than 10 per cent of all internet traffic and is used by many websites to mask details about themselves, including the suicide website, never responded to the investigation.

What shameful and dehumanizing responses for the families of these victims to hear!

These are the patterns of behaviour from entities who expect, and unfortunately are routinely given, a seat at the table when forming online safety policy. We have corporate leaders with a fiduciary obligation to act in the best interest of their shareholders alongside policymakers mandated to protect citizens from online harms: two mandates that, under the current regulatory framework of the web, couldn’t be more at odds with one another.

The tobacco industry historically wielded a great deal of influence on governments working to set public health policy. It’s an outrageous concept today, but we clearly haven’t learned from our past mistakes.

Industry’s response to this issue also further highlights the egregiously tone-deaf remarks made by Adam Mosseri, head of Facebook’s Instagram platform, on Dec. 8, 2021, where he suggested the establishment of online safety standards could be handled by an industry-led panel. This is public gaslighting at its finest.

When technology companies show us who they really are, we should do our best to believe them.


About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Canada’s tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of child sexual abuse material (CSAM) on the clear and dark web and issue removal notices to industry.
