Sunlight is the best disinfectant: Australia’s eSafety Commissioner names names in tech safety report
Written by Lianna McDonald, Executive Director of the Canadian Centre for Child Protection
As we continue to peel back the layers of harm to children happening online, the chorus of calls for more transparency and accountability from the technology industry continues to grow louder.
But there’s a problem.
There is a longstanding and misguided belief that online services — social media platforms and the like — are somehow deserving of a special status that places their commercial activities beyond the scope of oversight. This isn’t isolated to Canada either: this is a global issue, with global consequences.
And unlike with most other industries, few of us have a strong conceptual grasp of what actual transparency and accountability look like for the Facebooks and Snapchats of the world.
Many organizations, including the Canadian Centre for Child Protection (C3P), have been working exhaustively to build out a framework with true transparency at its core. And just this week, our Australian friends have made another important contribution to this effort.
Australia’s office of the eSafety Commissioner released a powerful report on the actions seven major technology companies are taking (or not) to tackle sexual exploitation and abuse of children on their services.
The eSafety Commissioner, a government agency dedicated to online safety for Australians, used its legislative powers to request detailed information about moderation practices, the use of image and video detection technology to block child sexual abuse material, and other policies. The companies on the receiving end of these reporting notices were Apple, Microsoft, Skype, Meta, WhatsApp, Snapchat and Omegle.
The key finding won’t come as a surprise to those working in the online child protection space: there exists a patchwork of tools and policies across the industry, creating major gaps that put children at risk.
Readers can pore over the details themselves, but I will highlight that the publication of this report — tying company names to their responses and actions — is an example of what true accountability and transparency look like. In 2021, C3P published a report that similarly tied company names to behaviours and practices, and we saw immediate results.
This is transparency and accountability at work.
The eSafety Commissioner's report is all the more compelling when we contrast it with other interpretations of “transparency” reporting, notably those produced by the industry-led and industry-funded Tech Coalition, which provide no meaningful public insight into the precise actions of its members.
As they say: sunlight is the best disinfectant. This report shows the Australian eSafety Commissioner understands this.