Reviewing Child Sexual Abuse Material Reporting Functions on Popular Platforms
“I spend hours every day searching for my own content, reporting thousands of accounts and posts sharing CSAM. When platforms don’t actively look for or prevent this content from being uploaded, the burden falls on me to have these images removed.”
Millions of images of child sexual abuse circulate freely on the internet each day, not only in obscure corners of the dark web, but also on some of the most popular web platforms. C3P’s research found that most web platforms lack content reporting functions specific to child sexual abuse material (CSAM), often leaving victims feeling hopeless in their efforts to get their own abusive material removed.
Why the need for CSAM-specific reporting options?
While the majority of platforms have reporting mechanisms in place for content in general, they rarely offer a CSAM-specific process or menu option that lets users report material that is (or is believed to be) CSAM.
This is problematic for three main reasons:
- Without the ability to explicitly flag images or videos as CSAM, companies limit their capacity to remove offending content quickly. Curbing the spread of these illegal images across the internet requires prompt action.
- Without accurate user-generated data on the prevalence of CSAM on their platforms, it’s difficult for companies to properly gauge the effectiveness of their proactive measures designed to intercept CSAM. It also likely hinders their ability to provide timely information to child welfare and law enforcement agencies.
- Survivors who attempt to halt the spread of images of their own child sexual abuse repeatedly cite ambiguous and non-specific reporting options on platforms as a key barrier to successfully getting images removed.
Beyond the lack of CSAM-specific reporting options, C3P identified a number of additional barriers to reporting content, including the inability to report publicly visible content without first creating (or logging into) an account; mandatory personal information fields in content reporting forms; and inconsistent reporting tools between the desktop and mobile versions of a platform.
Recommendations
To clarify and streamline the process for reporting CSAM, C3P makes five recommendations for companies that allow user-generated content on their services:
- Create reporting categories specific to child sexual abuse material
- Include CSAM-specific reporting options in easy-to-locate reporting menus
- Ensure reporting functions are consistent across the entire platform
- Allow publicly visible content to be reported without requiring users to create or log into an account
- Eliminate mandatory personal information fields in content reporting forms
Additional Research and Reports
This report addresses only one part of how the global epidemic of CSAM is failing to be handled in a way that protects children and supports survivors. C3P has released additional research that builds a fuller picture of the urgent need to remove these horrific images and videos swiftly, and of the harm survivors suffer when such material is allowed to remain online.