On Sunday, Twitter will turn 15, and one Canadian charity is using that milestone to draw attention to the ways the social media giant is neglecting the protection of children.
The Canadian Centre for Child Protection (C3P) has launched a haunting video campaign that illustrates the lack of action Twitter has taken when it comes to halting the spread of child sexual abuse material (CSAM) on the platform.
The video shows men and women congratulating Twitter on turning 15. The tone turns dark when they begin recounting how they were exploited on camera by abusers, and how recordings of that abuse were then shared, and still exist, on the social media platform. While the people in the video are actors, the stories they share reflect real experiences.
According to Lianna McDonald, executive director of C3P, the sharing of such material is only getting worse on Twitter. Data for 2020 from the National Center for Missing and Exploited Children in the U.S. shows a 41 per cent increase in reports of CSAM in a single year. McDonald says this upward trend points to a potential escalation in the use of Twitter to share this type of material.
“This coincides as well with the experiences survivors are recounting to us and on the platform itself,” she says. “The problem only appears to be escalating.”
This problem isn’t unique to Twitter. In fact, it extends to all parts of the internet, beyond social media.
“There is an entire chain of electronic service providers, hosts, sites, and entities that are in the chain and that need to act to ensure this devastating material is not available and able to spread online,” McDonald says.
While the problem isn’t exclusive to Twitter, C3P is specifically concerned about how little the social media giant offers users who want to report such illicit material. In the charity’s review of CSAM reporting on popular platforms, it rates Twitter’s reporting functions as “poor.”
For example, Twitter doesn’t have an option to report content as CSAM directly from a tweet, nor does it have an option to report a user for sharing CSAM from that username or that user’s profile page. It also doesn’t have an option to report content as CSAM from a direct message.
C3P has offered to proactively detect and report known CSAM being shared on Twitter’s platform, but to date the company has declined that assistance.
“Now is the time to demand Twitter and other online platforms to do better for survivors and children and prioritize the removal of CSAM,” says McDonald.