Social media platforms report a 9% increase

Nabil Anas

Global Courant 2023-05-09 03:55:00

Major social media sites and digital platforms reported a nine percent increase in suspected child sexual abuse material in 2022, with 85.5 percent of all 31.8 million reports coming from the Meta platforms Facebook, Instagram and WhatsApp.

“These numbers are rising either because of an increase in the distribution of this material by users, or because companies are just now starting to look under the hood of their platforms, or both,” Lianna McDonald, executive director of the Canadian Center for Child Protection, said in a press release.

The data come from the US National Center for Missing and Exploited Children (NCMEC). Both Canada and the US legally require electronic service providers in their countries to report and remove instances of apparent child pornography when they become aware of them in their systems. However, there are no legal requirements for companies to proactively search for abusive content or deploy tools to prevent it from being uploaded.


“Millions of CyberTipline reports each year, mostly filed by a handful of companies, is evidence that what we know about the extent of child sexual exploitation online is just the tip of the iceberg,” an NCMEC spokesperson told CTV News, referring to the American reporting program it administers. “Most technology companies around the world are choosing not to proactively detect and report child sexual exploitation on their networks.”

Meta submitted 27.2 million reports to the CyberTipline in 2022, including 21.2 million from Facebook, five million from Instagram and one million from WhatsApp, an increase of 1.1 percent compared to 2021. Facebook alone accounted for 66.6 percent of all reports in 2022.

Facebook is the most popular social media platform in both Canada and the US, with about three quarters of adults using it. In a statement to CTV News, a spokesperson for Meta said the company is actively investing in teams and technology to detect, prevent and remove malicious content with tools such as AI and image-scanning software.

“We remove 98 percent of this content before anyone reports it to us, and we find and report more (child sexual abuse material) to NCMEC than any other agency,” Antigone Davis, Meta’s head of safety, said in a statement to CTV News. “We are committed to not only removing (child sexual abuse material) when we discover it, but also building technology to help prevent child exploitation.”

Signy Arnason is the associate executive director at the Canadian Center for Child Protection.


“Companies reporting high numbers could be indicative of a problem with users distributing material, but could also be a sign that the platform is making an effort to moderate content or using detection tools,” Arnason told CTV News. “Conversely, for large companies with very low reported numbers, this may indicate a reluctance to use proactive moderation tools to block this material; as a result, very low reported numbers are not necessarily a positive sign.”

Many popular electronic service providers filed more reports in 2022, including 2.2 million from Google, a 151 percent increase compared to 2021, as well as Snapchat (551,086, up 7.5 percent), TikTok (288,125, up 86.3 percent), Discord (169,800, up 473.5 percent) and Twitter (98,050, up 13.1 percent).

One of the bigger increases came from Omegle, a site that allows users to chat with a randomly selected stranger and has recently come under fire for hosting abusive users. Omegle submitted 608,601 reports in 2022, an increase of 1,197 percent from 2021. The image-sharing platform Pinterest, meanwhile, made 34,310 reports: an increase of 1,402.8 percent.

CTV News has reached out to all companies mentioned in this story for comment.

An Omegle spokesperson said their platform uses periodic snapshots of video streams to moderate content.

“While users are solely responsible for their behavior while using the website, Omegle has voluntarily implemented content moderation services that use both AI tools and contracted human moderators,” the company told CTV News. “Content flagged as illegal, inappropriate or against Omegle’s policies may result in a number of actions, including reports to appropriate law enforcement authorities.”

A spokesperson for Discord, an instant messaging and voice chat platform popular with gamers, said the company reports perpetrators to the NCMEC and is actively using technology to track down malicious material.

“Discord has a zero-tolerance policy for child sexual abuse, which has no place on our platform or anywhere in society,” Discord told CTV News. “We have a dedicated team that never stops finding and removing this abhorrent content, and takes action, including banning the users responsible and engaging the appropriate authorities.”

A spokesperson for the messaging app Snapchat said the platform uses image- and video-scanning technology to detect such content and files reports with NCMEC in the US.

“Any sexual exploitation or abuse of a Snapchatter and/or a minor on our platform is illegal and against our policies,” they said. “If we become aware of child sexual abuse or exploitation, whether identified through our proactive detection technology or reported to us through our confidential in-app reporting tools, we will delete it and report it to the authorities.”

Pinterest also has a zero-tolerance policy for content that may exploit or endanger minors.

“When we detect content or behavior on the platform that violates policies, we take immediate action, removing content, banning related accounts and cooperating with relevant authorities,” a spokesperson told CTV News. “We are committed to the trust and safety of children online and continue to work with organizations such as NCMEC to help remove this type of content from the Internet.”

Google, TikTok and Twitter did not respond to CTV News’ requests for comment.

“There is increasing public pressure on social media platforms to do a better job of moderating user-generated content, and therefore find or block more of this material,” said Arnason of the Canadian Center for Child Protection. “If we are to fundamentally improve online safety for families, we need our elected officials to ensure that technology companies must prioritize online safety for their end users, just as they do in other industries.”

Canadians can report suspected online child exploitation to Cybertip.ca, which is administered by the Canadian Center for Child Protection. In addition, the NCMEC operates Take It Down in the US, a service that helps remove explicit images and videos of minors from the Internet.
