Canadians Using AI to ID Unwanted Films, Illegal Content Online

By: Lee Rickwood

July 8, 2022

The Canadian Centre for Child Protection (C3P) has enlisted the help of artificial intelligence in its fight against exploitative content and the proliferation of child sexual abuse material online.

Winnipeg, MB-based C3P says that last year, more than 85 million pieces of child sexual abuse material (CSAM) were reported. That’s a 30 per cent increase in CSAM content from 2020 to 2021 – and it represents one piece of CSAM being uploaded every two seconds. 


The Canadian Centre for Child Protection (C3P) launched The Unwanted Film Festival to draw attention to the scale of a digital epidemic of abuse and exploitation.

To draw attention to this digital pandemic of abuse and exploitation, C3P launched what it calls The Unwanted Film Festival: a physical experience (it took place in New York City, as part of the world-famous Tribeca Film Festival) and an online destination.

“We are now paying the price for decades of neglect and inaction from tech companies around the world – which has cost countless victims and survivors their safety, dignity, and privacy,” says Lianna McDonald, Executive Director of C3P. “The Unwanted Film Festival is a global wake-up call about the failures to adequately address the festering CSAM epidemic on the Internet.” The initiative also aims to get tech organizations and governments to step up their fight against unwanted content.

The AI algorithm used for the campaign was created this past year by a member of the team at No Fixed Address Inc. (NFA), a Toronto-based creative agency with which C3P has a long-standing partnership.

Making use of AI, more than 85 million ‘movie posters’ were created for the festival; organizers say each one helps visualize the scope and impact of child sexual abuse.

Each one takes the powerful words and sentiments of participating survivors and turns them into movie titles and taglines.

“It’s hard to describe what it feels like to know that at any given moment someone somewhere is looking at images of me as a child being sexually abused and getting sick gratification from it. It’s like I’m being abused over and over and over again. How can you allow survivors to be revictimized day after day? How can I heal from this while these crimes keep happening to me? Help us build a safer today for the children of tomorrow.”

          – A CSAM Survivor

Those sampled phrases were then matched to a database of impactful but carefully curated images, raising awareness about the issue in a safe and respectful manner. After initial training on the survivor stories, developers say, the algorithm continued to learn the appropriate context and meaning for the taglines and subheads on the posters with human assistance. Essentially, the vetting process – a series of yes/no ‘approvals’ of the posters – was fed back into the machine, allowing it to better predict which combinations would work best from both a content and a layout perspective.
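Neither C3P nor NFA has published the algorithm itself, but the workflow described above amounts to a human-in-the-loop scoring loop: rank candidate tagline-and-image pairings, show the top candidates to a reviewer, and fold the yes/no decisions back into the model. The Python sketch below is purely illustrative – the PairingModel class, its tone/theme features, and the sample data are all hypothetical stand-ins, not the campaign’s actual code.

```python
import random
from dataclasses import dataclass, field


@dataclass
class PairingModel:
    """Toy stand-in for the approval-prediction step described above.

    Tracks a running approval rate for each (tagline tone, image theme)
    combination; nothing about the real system's features or learning
    method has been published, so this is purely illustrative.
    """
    counts: dict = field(default_factory=dict)  # (tone, theme) -> (approved, shown)

    def score(self, tone: str, theme: str) -> float:
        approved, shown = self.counts.get((tone, theme), (1, 2))  # weak prior
        return approved / shown

    def update(self, tone: str, theme: str, approved: bool) -> None:
        a, s = self.counts.get((tone, theme), (1, 2))
        self.counts[(tone, theme)] = (a + int(approved), s + 1)


def review_round(model, taglines, images, reviewer, k=3):
    """Rank every tagline/image pairing, show the top k to a human reviewer,
    and feed the yes/no decisions back into the model."""
    candidates = [(model.score(t["tone"], img["theme"]), t, img)
                  for t in taglines for img in images]
    candidates.sort(key=lambda c: c[0], reverse=True)
    for _, tagline, image in candidates[:k]:
        approved = reviewer(tagline, image)  # True means the poster was approved
        model.update(tagline["tone"], image["theme"], approved)


if __name__ == "__main__":
    taglines = [{"text": "Abused over and over again", "tone": "testimony"},
                {"text": "Help us build a safer today", "tone": "call-to-action"}]
    images = [{"id": "img_001", "theme": "empty hallway"},
              {"id": "img_002", "theme": "held hands"}]

    model = PairingModel()
    # Simulated reviewer standing in for the yes/no vetting step.
    reviewer = lambda t, img: random.random() < (0.8 if t["tone"] == "call-to-action" else 0.4)
    for _ in range(20):
        review_round(model, taglines, images, reviewer)
    for key in model.counts:
        print(key, "approval score:", round(model.score(*key), 2))
```

The simulated reviewer here stands in for the human vetting step the developers describe; over repeated rounds, combinations that earn more approvals rise toward the top of the ranking.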

# # #

There is no one standard definition for child sexual abuse material (CSAM) or child sexual exploitation and abuse (CSEA) online.

Online child sexual exploitation and abuse encompasses a wide range of behaviours and situations in Canada, from sexual solicitation of a child—with or without a response from the child—to sexual grooming (the trust-building period prior to abuse), to sexual interaction online (cybersex) or offline (meeting in person), to accessing, producing or sharing images related to the abuse of children and youth.

Generally, in the Canadian legal context, the crime of online child sexual exploitation and abuse includes child sexual abuse material, self-generated materials and sexting (often distributed without consent), sextortion, grooming and luring, live child sexual abuse streaming and made-to-order content.

Despite the legal definitions and obvious social harms, C3P reports that at present no Western democracy, Canada included, has laws requiring tech companies or Internet service providers to proactively scan for CSAM.

But that may soon change.

Tech companies in the U.K. could be fined up to $25 million – or ten per cent of their global annual revenue – if they don’t build suitable mechanisms to scan for child sexual abuse material in end-to-end encrypted messages.

Proposed updates to the Online Safety Bill say that British and foreign providers of a “regulated user-to-user service” must report child sexual exploitation and abuse (CSEA) content to the country’s National Crime Agency. The amendments to the legislation indicate that companies must develop software capable of peering into messages, even end-to-end encrypted ones, in order to actively detect and report CSEA content.
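The legislation does not say how such detection should work. The design most often discussed for end-to-end encrypted services is client-side matching, in which the sending device checks an attachment against a list of hashes of known abusive material before the content is encrypted. The Python fragment below is only a conceptual sketch of that idea: the hash list, the function names, and the reporting step are invented for illustration, and real deployments rely on perceptual hashes (so re-encoded copies still match) rather than the exact-match SHA-256 used here.

```python
import hashlib

# Hypothetical, vetted list of hashes of known abusive images, distributed to
# the client by a designated body. SHA-256 is used here only to keep the
# sketch self-contained; real systems use perceptual hashes.
KNOWN_HASHES = {hashlib.sha256(b"placeholder-known-image-bytes").hexdigest()}


def screen_before_encrypt(attachment: bytes) -> bool:
    """Return True if the attachment may be encrypted and sent, False if it
    matches the known-content list and should instead be reported."""
    return hashlib.sha256(attachment).hexdigest() not in KNOWN_HASHES


def send_message(plaintext: bytes, attachment: bytes | None = None) -> None:
    if attachment is not None and not screen_before_encrypt(attachment):
        # Under the proposed rules the provider would report the match to the
        # National Crime Agency rather than simply blocking the message.
        raise RuntimeError("attachment matches known abusive content; flagged for report")
    # ... encrypt end-to-end and transmit as usual (omitted) ...
```

Critics argue, as the Apple episode below shows, that once this kind of pre-encryption check exists it can be pointed at any list of content a government chooses, which is why the proposal remains contested.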

Public awareness and activism along those lines are what C3P and the Unwanted Film Festival are encouraging: visitors to the festival website can sign a petition, adding their names to the demand that tech organizations stop the upload of CSAM.

It’s a bold call, and a controversial proposal.

Barely a year ago, Apple was forced to amend its plans to scan for CSAM on iPhones and iPads that sync to iCloud, due to heavy criticism from privacy and consumer advocacy groups.

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” declared a letter signed by more than 90 human-rights groups.

The U.S. hosts more child sexual abuse content online than any other country in the world, new research has found. That country alone accounted for 30 per cent of the global total of child sexual abuse material URLs at the end of March 2022, according to the Internet Watch Foundation (IWF), a UK-based organization that works to spot and take down abusive content.

Last year, the IWF says it took action to remove a record-breaking 252,000 URLs where images or videos of children suffering sexual abuse were posted.

C3P says that, at current growth rates, reported CSAM could reach upwards of 110 million images before this year is over.

As the Unwanted Film Festival continues, its AI-created posters are being displayed in the five most spoken languages around the world – English, French, Mandarin, Hindi, and Spanish – to reflect this global issue. Additionally, titles were featured in German as the festival screened at the recent G7 conference in Germany, bringing the urgent issue to the international leaders in a position to make meaningful change.

C3P’s plans call for the festival to wrap up in Canada, at the Toronto International Film Festival in the fall.

The national charity has other efforts underway as it seeks to reduce the sexual abuse and exploitation of children, including educational programs, services and resources available to Canadians. C3P operates Cybertip.ca, Canada’s national tipline to report child sexual abuse and exploitation on the Internet, and Project Arachnid, a web platform launched in 2017 designed to detect known images of child sexual abuse material.

When CSAM or harmful-abusive content is detected, a removal request is sent to the hosting provider. Project Arachnid has detected and verified more than 5.4 million images and has issued removal notices to more than 760 electronic service providers worldwide, organizers say.
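C3P has not published Project Arachnid’s internals, but detecting ‘known images’ at web scale generally means hashing crawled media, comparing it against a store of previously verified material, and issuing a removal notice for each match. The following Python sketch is a hypothetical illustration of that pipeline only – the data structures and the verified-hash store are assumptions, and a production system would use perceptual hashing rather than the exact-match SHA-256 shown here.

```python
import hashlib
from dataclasses import dataclass


@dataclass
class RemovalNotice:
    provider: str     # electronic service provider hosting the content
    url: str          # where the crawler found the image
    matched_hash: str


# Hypothetical store of hashes of previously verified abusive images.
VERIFIED_HASHES = {hashlib.sha256(b"verified-sample-bytes").hexdigest(): "case-0001"}


def check_image(provider: str, url: str, image_bytes: bytes):
    """Hash a crawled image; if it matches verified material, return a
    removal notice addressed to the hosting provider."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in VERIFIED_HASHES:
        return RemovalNotice(provider=provider, url=url, matched_hash=digest)
    return None


def process_crawl(results):
    """results: iterable of (provider, url, image_bytes) tuples from a crawler."""
    notices = []
    for provider, url, image_bytes in results:
        notice = check_image(provider, url, image_bytes)
        if notice is not None:
            notices.append(notice)  # in practice: send to the provider and track repeat flags
    return notices
```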

Sadly, nearly half (48 per cent) of the images for which Project Arachnid issued a removal notice had already been flagged to the service provider, pointing to the need for stronger regulation and legislation that can bring action quickly and decisively on behalf of exploited children everywhere.

-30-