The AFP Wants Pics of Younger You to Help Train Its Child Abuse-Thwarting Algorithm

Monash University has partnered with the Australian Federal Police (AFP) to help combat child abuse. The pair are now asking for the public's help, and it's a cause most of us would happily get behind. The catch? They want you to send in pictures of yourself as a child to help train their algorithm.

On the surface this seems very WTF, but it actually makes quite a lot of sense.

Monash is calling this a “world first ethically-sourced and managed image bank for research to combat child exploitation”.

They’re asking people aged 18 and above to contribute photographs of themselves as children. The pictures will be used to train artificial intelligence (AI) models to recognise the presence of children in ‘safe’ situations. Those models will then help identify ‘unsafe’ situations and potentially flag child exploitation material.

The project, called the My Pictures Matter crowdsourcing campaign, is an initiative of the AiLECS Lab, a collaboration between Monash University’s Faculty of Information Technology and the AFP. The AFP already develops AI tech that aids law enforcement in a number of ways.

“To develop AI that can identify exploitative images, we need a very large number of children’s photographs in everyday ‘safe’ contexts that can train and evaluate the AI models intended to combat child exploitation,” AiLECS Lab co-director Associate Professor Campbell Wilson said.

“But sourcing these images from the internet is problematic when there is no way of knowing if the children in those pictures have actually consented for their photographs to be uploaded or used for research.”

Wilson said that by obtaining photographs from adults through informed consent, the team can build technologies that are ethically accountable and transparent. Which makes sense: models trained on images of people are often fed pictures scraped off the internet, without documented consent for their use.

Monash said “comprehensive strategies” have been developed to ensure the pics are stored safely and that privacy is maintained.

People who have contributed photos can choose to get details and updates about each stage of the research. They can also opt to change use permissions or revoke their research consent and withdraw images from the database.

Here’s what Monash said of the initiative:

Machine learning tools developed with the help of My Pictures Matter campaign will support a major ongoing AiLECS initiative to counter online child exploitation through technologies designed to: better protect survivors of abuse from ongoing harm; assist in identifying and prosecuting perpetrators; and minimise repeated exposure of highly distressing content for the AFP and other law enforcement officers.

In 2021, the AFP-led Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child exploitation and each report can contain large volumes of images and videos of children being sexually assaulted or exploited for the gratification of offenders.

By the end of 2022, the researchers are aiming to have a database of at least 100,000 ethically-sourced images for training the AI algorithm.

Want to upload your pics? Head over to the My Pictures Matter campaign website.

If you or someone you care about needs support, please call Lifeline Australia on 13 11 14. If life is in danger, call 000.

