
(L to R) AiLECS co-director and AFP Leading Senior Constable Dr Janis Dalins, Monash Faculty of Information Technology dean Professor Ann Nicholson, Monash interim vice-chancellor Professor Susan Elliott, and AiLECS co-director Associate Professor Campbell Wilson. Picture: supplied

Sensitive data

Monash partners with AFP to expand AI research program
By Erin Nixon

Monash University has received a $4.4m funding boost to enhance its partnership with the Australian Federal Police (AFP) and support its groundbreaking research into AI technology for combating digital crime.

Funds from the Commonwealth Confiscated Assets Account will expand the research capacity of Monash's AI for Law Enforcement and Community Safety Lab (AiLECS Lab) to disrupt digital exploitation networks.
The flagship four-year program, co-directed by Monash associate professor Campbell Wilson and AFP leading senior constable Dr Janis Dalins, will support ongoing operations targeting child exploitation networks, illegal firearm trafficking, and harmful deepfake disinformation.
AiLECS Lab will become a comprehensive research centre using machine learning, natural language processing, and network analysis to pioneer research in a law enforcement context.
What is AiLECS?

AiLECS Lab began as a research lab in 2019 and has an ongoing partnership with the AFP to develop sophisticated algorithms that could detect digital content related to child exploitation.
Professor Jon Rouse APM joined AiLECS Lab as a Professor of Practice, advancing machine learning technology to combat child sexual exploitation. Considered a pioneer in countering child exploitation, Professor Rouse was awarded the 2018 Champion for Children Award in New York and named Queensland's 2019 Australian of the Year.
AiLECS Lab is developing an algorithm that could learn to recognise components of an image that may indicate the presence of sexual content and the presence of a child; however, ethically sourcing media content to train the algorithm is a challenge.
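The approach described above amounts to scoring two separate signals in the same image: whether a child appears, and whether sexual content is present. The sketch below shows how a generic multi-label image classifier might be set up for that kind of two-signal task; the backbone, loss, and labels are illustrative assumptions for readers curious about the technique, not details of AiLECS Lab's actual model, which are not published in this article.

```python
# Illustrative sketch only: a generic two-signal (multi-label) image classifier.
# The choice of ResNet-18, the loss, and the dummy data are assumptions, not
# AiLECS Lab's published architecture or training pipeline.
import torch
import torch.nn as nn
from torchvision import models

class TwoSignalClassifier(nn.Module):
    """Image backbone with a two-logit head: [child_present, explicit_content]."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)      # hypothetical backbone choice
        backbone.fc = nn.Linear(backbone.fc.in_features, 2)
        self.backbone = backbone

    def forward(self, images):
        return self.backbone(images)                  # raw logits, one per signal

model = TwoSignalClassifier()
criterion = nn.BCEWithLogitsLoss()                    # scores each signal independently

# Dummy batch standing in for ethically sourced training images and their labels.
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([[1., 0.], [0., 0.], [1., 1.], [0., 1.]])

loss = criterion(model(images), labels)
loss.backward()                                       # one step of a standard training loop

# At inference time, a sigmoid turns each logit into a per-signal probability;
# content is flagged for review only when both signals are high, mirroring the
# idea of combining "child present" and "explicit content" detections.
probs = torch.sigmoid(model(images))
```

Treating the two signals as independent outputs of one model, rather than building a single "exploitative content" detector, is one common way to keep each component trainable on data that can be sourced ethically.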
"The material is illegal to possess, and we don't want to expose our team to traumatic images and videos," Professor Wilson said in an exclusive interview with Campus Review.
Other studies have used explicit imagery downloaded from the internet to train image detection algorithms; however, the AiLECS Lab team wasn't comfortable using data sets without direct consent to use the images.
"We're trying to shift the dial on the discussion around where does the data come from for AI algorithms, because a lot of it is sourced from the web," Professor Wilson said.
"It's borderline impossible to get explicit consent from children, so one of our researchers suggested asking adults to contribute images of themselves as children, because adults can consent to the use of their own image as a child.
"We all sort of facepalmed and immediately thought, why hadn't we thought of that before?"
In a landmark effort to ethically source and manage sensitive data sets involving child sexual abuse victim-survivors, AiLECS Lab launched the My Pictures Matter crowdsourcing campaign earlier this year.
"My Pictures Matter is a call to action; join us in the fight to develop technologies to counter child exploitation by contributing your images," Professor Wilson said.
AiLECS Lab now has over 3,000 images to train the algorithm and hopes to grow the database to improve the algorithm's ability to accurately detect exploitative content.
"What we're keen to point out is that it's in no way a facial recognition dataset – we're not using it to try and detect identity – we're using it to train algorithms to understand what a child in general looks like," Professor Wilson said. ■