
91大黄鸭 tech & RCMP tackle child exploitation

A tech startup out of 91大黄鸭 has joined forces with the RCMP for the Protecting Innocence Hackathon.

The RCMP is working with Two Hat Security to identify new online child exploitation images and rescue at-risk children.

The internet holds a vast collection of sexually explicit photos of children that have been nearly impossible for the RCMP to find, and they are looking to change that.

To do this, the RCMP will now work alongside Two Hat Security and researchers from the University of Manitoba, using the company's artificial intelligence technology.

"For every single one of our files, there's a child at the end of it," said Cpl. Dawn Morris-Little, an investigator at the RCMP-led National Child Exploitation Coordination Centre (NCECC) in Ottawa.

"Images that look homemade or images that are unknown, those take priority because you don't know when it was created, and those children could still be at risk."

The artificial intelligence technology, called computer vision, mimics human vision by using algorithms to scan unknown photos and pick out the ones that have a high probability of being child exploitation.

"What would take weeks for an investigator would take the algorithm minutes or hours to scan," explained Brad Leitch, head of product development at Two Hat Security.

"The algorithm can eliminate the photos of trees and doughnuts and Eiffel Towers pretty successfully and put those high-probability, exploitative images at the top of the list so we can identify victims and make prosecutions more quickly."

To keep moving forward, the Protecting Innocence Hackathon is taking place July 6 and 7 in Vancouver. Sponsored by the RCMP, Microsoft, Magnet and Two Hat Security, the event is an attempt to build a bridge between three diverse disciplines, law enforcement, academia and the technology sector, for the greater good.

The goal is to work together to build technology and global policy that helps stop online child exploitation.

"We are hopeful that by encouraging teamwork and partnerships across these three vital industries, we will come closer to ridding the internet of online child exploitation," reads a Two Hat release.

Since 2011, the RCMP has used software called PhotoDNA to help identify known and documented explicit photos.

The RCMP reports that PhotoDNA works by converting photos into a hash code, which is like a unique fingerprint for each image. That hash code is added to a database, and if it's ever found again anywhere in the world, online or on a hard drive, PhotoDNA will flag it.
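PhotoDNA itself is proprietary and uses a robust perceptual hash that tolerates resizing and re-encoding; purely as an illustration of the fingerprint-and-lookup workflow described above, the sketch below stands in an exact SHA-256 digest, and every name in it is hypothetical:

```python
import hashlib

# Hypothetical sketch only: real PhotoDNA uses a perceptual hash, not an
# exact cryptographic digest. This shows the general workflow: fingerprint
# an image, store the fingerprint, flag any future copy that matches.

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's 'fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

known_hashes: set[str] = set()  # database of documented images

def register(image_bytes: bytes) -> None:
    """Add a documented image's fingerprint to the database."""
    known_hashes.add(fingerprint(image_bytes))

def flag_if_known(image_bytes: bytes) -> bool:
    """True if this exact image has been documented before."""
    return fingerprint(image_bytes) in known_hashes
```

Because only the fingerprint is stored and compared, the database never needs to hold the images themselves, which is part of what makes this approach practical at scale.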

But, with the rise of smartphones and tablets, creating new child exploitation content has never been easier.

In 2016, the NCECC received 27,000 cases, almost double the number reported in 2015. Those cases contained approximately 25 million images of child sexual abuse.

This new content can鈥檛 be identified by PhotoDNA, since it hasn鈥檛 been added to its database yet.

"The numbers are only going up, so we need to be handling these cases in a much smarter way," added Sgt. Arnold Guerin, who works in the technology section of the Canadian Police Centre for Missing and Exploited Children (CPCMEC), which includes the NCECC.

"New technology can provide us with tools to review cases in an automated way, and bubble up to the top the ones that need to be dealt with right away."

Guerin said that often minutes matter in child exploitation investigations.

"If we seize a hard drive that has 28 million photos, investigators need to go through all of them," said Guerin. "But how many are related to children? Can we narrow it down? That's where this project comes in: we can train the algorithm to recognize child exploitation."

The RCMP adds that achieving 100 per cent accuracy with the algorithm isn't the goal; investigators will still have to go through all the material to make sure nothing is missed. The algorithm is meant to prioritize what police look at first, to make sure they're using their time and resources efficiently.
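In triage terms, the step described here is a sort, not a filter: every image stays in the queue, but the classifier's score decides the order of review. A minimal Python sketch, where `score_image` is a hypothetical stand-in for a real computer-vision model:

```python
# Hypothetical sketch of the triage step: nothing is discarded, the
# classifier's score only orders the review queue so the highest-probability
# images are examined first.

def prioritize(images, score_image):
    """Return ALL images, highest model score first; none are dropped."""
    return sorted(images, key=score_image, reverse=True)
```

The design choice matters: because the model only reorders rather than filters, a false negative costs time but never removes an image from human review, which matches the RCMP's stated goal.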

It can also reduce workloads and help protect the health and wellness of investigators.

"We see images that no one wants to see, so maintaining our mental health is a priority," said Morris-Little. "Anything that takes the human element out of cases is going to reduce the risk of mental health injury to an investigator."

Guerin said technology like computer vision can act as a shield, sifting through material before it gets to an investigator.

The computer vision product is still in development, but Guerin hopes the RCMP will be able to use it later this year.

"If I could reduce the amount of toxicity officers have to endure every day, then I'm keeping them as healthy as possible, while also keeping more kids safe," he added.


 


carmen.weld@bpdigital.ca
