
This article was published on January 10, 2020

Microsoft’s new message scanning tool can help identify sexual predators in chatrooms

For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

Microsoft released a tool known as Project Artemis yesterday to help catch sexual predators who groom children online. The tool will be available at no cost to companies that offer chat functions.

The company developed the tool in partnership with The Meet Group, Roblox, Kik, and Thorn. A team of researchers worked on a technique to identify potential instances of online child grooming for sexual purposes and flag them for review.

Kik’s involvement is quite intriguing, as the chat app has been accused on multiple occasions of being a haven for predators. Perhaps the company is finally waking up and taking proactive measures for child safety.

The company said in a blog post that the algorithm is trained on historical conversations to evaluate characteristics of a chat and determine the probability that one of the participants is a predator:

Building off the Microsoft patent, the technique is applied to historical text-based chat conversations. It evaluates and “rates” conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review.
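The description above boils down to a rating-and-threshold pipeline: score a conversation, compare the score against a company-chosen cutoff, and queue anything above it for human review. Here is a minimal, hypothetical Python sketch of that flow. The Conversation structure, the grooming_score field, the flag_for_review function, and the 0.8 threshold are all illustrative assumptions, not Microsoft’s actual API; in the real system the score would come from the trained model described in the blog post.

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    conversation_id: str
    messages: list[str]     # text-based chat history
    grooming_score: float   # probability assigned by the trained model (0.0 to 1.0)

def flag_for_review(conversation: Conversation, threshold: float = 0.8) -> bool:
    """Return True if the conversation should be queued for human moderators.

    The threshold is set by each company deploying the technique; the score
    itself would come from a model trained on historical chat conversations.
    """
    return conversation.grooming_score >= threshold

# Usage sketch: conversations above the company-chosen threshold go to human
# moderators for review, not to any automated enforcement action.
if __name__ == "__main__":
    review_queue = []
    for convo in [
        Conversation("a1", ["hi", "how old are you?"], grooming_score=0.91),
        Conversation("b2", ["gg", "nice match"], grooming_score=0.05),
    ]:
        if flag_for_review(convo, threshold=0.8):
            review_queue.append(convo)
    print(f"{len(review_queue)} conversation(s) sent for human review")
```

The key design point, as the blog post notes, is that the rating is only a determiner: the final call on a flagged conversation rests with human moderators, not the algorithm.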

Emily Mulder from the Family Online Safety Institute, a nonprofit dedicated to helping parents keep kids safe online, told NBC News that these kinds of tools can help curb child exploitation online, but that they might not account for cultural and language differences:

There are cultural considerations, language barriers and slang terminology that make it hard to accurately identify grooming. It needs to be married with human moderation.

Zohar Levkovitz, CEO of L1ght, a startup that works on reducing child abuse online, said:

It’s important to add that grooming is just one of the many dangers that younger generations encounter in the digital world. Kids face a never-ending stream of cyberbullying, self-harm encouragement, shaming and more. And if you’re focusing on grooming, you can’t rely simply on a vocabulary based solution.

Child safety has been quite a tough challenge for online companies.

Last year, YouTube disabled comments on thousands of videos featuring minors to keep predators away. In August, Microsoft said it had a dedicated team monitoring conversations on Xbox Live. In December, Instagram started checking users’ ages to make sure everyone on its platform is at least 13 years old.

If you’re a company looking to use the tool, you’ll have to contact Thorn to obtain it.
