
Microsoft launches tool to identify child sexual predators in online chat rooms

The tool, codenamed Project Artemis, is designed to look for patterns of communication used by predators to target children
Image: Two boys using a laptop in a dark room. Cavan Images / Getty Images

Microsoft has developed an automated system to identify when sexual predators are trying to groom children within the chat features of video games and messaging apps, the company announced Wednesday.

The tool, codenamed Project Artemis, is designed to look for patterns of communication used by predators to target children. If these patterns are detected, the system flags the conversation to a content reviewer who can determine whether to contact law enforcement.

Courtney Gregoire, Microsoft’s chief digital safety officer, who oversaw the project, said in a blog post that Artemis was a “significant step forward” but “by no means a panacea.”

“Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems,” she said. “But we are not deterred by the complexity and intricacy of such issues.”

Microsoft has been testing Artemis on Xbox Live and the chat feature of Skype. Starting Jan. 10, it will be licensed for free to other companies through the nonprofit Thorn, which builds tools to prevent the sexual exploitation of children.

The tool comes as technology companies are developing artificial intelligence programs to combat a variety of challenges posed by both the scale and the anonymity of the internet. Facebook has worked on AI to stop revenge porn, while Google has used it to find extremism on YouTube.

Games and apps that are popular with minors have become hunting grounds for sexual predators who often pose as children and try to build rapport with young targets. In October, authorities in New Jersey announced the arrest of 19 people on charges of trying to lure children for sex through social media and chat apps following a sting operation.

Microsoft created Artemis in conjunction with the online children’s game Roblox, messaging app Kik and the Meet Group, which makes dating and friendship apps including Skout, MeetMe and Lovoo. The collaboration started in November 2018 at a Microsoft hackathon focused on child safety.

Artemis builds on an automated system that Microsoft began using in 2015 to identify grooming on Xbox Live. It looks for patterns of keywords and phrases associated with grooming, including sexual interactions as well as manipulation techniques such as encouraging detachment from friends and family.
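Microsoft has not published the pattern lists or matching logic, but the kind of keyword lookup described above can be sketched roughly as follows. The patterns and function name here are invented for illustration, not drawn from the actual system:

```python
import re

# Illustrative stand-ins only; the real pattern lists are not public.
GROOMING_PATTERNS = [
    r"\bour (little )?secret\b",
    r"\bdon'?t tell (your )?(mom|dad|parents|anyone)\b",
    r"\bare you (home )?alone\b",
]

def keyword_hits(message: str) -> int:
    """Count how many known grooming-associated patterns a message matches."""
    return sum(
        bool(re.search(pattern, message, re.IGNORECASE))
        for pattern in GROOMING_PATTERNS
    )
```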

The system analyzes conversations and assigns each an overall score indicating the likelihood that grooming is taking place. If that score is high enough, the conversation is sent to moderators for review. Those employees examine the conversation and decide whether there is an imminent threat that should be referred to law enforcement; if a moderator identifies a request for child sexual exploitation or abuse imagery, the National Center for Missing and Exploited Children is contacted.
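The scoring-and-escalation flow described above can be illustrated with a minimal sketch. The threshold value, data structure and outcome labels are assumptions for illustration, not details Microsoft has disclosed:

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # assumed cutoff; the real value is not public

@dataclass
class Conversation:
    messages: list[str]
    risk_score: float  # likelihood that grooming is taking place

def route(convo: Conversation) -> str:
    """Decide whether a scored conversation goes to human review."""
    if convo.risk_score < REVIEW_THRESHOLD:
        return "no_action"
    # Above the threshold, a human moderator reviews the conversation
    # and chooses the escalation path:
    #   - imminent threat -> referral to law enforcement
    #   - request for abuse imagery -> report to NCMEC
    #   - terms-of-service violation -> account suspended or deactivated
    return "send_to_moderator"
```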

The system will also flag cases that might not meet the threshold of an imminent threat or exploitation but violate the company’s terms of service. In these cases, a user could have their account deactivated or suspended.

The way Artemis has been developed and licensed is similar to PhotoDNA, a technology developed by Microsoft and Dartmouth College professor Hany Farid that helps law enforcement and technology companies find and remove known images of child sexual exploitation. PhotoDNA converts illegal images into a digital signature known as a “hash,” which can be used to find copies of the same image when they are uploaded elsewhere. The technology is used by more than 150 companies and organizations, including Google, Facebook, Twitter and Microsoft.
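PhotoDNA itself is proprietary and considerably more robust, but the underlying idea of reducing an image to a compact signature that survives resizing or re-encoding can be illustrated with a toy average hash (this is not Microsoft’s algorithm):

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink, grayscale, threshold against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")
```

Matching a new upload then reduces to comparing its signature against a database of known hashes, rather than comparing raw images.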

For Artemis, developers and engineers from Microsoft and the partners involved fed historical examples of patterns of grooming they had identified on their platforms into a machine learning model to improve its ability to predict potential grooming scenarios, even if the conversation hadn’t yet become overtly sexual. It is common for grooming to start on one platform before moving to a different platform or a messaging app.
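Microsoft has not described the model architecture, but training a text classifier on labeled historical conversations, as described above, looks broadly like this generic scikit-learn sketch; the example messages and labels are placeholders:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder data; the real system was trained on historical grooming
# examples identified on the partner platforms.
conversations = [
    "what game are you playing today",
    "don't tell your parents we talk",
]
labels = [0, 1]  # 0 = benign, 1 = grooming indicators present

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(conversations, labels)

# predict_proba yields the kind of likelihood score described earlier,
# which a platform would compare against its review threshold.
score = model.predict_proba(["keep this our secret"])[0][1]
```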

Emily Mulder from the Family Online Safety Institute, a nonprofit dedicated to helping parents keep kids safe online, welcomed the tool and noted that it would be useful for unmasking adult predators posing as children online.

“Tools like Project Artemis track verbal patterns, regardless of who you are pretending to be when interacting with a child online,” she said. “These sorts of proactive tools that leverage artificial intelligence are going to be very useful going forward.”

However, she cautioned that AI systems can struggle to identify complex human behavior. “There are cultural considerations, language barriers and slang terminology that make it hard to accurately identify grooming. It needs to be married with human moderation.”