Microsoft has created an automated system to detect pedophiles online

Microsoft has created an automated system to detect sexual predators who attempt to groom children over the Internet. The tool, codenamed Project Artemis, is designed to detect communication patterns in conversations.

Based on these patterns, the system assigns a rating for the probability that one of the participants is trying to abuse the other. Companies implementing the system can set a threshold (for example, 8 out of 10) above which flagged conversations are sent to a human moderator for review.
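In practice, that workflow amounts to scoring each conversation and routing anything above a configurable threshold to a human reviewer. The Python sketch below is a minimal illustration of that flow only; the `Conversation` type, the `risk_score` stub, and the threshold value are assumptions made for this example, since Microsoft has not published the actual model or API.

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    id: str
    messages: list[str]

def risk_score(conversation: Conversation) -> float:
    """Hypothetical classifier returning a 0-10 risk rating.

    The real Project Artemis model and its features are not public;
    this stub simply stands in for a trained model.
    """
    return 0.0  # placeholder score

# Threshold chosen by the implementing company (e.g. 8 out of 10)
FLAG_THRESHOLD = 8.0

def review_queue(conversations: list[Conversation]) -> list[Conversation]:
    """Route only conversations scoring at or above the threshold
    to human moderators for review."""
    return [c for c in conversations if risk_score(c) >= FLAG_THRESHOLD]
```

The key design point is that the automated score never triggers action on its own: it only decides which conversations a human moderator sees.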

Automated child protection

The tool is available free of charge to companies that offer online chat features, through Thorn, a non-profit organization that develops technology to defend children from sexual abuse. Beyond its preventive value, the system should also give child protection experts more insight into how predators operate over the Internet.

Microsoft has not explained in detail which patterns or words trigger the system's alarms, as doing so could allow sexual predators to adjust their behavior and mask their activities.

However, the system also carries risks: it is likely to return many false positives, as automated systems are not yet very good at understanding the meaning and context of language. That means social media companies will need to invest in more human moderators if they are truly committed to tackling the problem.

The system also assumes that messages are not encrypted and that users consent to having their private communications scanned, which adds a further obstacle to adoption.

As of January 10, 2020, licensing and adoption of the technique is handled by Thorn. Companies and services that want to try or adopt the technique can contact Thorn directly at [email protected].