This article was published on May 29, 2019

Creepy programmer builds AI algorithm to ‘expose’ adult actresses

Yesterday, Yiqin Fu, a research associate at Yale University, tweeted a thread about a Chinese programmer who claimed to have built an algorithm that identified 100,000 adult actresses by cross-referencing footage from porn videos with social media profile pictures. Using this tool, he hopes to help others check whether their girlfriends have ever acted in pornographic films.

The facial recognition tool reportedly took half a year to build and draws on over 100 terabytes of video data pulled from sites including Pornhub, 91, 1024, sex8, and xvideos. This footage was compared against profile pictures from Facebook, Instagram, TikTok, Weibo, and others.

When the software was first announced, the post drew around 1,000 comments, most expressing excitement about the service with replies like “A true blessing for us credulous programmers,” “When can we use it?,” and “Wtf I love the tech future now.”

On the thread, Fu noted that the most up-voted comment asked whether the OP planned to identify the men in porn videos as well, to which he replied that he’s open to the idea. But for legal reasons, he said he may have to “anonymize the data” before letting people query the database.

This isn’t the first time someone has used AI to identify faces in porn. In 2017, Pornhub announced that it was using machine learning and facial recognition to detect over 10,000 porn stars across the site in an effort to make it easier for users to find content they like. At the time, Motherboard argued the development was a privacy nightmare waiting to happen.

But unlike Pornhub’s effort, the intent here is far more troubling. Porn stars often rely on pseudonyms to keep their personal lives separate from their stage personas, and cross-referencing porn videos with social media content could seriously undermine that boundary.

The programmer who built the tool was also asked whether he knew what sort of legal jeopardy he could be in. He claimed everything was legal because he hasn’t shared any data or opened the database to outside queries, and because sex work is currently legal in Germany, where he’s based.

While this technology has the potential to find victims of human trafficking or other forms of sexual exploitation, that’s not the intent here. Rather, it’s a weapon for shaming women and stripping them of their privacy. One user on Fu’s thread tweeted that it won’t be long before people abuse the service to find porn stars who look similar to people they know in real life. That, combined with the sophistication of AI-generated ‘deepfake’ videos, proves the future is truly a horrific place to be a woman.
