A Chinese programmer based in Germany created software that uses face-recognition technology to identify women who had appeared in pornographic videos.
The project first came to light on the Chinese social network Weibo. The Twitter user @yiqinfu then tweeted: “A Germany-based Chinese programmer said he and some friends have identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films.”
The project reportedly took nearly half a year to complete. The videos were collected from the websites 1024, 91, sex8, PornHub, and xvideos, amounting to more than 100 terabytes of data.
The faces appearing in these videos were compared with profile pictures from popular social media platforms such as Facebook, Instagram, TikTok, and Weibo.
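The programmer never released his code, so the details of his pipeline are unknown. As a rough illustration of the underlying technique (comparing face embeddings from two photos), here is a minimal sketch using the open-source Python `face_recognition` library; the file names are hypothetical and this is not the programmer's actual implementation.

```python
# Minimal sketch of face-embedding comparison using the open-source
# `face_recognition` library. File names below are hypothetical.
import face_recognition

# Load two images: a frame grabbed from a video and a profile picture.
frame = face_recognition.load_image_file("video_frame.jpg")
profile = face_recognition.load_image_file("profile_picture.jpg")

# Compute 128-dimensional face embeddings for the faces found in each image.
frame_encodings = face_recognition.face_encodings(frame)
profile_encodings = face_recognition.face_encodings(profile)

if frame_encodings and profile_encodings:
    # Compare the first detected face in each image; a smaller distance
    # means the two faces are more likely to belong to the same person.
    distance = face_recognition.face_distance([frame_encodings[0]],
                                              profile_encodings[0])[0]
    match = distance < 0.6  # 0.6 is the library's default tolerance
    print(f"distance={distance:.3f}, same person? {match}")
else:
    print("No face detected in one of the images.")
```

A system like the one described would run this kind of comparison at scale, matching every face extracted from video frames against a large index of scraped profile pictures, which is precisely what makes it so invasive.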
The coder deleted the project and all of its data after learning that it violated European privacy law.
However, there is no way to verify that a similar program matching women’s social-media photos with images from porn sites does not exist elsewhere in the world.
According to the programmer, what he did “was legal because 1) he hasn't shared any data, 2) he hasn't opened up the database to outside queries, and 3) sex work is legal in Germany, where he's based.”
But this incident has made clear that a program like this is possible and could have awful consequences. “It’s going to kill people,” says Carrie A. Goldberg, an attorney who specializes in sexual privacy violations.
“Some of my most viciously harassed clients have been people who did porn, oftentimes one time in their life and sometimes nonconsensually [because] they were duped into it. Their lives have been ruined because there’s this whole culture of incels that for a hobby expose women who’ve done porn and post about them online and dox them.”
The European Union’s GDPR privacy law offers protection against this kind of misuse of personal data, but people living in other places are not as lucky.