An app created a couple of months back, supposedly for "entertainment",
drew both attention and criticism. It claimed to be able to remove the
clothing from pictures of women to create fake nudes, meaning any woman
could become a victim of revenge porn.
Saying that the world was not ready for it, the app's
developers have now removed the software from the web, writing on
their Twitter feed: "The probability that people will misuse it is
too high, we don't want to make money this way."
They have also ensured that no other versions of it will be
available, withdrawing anyone else's right to use it, and have
confirmed that anyone who purchased the application will receive a
refund.
The program was available in two versions: a free one that
placed large watermarks over the generated pictures, and a paid one that put
a small "fake" stamp in one corner.
Katelyn Bowden,
founder of anti-revenge porn campaign group Badass, called the
application "terrifying".
"Now anyone could find themselves a victim of revenge
porn, without ever having taken a nude photo. This tech should not be available
to the public," she said.
The program reportedly used artificial-intelligence-based
neural networks to strip clothing from images of women and produce
realistic-looking nude shots.
The technology is said to be similar to that used to create
so-called deepfakes, which have been used to produce pornographic clips of celebrities.