Creepy deepfake AI tool undresses everyday women so users can ‘see them naked’


    A terrifying new use of AI means that people who download special apps can take photos of clothed women – and make them look naked in the snaps.

    The "undressing" apps strip women in the photos naked using artificial intelligence software – and it's already been used to create images of thousands of unsuspecting women.

    Of course, the images won’t show the person’s actual body, but the creepy “deepfake” technology makes a realistic nude body appear below the person’s face.

    That means anyone who doesn’t know the photos are an AI modification could mistake them for the real thing – making these creepy apps a new form of non-consensual pornography.

    Deepfaking has been around for years and has been used to create realistic non-consensual pornography of celebrities and public figures.

    But the latest AI apps make the technology more accessible to everyday users, meaning private individuals could be targeted by complete strangers.

    Especially worrying is the fact that it’s hard to track where and when the images are being made, and where they’re being shared.

    This means that an app user could take a “naked” photo of you as you innocently walk down the street or sit in the park and you wouldn’t know it was out there.

    And the creepy apps are not yet illegal in the UK, despite legislation around revenge porn and gender-based harassment being in place.

    Research has shown that one such website received 38 million hits in the first seven months of 2021 – The Daily Star will not be sharing the name of the website for obvious reasons – and the bio states: "Undress any girls with any dress. Superior than Deepnude. The threat to girls."


    It also claims to "reveal the truth hidden under clothes" and "make all men's dreams come true".

    Back in 2015, “image-based sexual abuse” laws were put in place making it illegal to share "private sexual photographs and films with intent to cause distress."

    But this only applies to images and videos that the victim consented to having taken, so deepfakes and creepy app creations are not covered by them as yet.

    Sophie Mortimer, from the Revenge Porn Helpline, told Glamour: “Unfortunately we are all too aware of nudification apps, though we haven’t really had cases on the Helpline – this may be because victims are unlikely to be aware that such images have been created.

    "I find it deeply disturbing that this technology is being used to degrade and dehumanise women, reducing them simply to body parts with no rights or agency.

    "While deepfakes are actually exempt from the current legislation, they may form part of a course of conduct that could be harassment or perhaps malicious communications, depending on the context.”

    Do you think such apps should be made illegal? Tell us in the comments…

