In another win for post-Elon Musk Twitter (sorry, X), several users have reported seeing ads for websites that claim to use A.I. to remove someone’s clothing from a photo, leaving you in possession of a naked photo of someone without their consent or knowledge. Neat!


The website linked in the ad, which was posted by an account that has since been suspended, “undresses” people for free, but only provides a blurry image until you log in with either your Google or Discord account. From there, you can apply the feature to one more image before you’re asked to buy credits, the cheapest option being a bundle of 20 for $10.


Considering where A.I. technology stands at present, the results are a mixed bag, so one has to wonder what kind of weirdos are so desperate to see people (let’s be honest, women) naked without their consent that they’d be willing to overlook extra appendages like a third boob. More likely, the results will be used to harass or blackmail women, as young TikTokers like Brooke Monk know all too well.


Monk, who is 20, has been dealing with A.I.-generated deepfakes that allegedly depict her naked body circulating on social media platforms for months now, including on accounts that briefly circumvent TikTok’s filters before eventually being taken down. One such account was recently up for at least two days before being suspended, and attempts to search for it now bring up a message from TikTok stating, “This phrase may be associated with behavior or content that violates our guidelines.”


According to a recent report from Bloomberg, these “nudify” apps have soared in popularity, with 24 million people visiting them in September alone. This has led parents of teenage girls victimized by such sites to describe them as part of an “A.I. pandemic” that has seen over 143,000 new deepfake videos posted online this year, again primarily of women and teenage girls.


As a result, advocates are calling on lawmakers to implement safeguards for victims of these sites, but as we know from experience with other internet-based crimes against women, like revenge porn, the law can be incredibly slow to respond, particularly when the technology is developing rapidly and the problem is spiraling out of control.


If only all of that could be fake.