AI is driving a concerning rise in the use of "nudify" apps, and there does not appear to be a reliable way to stop these deepfakes. In at least three reported cases, Snapchat was allegedly used to circulate AI-generated nudes. Meta has said it is building new technology to detect ads for nudify apps and is sharing signals about these apps with other tech companies so they can take action too.
MIT Technology Review's article "The viral AI avatar app Lensa undressed me—without my consent" documented similar nonconsensual behavior in a mainstream consumer AI tool.
As commentators have noted, high-school students increasingly exploit AI "undressing" apps to fabricate nude images of their peers, raising urgent concerns about adolescent privacy, consent, and emotional harm.
Legally, these incidents highlight the difficulty of holding overseas or loosely regulated AI-tool creators accountable.
Farah Nasser was driving home in her Tesla with three young children when a conversation with Grok's AI assistant took an inappropriate turn. Researchers have identified thousands of ads for such apps on Facebook and Instagram, and Meta is finally cracking down on nudify apps that use AI to generate nonconsensual nude and explicit images. "Nudify" apps are software applications that use AI algorithms to manipulate ordinary photos of clothed individuals, typically young girls, and generate fake nude images.
Moreover, not only do such applications exist; there is ample evidence of their real-world use without the consent of the people depicted. Still, despite growing awareness of these applications, effective remedies remain elusive.