AI image generators that claim the ability to "undress" celebrities and random women are nothing new — but now, they've been spotted in monetized ads on Instagram. The ad library of Meta, the parent company of Facebook and Instagram, contained several paid posts promoting so-called "nudify" apps, which use AI to make deepfaked nudes out of clothed photos.
In one ad, a photo of Kim Kardashian was shown next to the words "undress any girl for free" and "try it." In another, two AI-generated photos of a young-looking girl sit side by side: one shows her wearing a long-sleeved shirt, while the other appears to show her topless, with the words "any clothing delete" covering her breasts.

Over the past six months, these sorts of apps have gained unfortunate notoriety after being used to generate fake nudes of real, unconsenting people. At the end of last year, students in Washington said they found the "undress" app they used to create fake nudes of their classmates via TikTok advertisements. Why go overseas for a nudify tool to exploit teen girls in your school when you can get them on Instagram?

Many of the ads reporters came across had been taken down from the Meta Ad Library by the time they checked, while others were only struck down once a company spokesperson was alerted to their existence. "Meta does not allow ads that contain adult content," the spokesperson told the website, "and when we identify violating ads we work quickly to remove them, as we're doing here." Some ads reportedly remained after the story was published, suggesting that, as with so many content enforcement efforts, Meta is taking a whack-a-mole approach to banning these sorts of ads even as others crop up.

Earlier reporting found that Google was readily directing searchers to deepfake porn that featured not only celebrities spoofed into nude photos, but also lawmakers, influencers, and other public figures who didn't consent to such usage of their images. In a cursory search, Google still showed "MrDeepFakes," the biggest purveyor of such content, first when searching for "deepfake porn."

Reporters also found that one of the apps in question prompted users to pay a $30 subscription fee to access its NSFW capabilities and, ultimately, was not able to generate nude images. Still, it's terrifying that such things are being advertised on Instagram at all, especially considering that 50 percent of teens,
Similar News: You can also read news stories similar to this one that we have collected from other news sources.
How to turn off Instagram's automatic setting limiting political content on social media platform
Here's how to get around Instagram's political curbs in just a few steps.
Minor league baseball players charged in San Diego with illegally profiting off fast food merger
Valhalla High's Jordan Qsar and Valley Center High's Austin Bernard are accused of insider trading before Jack in the Box's purchase of Del Taco.
Center for Countering Digital Hate accuses X of profiting off antisemitism
Elon Musk's X has encouraged and profited from the spread of hateful posts in the wake of the Israel-Hamas war, according to the Center for Countering Digital Hate, a nonprofit research group.
