AI applications for "undressing" women in photos. Growing popularity and legal concerns


Graphika, a company that analyzes traffic and trends on social media, has conducted a study showing that websites and applications promising to "undress" women in photos are attracting growing interest among internet users.
Of course, services of this kind are unethical, but there are no adequate legal regulations that would directly ban them. The law has not kept pace with artificial intelligence and its successive applications. In September this year alone, according to Graphika, such sites were visited by 24 million internet users from around the world.
Applications for “undressing”. How do they work?
Applications in the segment referred to as "nudify" alter photos, and sometimes also videos, of clothed people to generate fake nude images. The results are often very realistic, but the content is always fabricated. Most applications and services of this kind work only on pictures of women.
Graphika analyzed more than 30 websites offering such services. The researchers concluded that the number of links advertising these sites has increased by 2,400 percent since the beginning of the year, especially on Reddit and on the X platform (formerly Twitter). Such links can be found in many other places as well, but there they are far more likely to be noticed and removed by moderators or the appropriate algorithms.
Beyond websites, there are also many Telegram groups that provide access to undressing tools. Graphika reports that 53 such groups have a combined total of at least a million users.
The use of AI to generate nude photos or pornography without consent is a disturbing trend that has grown even more serious with the popularity of artificial intelligence. Deepfake content depicting celebrities naked has appeared in recent years, but the threat is now becoming far more widespread. The problem no longer affects only celebrities: in November, reports emerged that high school students in New Jersey were confronted with nude photos of themselves that turned out to be fake.
The FBI warns about photo manipulation
In June, the FBI issued a warning about the rise in photo manipulation for sextortion (sexual extortion). "Malicious actors use content manipulation technologies and services to exploit photos and videos – typically captured from an individual's social media account, open internet, or requested from the victim – into sexually themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites," the FBI wrote on its website. Such photos are sometimes used in sextortion schemes to extort a ransom from the person depicted.
Unfortunately, AI programs are also used to "undress" minors. In September, AI-generated nude photos of more than 20 young girls were discovered in Spain. Most were created from fully clothed pictures downloaded from the girls' Instagram accounts. After the photos were modified with an application based on "Clothoff" technology, the fake nudes were shared in WhatsApp groups.
The ability to "undress" photos of celebrities, classmates, strangers on the bus, managers, colleagues, and children is within a few clicks. There is currently no law prohibiting the creation of such images, although producing them of minors is forbidden.
When it comes to photos of adults, it seems that these applications and the images they produce remain legal. According to Time, TikTok and Meta have blocked the keyword "undress" to limit access to AI programs of this kind. Google has also removed some undressing applications and pages.
The dark side of artificial intelligence
Psychotherapist Lisa Sanfilippo, who specializes in sexual trauma, told Business Insider India that creating fake nude images "is a serious violation of people's privacy that can cause victims intense trauma."
She stated that, for the victim, seeing such images can be deeply destabilizing and even traumatic, and emphasized that there is no way to consent to such technology. "It is simply abuse when someone takes something from another person that was never given to them," Sanfilippo summed up.
The use of AI to create fake nude images is becoming widespread and raises serious concerns about privacy, safety, and ethics. The growing availability and sophistication of AI tools make the phenomenon increasingly difficult to combat. Countering this disturbing trend effectively will require a coordinated effort from both technology companies and law enforcement agencies. For now, it appears to be a dark side of artificial intelligence that we cannot yet handle.
Author: Grzegorz Kubera, Business Insider Polska journalist




