Experts are calling for regulation and awareness as fake nudes become the latest AI nightmare.

Dozens of schoolgirls in the small Spanish town of Almendralejo were horrified to learn that fake nude images of them, generated with an artificial intelligence-powered “undress” app, had been shared and spread across their classmates’ phones.

Another incident that made headlines occurred at a high school in New Jersey, where a student used artificial intelligence to create deepfake sexual images of his female classmates.

Though they took place thousands of miles apart, the two cases happened at around the same time and were linked by the same abuse: the misuse of artificial intelligence to create deepfakes of unsuspecting victims. According to a study by social media analytics company Graphika, “undressing” websites drew 24 million visitors in September 2023 alone. Regrettably, search engines like Google have imposed no restrictions on the hundreds of such websites that have surfaced in recent months.

The majority of these websites (and, occasionally, mobile applications released as Android Application Packages, or APKs, downloadable from outside the Google Play Store) generate nudes from any image users upload; some of the results are realistic enough that it is difficult to tell they are fake. Deepfakes are not a new problem, but modern AI tools now make it far easier for anyone to create AI-generated nude photos of everyday people. The ease with which these tools can be used puts everyone at risk, but women and children are particularly vulnerable, since they are the most common targets of deepfake AI porn.

 

“Since it’s not being done by hand, there is no sensitivity. Technically, all you’re doing is utilizing the technology, so all the feelings associated with committing the act are gone. I believe these [AI tools] are making it easy to commit a crime,” psychologist Anjali Mahalke said in an interview with IndianExpress.com. When individuals produce such pornographic content or break the law, she explained, a conflict is usually taking place beneath the surface: a trauma of some kind, or guilt, and that shame turns into a wound akin to narcissism. “There’s no regret, guilt, or any other negative spectrum of emotions in the brain,” she added.

Even more concerning is that these websites operate as digital products, marketed to attract users and then monetized through add-on services. Many of the websites that create deepfake nude images with AI follow a freemium business model: users can generate pictures for free at first, but to access advanced features (such as body and age customization) they must purchase additional credits through well-known payment processors like PayPal or cryptocurrency platforms like Coinbase.

These websites claim their content is intended only for “entertainment purposes and not to insult anyone.” Mahalke pushed back on that claim: “99% of the time, deepfake content features women, and 90% of the content is about pornography.” The prevalence of women as the subjects of this material, she said, is really alarming.

Even though everyone can benefit from AI technologies, not every part of this new technology should be made so freely available, particularly where misuse is evident. Because all it takes to alter someone’s image is a single photo of them and a website or smartphone application, the misuse of AI to produce deepfake nude photographs is growing more prevalent. Any picture made with a so-called “nudify tool” can be illegal and violate someone else’s privacy and dignity. Remarkably, some of the websites themselves describe their services as “non-consensual intimate imagery.”

Ultimately, some organization or individual developed this AI with a particular goal in mind, embodied in the algorithm the AI runs and the large language models (LLMs) it draws on. That training content includes learning modules that enable the AI to carry out activities like photo morphing: the AI consumes information in order to fulfill its intended role. Responsibility for what happens therefore lies with whoever wrote the algorithm, supplied the LLMs to the AI, or otherwise participated, according to Malhotra.

Fake nudes were already common before AI images gained popularity, but creating one no longer requires any particular skill or determination. Around the world, even children are using these websites recreationally, and they often end up producing pictures that can be used to intimidate or harass someone. Typically made from photos lifted from social media posts, these fakes can end up circulating on the same platforms as genuine images, with terrible consequences. Social media platforms are largely unable to identify such content, which compounds their broader failure to handle explicit material and makes the situation worse.

According to Mahalke, 98% of these deepfake websites serve a specific purpose; since they are identifiable websites, the government can certainly impose regulatory measures on them. She added that academic institutions should make an effort to educate the public about the risks of this new technology.

It’s time for everyone to recognize how dangerous this new threat is and to pass strict legislation forbidding the creation and distribution of AI-generated nudes without consent. Watermarking and labeling AI-generated content would help distinguish real content from software-generated content. Major tech companies like Google and Meta also have a responsibility to integrate deepfake detectors into their platforms to stop users from uploading such pornographic images and videos. Legal experts, meanwhile, are calling both for the regulation of AI and for rules to safeguard people whose images or videos are used to create sexually explicit content posted online.
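To make the labeling idea concrete, here is a minimal Python sketch, assuming the Pillow imaging library, that stamps a PNG with an “ai-generated” provenance tag and reads it back. The tag name, generator string, and filenames are hypothetical, and plain metadata like this is trivially strippable; real provenance schemes (such as cryptographically signed manifests or pixel-level watermarks) are far more robust, which is why experts also want platform-side detectors.

```python
# Minimal sketch: label a PNG as AI-generated via a metadata text chunk.
# Assumes the Pillow library (pip install Pillow); tag names are hypothetical.
from PIL import Image
from PIL.PngImagePlugin import PngInfo


def label_as_ai_generated(in_path: str, out_path: str, generator: str) -> None:
    """Re-save an image with an 'ai-generated' provenance tag embedded."""
    img = Image.open(in_path)
    meta = PngInfo()
    meta.add_text("ai-generated", "true")   # hypothetical tag name
    meta.add_text("generator", generator)   # which model produced the image
    img.save(out_path, pnginfo=meta)


def is_labeled_ai_generated(path: str) -> bool:
    """Return True if the image carries the 'ai-generated' tag."""
    img = Image.open(path)
    # PNG text chunks are exposed via the .text mapping on PngImageFile.
    return getattr(img, "text", {}).get("ai-generated") == "true"


if __name__ == "__main__":
    # Create a placeholder image so the sketch is self-contained.
    Image.new("RGB", (64, 64), "gray").save("generated.png")
    label_as_ai_generated("generated.png", "generated_labeled.png", "example-model-v1")
    print(is_labeled_ai_generated("generated_labeled.png"))  # True
```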

“We are governed by legislation, specifically the POCSO Act. Anyone who disseminates child pornography is punishable under Sections 13 through 16, regardless of how the child is involved in the pornographic act. We have addressed it in child laws, but not in adult laws,” Mahalke said. Nevertheless, she noted that the prosecution rate for child pornography is a pitiful 1%.
