Apple removes AI apps that "undress" people in photos

Apple has recently taken action against several AI-powered applications that can digitally remove a person's clothing from a photo with little effort. These apps have raised widespread concern because they are designed to generate nonconsensual nude images, with teenagers among those most affected.

The move follows an investigation by 404 Media that highlighted the problematic nature of these AI imaging applications. The investigation uncovered a growing trend of such apps being advertised on Instagram with claims like "free undressing for any girl." The ads typically direct users to the App Store, where the apps are listed as "art creation tools."

Apple responded quickly: after receiving direct links to the apps and the ads promoting them, the company removed the violating apps from the App Store to prevent the technology from being misused to create harmful content.

However, this is only the beginning of the battle against harmful AI-generated content. AI imaging tools have already been used to create fake nude images of teenagers at schools across the US and Europe. Platforms like Meta have taken steps to remove such ads, but effectively controlling this content remains a challenge.

While Apple's actions are a step in the right direction, they highlight the ongoing challenges of digital content moderation. Companies like Google have also faced scrutiny for directing users to sites known for sexually explicit deepfake content involving celebrities and public figures.

Stopping this trend in its tracks is essential to protecting the safety and privacy of users on social networks. Apple knows this is not an easy fight: removed apps can quickly be replaced by new nude-image-generating ones.
