A recent report by Indicator, a publication focused on exposing digital deception, maps out a growing network of so-called “nudify” and “undress” websites that generate explicit images without consent. These services use artificial intelligence to manipulate user-supplied photographs and produce fabricated nude images at scale. They exploit gaps in policy enforcement by relying on mainstream cloud providers and authentication systems. Millions of visitors flock to these platforms each month despite mounting evidence of abuse.
Researchers tracked 85 such websites and found a troubling pattern: the tools are used to create nonconsensual explicit imagery of women and girls, in some cases amounting to child sexual abuse material. Social media portraits and private photos are stolen, altered, and redistributed, leaving victims with little control over the spread of these intimate images. Kids and teenagers have also become targets, with cyberbullies using the systems to harass classmates.
Indicator’s analysis shows these 85 platforms drew an average of 18.5 million visits per month over the past six months. Traffic varied by region and site popularity, but the aggregate volume highlights persistent demand. Despite public outcry and calls for regulation, visitor numbers have barely dipped.
Most of these platforms monetize through pay-as-you-go credits or subscription tiers. Users purchase blocks of credits or sign up for monthly plans that promise unlimited image transformations. Prices range from a few dollars for basic credit bundles to more expensive subscriptions offering advanced features.
Alexios Mantzarlis, cofounder of Indicator and an online safety researcher, calls the ecosystem a “lucrative business” enabled by Silicon Valley’s lax stance on AI. “They should have ceased providing any and all services to AI nudifiers when it was clear that their only use case was sexual harassment,” Mantzarlis says of major tech providers. He warns that without firmer action, these platforms will continue to embed themselves in the broader adult industry.
The infrastructure powering these sites relies heavily on top-tier technology firms. Indicator’s study found that 62 of the 85 platforms use Amazon Web Services for hosting or Cloudflare for content delivery, while 54 depend on Google’s sign-on system to handle user authentication. Payment processing, domain registration, and other critical services are supplied by mainstream companies. Smaller specialist vendors fill in the remaining gaps, creating a resilient tech stack for operators.
In a statement, Amazon Web Services spokesperson Ryan Walsh says AWS enforces its terms of service and responds promptly to reported violations. “When we receive reports of potential violations of our terms, we act quickly to review and take steps to disable prohibited content,” Walsh says, adding that customers and security teams can flag issues for review. He notes that AWS requires users to comply with applicable laws when hosting content.
Google spokesperson Karl Ryan points out that developers integrating Google’s sign-in must agree to policies banning illegal or harassing content. “Some of these sites violate our terms, and our teams are taking action to address these violations, as well as working on longer-term solutions,” Ryan says. He stresses that restrictions are in place to prevent misuse of authentication tools.
Cloudflare did not reply to requests for comment. Indicator’s report omits the names of individual nudify platforms to avoid driving additional traffic. Security experts say naming the sites could inadvertently promote them, and withholding that publicity remains one of the most effective ways of limiting their reach.
Nudify and undress services trace their roots back to 2019, after the first explicit deepfake videos emerged online. Early tools required manual editing and specialized hardware, but as AI algorithms improved, automated pipelines appeared that could process images in seconds. Open source code and third-party vendors quickly filled in gaps for model hosting, user interface design, and backend integration.
On a typical platform, a user uploads a photo, selects a gender or body type template, and activates an AI model that reconstructs the image as a fake nude. Sites often tout anonymity and privacy safeguards, obscuring the fact that user photos and metadata are stored on third-party servers where they can be misused or leaked.
Indicator researchers, working with investigator Santiago Lakatos, used open source intelligence tools such as BuiltWith and WHOIS lookups to map each platform’s stack. They recorded domain registration records, content delivery routes, hosting clusters, and identity service integrations, yielding a comprehensive view of how mainstream and niche vendors power the ecosystem.
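To make that kind of lookup concrete, here is a minimal, hypothetical sketch of passive domain profiling along the lines the researchers describe; it is not their actual pipeline. It assumes the third-party python-whois and dnspython packages, and the domain shown is a placeholder, not one of the sites in the report.

```python
# Rough sketch: pull a domain's WHOIS registration record and name servers,
# then check whether the name servers point at a known CDN such as Cloudflare.
# Assumes `pip install python-whois dnspython`; "example.com" is a placeholder.
import whois          # python-whois: WHOIS registration lookups
import dns.resolver   # dnspython: DNS queries


def profile_domain(domain: str) -> dict:
    record = whois.whois(domain)                   # registrar, creation date, etc.
    answers = dns.resolver.resolve(domain, "NS")   # authoritative name servers
    nameservers = sorted(str(r.target).rstrip(".").lower() for r in answers)
    return {
        "domain": domain,
        "registrar": record.registrar,
        "created": record.creation_date,
        "nameservers": nameservers,
        # Cloudflare-fronted domains typically delegate to *.ns.cloudflare.com.
        "behind_cloudflare": any("cloudflare" in ns for ns in nameservers),
    }


if __name__ == "__main__":
    print(profile_domain("example.com"))  # placeholder domain
```

In practice this sort of record would be combined with technology-fingerprinting data (the role BuiltWith plays) to attribute hosting, CDN, and sign-in providers across all 85 sites.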
Based on subscription costs, credit-pack pricing, estimated customer conversion rates, and recorded web traffic, analysts estimate that 18 of the sites generated between $2.6 million and $18.4 million over the last six months. Extrapolated, that suggests potential annual revenue of about $36 million. The report’s authors note this figure likely undercounts total earnings, since transactions on external messaging apps or unlisted platforms are not included.
Whistleblower data leaked to the German news outlet Der Spiegel suggests at least one leading platform operates on a multimillion-dollar annual budget. Another nudify service has publicly boasted about making millions in profit. These revelations underscore the financial scale of the operation and the challenge of stemming its growth.
Analysis of the ten most-visited services shows that US-based users top the list for site visits, followed by India, Brazil, Mexico, and Germany. Search engines account for a large share of referrals, steering curious visitors toward AI-generated image tools. Social media posts and community forums have also directed traffic to these platforms over time.
Operators have diversified their referral strategies. Recent months saw an uptick in paid affiliate programs and cross-promotion deals with adult entertainment websites. Some Russian cybercrime groups have even packaged trojanized versions of popular nudify platforms, distributing them to users under the guise of legitimate downloads.
According to 404 Media reporting, at least one website has produced sponsored videos featuring adult performers to attract a wider user base. Affiliates earn commissions on new sign-ups, creating a self-reinforcing marketing loop that extends beyond organic search traffic.
“Our analysis of the nudifiers’ behavior strongly indicates their desire to build and entrench themselves in a niche of the adult industry,” Lakatos says. “They will likely continue to try to intermingle their operations into the adult content space, a trend that needs to be countered by mainstream tech companies and the adult industry as well.”
Media stories over several years have tracked how deepfake abuse rings rely on credit card processors, ad networks, search engine optimization, and cloud infrastructure, but little coordinated action has followed. Individual companies remove illegal content on a complaint basis, but no industrywide effort exists to disrupt the broader network of AI nudification services.
Henry Ajder, an expert on AI and deepfakes who first uncovered growth in the nudification ecosystem in 2020, says, “Since 2019, nudification apps have moved from a handful of low-quality side projects to a cottage industry of professionalized illicit businesses with millions of users. Only when businesses like these who facilitate nudification apps’ ‘perverse customer journey’ take targeted action will we start to see meaningful progress in making these apps harder to access and profit from.”
As scrutiny increased, many platforms adopted single sign-on systems from Google, Apple, and Discord to simplify account creation and reduce friction. Developer accounts linked to these identity providers were disabled after earlier reporting, but site creators introduced an “intermediary site” that “poses as a different URL for the registration,” allowing 54 of the 85 services to continue using Google’s sign-in undetected.
Legal pressure has begun to mount across jurisdictions. San Francisco’s city attorney sued 16 services that generate nonconsensual imagery. Microsoft has identified and warned developers behind celebrity deepfakes. Meta filed a lawsuit against the operator of a nudify app that repeatedly placed ads on its social network. In May, US president Donald Trump signed the Take It Down Act, mandating that tech companies remove nonconsensual intimate images quickly. The UK government has moved to criminalize the creation of explicit deepfakes.
These measures may shutter some services, but experts caution that operators will migrate to darker corners of the internet. “Yes, this stuff will migrate to less regulated corners of the internet—but let it,” Mantzarlis says. “If websites are harder to discover, access, and use, their audience and revenue will shrink. Unfortunately, this toxic gift of the generative AI era cannot be returned. But it can certainly be drastically reduced in scope.”