About me
ClothOff is a highly controversial AI-powered "nudify" platform that uses deep learning models, including GANs and diffusion algorithms, to digitally remove clothing from uploaded photos. Primarily available via clothoff.net, it offers dedicated apps for Android, iOS, and macOS. Its tools include hyper-realistic nude image generation, custom undress videos with realistic motion and effects, face swaps (including specialized porn variants), multi-uploads, adjustable body parameters (such as breast and butt size), sex poses and sets, queue skipping, and an API for adult content creation.
The service provides free trials for basic photo and video undressing, with premium features unlocked through one-time purchases of VIP Coins (no recurring subscriptions) for superior quality, faster processing, and advanced options. It markets itself as a leading "pocket porn studio" while claiming stringent privacy protections: no data storage, automatic deletion of uploads, no distribution without consent, and technical safeguards that it says make it impossible to process images of minors (with automatic account bans for attempts). ClothOff explicitly prohibits non-consensual use, illegal activities, and content involving anyone under 18, and states that it partners with Asulabel to donate funds toward supporting victims of AI abuse.
Despite these assertions, ClothOff has faced severe ethical condemnation and legal scrutiny for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A major federal lawsuit in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd., the platform's operator) alleges it facilitated the creation and distribution of hyper-realistic fake nudes of a minor from her social media photos, invoking the TAKE IT DOWN Act to demand image removals, data destruction, AI training prohibitions, damages, and potential shutdown. Supported by Yale Law clinics, the case underscores harms including bullying, harassment, and lasting emotional distress.
Investigative reports from Der Spiegel, Ars Technica, Bellingcat, The Guardian, and others link the platform's operations to the former Soviet Union (including Belarus), document its acquisition of numerous rival nudify services, and highlight its role in widespread abuse cases worldwide, particularly in schools. Despite blocks in countries such as the UK and Italy, and the removal of its Telegram bot, the platform attracts millions of monthly users and continues operating amid intensifying calls for comprehensive AI regulation. ClothOff denies liability for user misconduct.