On May 19, President Trump signed the Take It Down Act into law, which aims to prevent the spread of nonconsensual intimate imagery (NCII). The act addresses both real images and videos and "digital forgeries," or deepfakes, which have been created or manipulated by generative AI and contain a real person's likeness. Targeting harms including revenge porn and explicit deepfakes, Take It Down is officially known as the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. NCII regulation had previously been a patchwork of state laws; the act fills that gap at the federal level. Additional regulation is in progress to address deepfake content of individuals more broadly, notably the No Fakes Act.

RELATED: Generative AI, Celebrity Deepfakes & Digital Replicas: A Special Report

Take It Down makes it a federal crime for any person to knowingly post nonconsensual "intimate visual depictions" of others, with penalties that can include fines and/or up to three years in prison. It also requires covered online platforms to remove reported content upon request within 48 hours, with civil penalties, enforceable by the FTC, for those failing to comply. The notice-and-removal requirements for covered platforms don't take effect until May 19, 2026.

The act has become critically needed as generative AI has scaled the nonconsensual deepfake problem into a crisis with little recourse for victims, who include public figures as well as non-famous people, including children, teens and, disproportionately, women. Among the act's endorsers was SAG-AFTRA, which has consistently fought for regulation to protect its members from AI harms, whether inside or outside the workplace.

RELATED: How Celebrity Reps Are Fighting the Flood of Unauthorized AI Content

Despite the relative newness of generative AI, the bill is long overdue. "It's crazy we've gotten this far into the internet and we're just now making it a law that says if it's you in an explicit situation, you get to say whether that stays up published or not," said Luke Arrigoni, CEO at Loti, a deepfake detection and takedown service that works primarily with public figures.

The law should yield several initial outcomes:

1. Create a more direct path to NCII takedowns on major public platforms: Having a law that specifically addresses NCII makes it easier and faster for victims to get takedowns, as it compels platforms to act quickly on requests. "This is giving victims real recourse and mechanisms to get this stuff off of online platforms," said Josh Weigensberg, IP litigation partner at Pryor Cashman. While the law won't prevent the creation of NCII or its sharing in private online spaces such as email, it does directly combat its spread on the public internet, which for many victims is the primary concern. Most clearly, covered platforms as defined by the act include social media (i.e., a website, service or app that serves the public and "primarily provides a forum for user-generated content").

2. Compel platforms that haven't previously complied: "Take It Down has been really effective for platforms that didn't want to work with us. Most sites have been compliant, even the most nefarious ones, but it's sites like X that we've had to use the Take It Down Act for. Now we have a superpower with them," said Arrigoni.
3. Compel platforms to proactively ban NCII: Multiple 404 Media reports, for example, have detailed how the popular AI model hosting and sharing site Civitai allowed users to AI-generate nonconsensual pornographic images and videos as well as share LoRAs (custom AI models) of real-world celebrities. This past Friday, Civitai changed its policy to ban AI models designed to generate the likeness of real people, citing the Take It Down Act as a precipitating factor.

4. Squash sites intentionally distributing NCII: Just days after Congress passed Take It Down, the prominent deepfake porn site MrDeepFakes, on which the vast majority of content was NCII deepfake videos featuring celebrities, shut down. Staying up without otherwise changing its business would almost certainly have exposed the site to civil and potentially criminal penalties as a covered platform under the act (i.e., its "regular course of trade or business" is to "publish, curate, host or make available content of nonconsensual intimate visual depictions").

Advocacy groups have criticized the law, arguing it will enable overreach, censorship and selective enforcement and undermine user privacy. Some speculate Trump could try to abuse the law to censor negative speech about him, as he hinted in a speech to Congress earlier this year. Other concerns have been raised as well.

Some of these critiques will need to be addressed. For example, the act's language doesn't explicitly list end-to-end encrypted (E2EE) services among excluded platforms, meaning they might indeed be considered covered platforms. It's further possible that some platforms won't comply, at which point it will be up to the FTC to bring cases against them; if the agency fails to take enforcement action, victims would need alternative recourse.

"I expect we'll see some refinement [of the law] in the future for a victim that tries to get nonconsensual images taken down from a site and the site isn't complying. The act is clear that the FTC can step in and help, but we're looking at options the victim would have in that scenario, additional steps he or she could take," said Weigensberg.

Yet free-speech concerns feel abstracted from the reality of NCII online, which the law is narrowly scoped to deter, disincentivize and help real victims remove. "Some of these First Amendment concerns presuppose that bad actors are going to abuse the notice and takedown process that's designed to help victims of deepfake pornography and sharing of other nonconsensual intimate images," said Weigensberg. "There are so many victims of deepfake pornography and other nonconsensual intimate images being shared, and they need tools to be able to fight it. This seems like a very good one."