Minnesota Bans AI Nudification Apps, Sets $500K Fine per Violation
Minnesota becomes the first state to ban AI nudification apps, allowing fines of up to $500,000 per illegal image. The law takes effect in August.
TL;DR: Minnesota has enacted the nation's first ban on AI nudification apps, authorizing fines of up to $500,000 for each AI-generated nude image flagged under the statute.
Context
The law targets software that can automatically remove clothing or add sexual features to photos of real people. It follows a high-profile case in which a Minnesota resident used an app to create nude images of more than 80 acquaintances. Advocacy groups, including RAINN, operator of the National Sexual Assault Hotline, pushed for legislation after victims reported harassment and threats.
Key Facts
- The Minnesota Senate approved the ban with a unanimous 65-0 vote, after the House passed the measure the previous week.
- The attorney general may levy fines of up to $500,000 for each AI-generated nude flagged under the new statute. Collected penalties will fund services for survivors of sexual assault, domestic violence, child abuse, and related crimes.
- Developers of "nudify" platforms face civil liability, punitive damages, and possible blocking of their products within the state.
- The law exempts tools that require a user's technical skill to produce a nude image, preserving legitimate software such as professional editing suites.
- Governor Tim Walz is expected to sign the bill, which would take effect in August.
What It Means
The legislation creates the first state-level legal barrier against AI-driven image manipulation that can be weaponized for non-consensual sexual content. By attaching steep financial penalties, Minnesota aims to deter developers from releasing easy-to-use nudification services and to give victims a clear avenue for recourse. The law also signals to other jurisdictions that regulatory action is possible without broadly restricting legitimate image-editing tools. Watch for how other states respond and whether federal lawmakers consider similar measures.