Opinion: Congress Should Outlaw Deepfake Nude Images

The U.S. Senate is considering legislation that would make AI-generated nude photos a federal crime and give victims assurance that such images can be removed quickly from the internet.

(TNS) — Taylor Swift may be the best-known victim whose images have been manipulated with AI into pornography. But creators of such nude "deepfakes" have spread this vile and frightening new form of online abuse across the nation. Washington lawmakers enacted protections earlier this year, but Congress needs to act.

Those targeted — predominantly women and teenage girls — have little recourse in many parts of the country. Even in Swift's case, one such image circulated 47 million times on X, the site formerly known as Twitter, before the platform removed it, according to The Guardian.

Chaired by Sen. Maria Cantwell, D-Wash., the Senate Commerce Committee is considering legislation that would make such deepfakes a federal crime and give victims assurance that the images can be removed quickly from the internet. Congress should act swiftly to enact the bill.

Washington is among at least 14 states that already have penalties for AI-generated deepfakes. Earlier this year, Caroline Mullet, daughter of state Sen. Mark Mullet, bravely testified to this deeply disturbing but increasingly common trend: A classmate took photos of girls at homecoming, ran them through an AI app to make them appear nude, and circulated the fakes. Lawmakers voted unanimously to treat such images on par with child pornography under state possession laws and to create a way for the victims depicted to sue creators and publishers in court.

But the internet does not stop at state lines. Criminalizing the behavior across all 50 states and U.S. territories is the only way to ensure uniformity for all who fall victim to this humiliating new form of online exploitation. Under legislation being considered in Congress, known as the TAKE IT DOWN Act, publishers would also have a duty to remove such images or face penalties from the Federal Trade Commission.

The act, or "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks," would do two things. First, it would make AI-generated fake nudes punishable by prison time — two years if the victim is an adult; three if they're a minor. Second, it would require publishers — whether a small website or a massive social media company like Meta — to remove such imagery within 48 hours of contact by the victim.

Cantwell has the chance to bring the bill before the Senate Commerce Committee. While she's not the prime sponsor, 10 Republicans and seven Democrats have signed on, making the effort deeply bipartisan. The senator is an outspoken champion of digital privacy protections for Americans; she recently told the editorial board she supports the bill.

The editorial board also backs comprehensive digital privacy protections in legislation Cantwell introduced alongside U.S. Rep. Cathy McMorris Rodgers, R-Spokane, earlier this year.

Independent of that legislation, TAKE IT DOWN is also sorely needed. Everyone, from Taylor Swift to teenagers growing up in an age when AI can create such damaging content, deserves that much.

© 2024 The Seattle Times. Distributed by Tribune Content Agency, LLC.