Congress passes ‘Take It Down Act’ with overwhelming bipartisan support, fighting back against deepfakes and harmful images
Congress doesn’t agree on much these days, but the House just passed the Take It Down Act with a resounding 409-2 vote. The measure now moves on to the desk of President Trump.
In a rare show of bipartisan agreement, Congress yesterday gave final approval to the Take It Down Act, a measure designed to protect individuals from the harm caused by intimate images published without their consent. The House of Representatives approved the measure on a vote of 409 to 2.
The bill, S. 146, will speed up the removal of a troubling type of online content: non-consensual intimate imagery, or NCII. The bill passed the Senate in February and now moves to the desk of President Trump, who is expected to sign it.
a good first step
The Act makes it illegal to “knowingly publish” or threaten to publish intimate images without a person’s consent, including AI-created deepfakes.
It also requires websites and social media companies to remove such material within 48 hours of notice from a victim, and to take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal lawmakers imposing requirements on internet companies.
The bipartisan measure was introduced by Sen. Ted Cruz (R-Texas) and Sen. Amy Klobuchar (D-Minn.) and gained the support of First Lady Melania Trump, who lobbied for the bill on Capitol Hill in March.
Sen. Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated deepfake of the then-14-year-old.
states moving to protect against ai-powered fraud, identity theft
The Take It Down Act is a significant move on the part of Congress, but it by no means covers all the risks and harms associated with digital likenesses—especially with the onrush of AI technology. At the state level, a number of lawmakers are working to create safeguards to protect individuals against AI-generated imagery, video, and audio.
In California, Senate Bill 11 would codify the inclusion of computer-manipulated or AI-generated images or videos in the state’s right of publicity law and criminal false impersonation statutes.
Sponsored by Sen. Angelique Ashby, the proposal would also require those selling or providing access to technology that manipulates images, video, and audio to warn consumers about their personal liability if they violate state law.
“The rise of artificial intelligence presents great opportunities,” Ashby told the Senate Committee on Public Safety last week. “However, there is a lack of legal framework for addressing deepfakes and nonconsensual images and videos. This leaves individuals vulnerable for various forms of exploitation, identity theft, scams, misinformation, and misrepresentation of their character.”
Ashby noted that harmful AI deepfakes disproportionately affect women and girls. “Of all the deepfake videos, 95 percent are sexually explicit and feature women who did not consent to their creation,” she said. “While these deepfakes often target public figures, easily accessible AI software allows users to create non-consensual content of anyone.”
SB 11 would address the misuse of AI technology by:
Clarifying the existing definition of “likeness” under state law to include AI-generated content
Requiring consumer warnings on AI software
Establishing violations for the misuse of AI technology
Preventing AI-assisted evidence tampering in the courts
Transparency Coalition Co-founder Rob Eleveld testified at last week’s SB 11 hearing, noting that “deepfakes have quickly become one of the most tangible and widespread harms of AI.”
“These deepfakes can be generated by literally anyone with no real knowledge needed,” he added. “Girls in high schools across the country are being scarred in their youth by fake nudes and fake pornographic videos with their likenesses. Today, victims have no legal recourse, and there are no penalties or accountability to discourage these abuses.”