Senators unveil the TRAIN Act, bipartisan bill to protect creators from unauthorized AI training

Sen. Marsha Blackburn (R-TN), who last month led the effort to remove the AI moratorium from the federal budget bill, joined with three Senate colleagues earlier today to introduce the TRAIN Act, which would protect artists and other creators from the unauthorized use of their work for AI training.

August 5, 2025 — A bipartisan coalition of four U.S. senators earlier today introduced the Transparency and Responsibility for Artificial Intelligence Networks (TRAIN) Act to help creators, musicians, artists, writers, and others access the courts to protect their copyrighted works if and when those works are used to train generative artificial intelligence (AI) models.

The TRAIN Act would allow copyright holders to access the training records used for AI models in order to determine whether their work was used.

The bill was authored by Sen. Marsha Blackburn (R-TN), Sen. Peter Welch (D-VT), Sen. Josh Hawley (R-MO), and Sen. Adam Schiff (D-CA).

“Tennessee is home to a thriving creative community filled with musicians, artists, and creators who must have protections in place against the misuse of their content,” said Sen. Blackburn. “The TRAIN Act would protect creators by allowing them to access the courts to find out if their work is being used to train generative AI models and seek compensation for that misuse.” 

Background on the issue

Musical artists and other creative industry leaders have raised alarms about the use of copyrighted works to train generative AI models, calling out AI developers for using artists’ work without consent or compensation.

The TRAIN Act seeks to solve the “black box” problem by allowing creators to know when and how their works are being used. Few AI companies currently disclose how their models are trained, and nothing in current law requires them to do so.

Sen. Welch said: “This is simple: if your work is used to train AI, there should be a way for you, the copyright holder, to determine that it’s been used by a training model, and you should get compensated if it was. We need to give America’s musicians, artists, and creators a tool to find out when AI companies are using their work to train models without artists’ permission. As AI evolves and gets more embedded into our daily lives, we need to set a higher standard for transparency.” 

What the TRAIN Act would do

The TRAIN Act would promote transparency about when and how copyrighted works are used to train generative AI models by enabling copyright holders to obtain this information through an administrative subpoena.

Modeled on the process already used in internet piracy cases, the bill would provide access to the courts for copyright holders with a good faith belief that their copyrighted material was used. Only the training material containing their copyrighted works would need to be made available.

The bill would also ensure that subpoenas are granted only upon a copyright owner’s sworn declaration that they have a good faith belief their work was used to train the model, and that their purpose is to protect their rights.

Failure to comply with a subpoena would create a rebuttable presumption that the model developer made copies of the copyrighted work.

"AI should be in service to the American people—not the other way around," Sen. Hawley said. "But under current law, Big Tech's AI companies are stealing the works of today's creators as they box out the next generation of creators."

Transparency Coalition reacts

Jai Jaisimha, co-founder of the Transparency Coalition, added the organization’s voice to many others supporting the bill. He said:

“The Transparency Coalition welcomes the introduction of the TRAIN Act which will provide creators and copyright owners additional protection from their copyrighted works being used in AI training without their consent. The Act deftly addresses the need for transparency around AI training inputs and empowers creators to seek redress from the appropriate judicial forum.” 
