AI Leader Q&A: Vermont Rep. Monique Priestley on moving a ‘Kids Code’ into law

Jan. 6, 2026 — Over the past two years, Vermont Rep. Monique Priestley (D-Bradford) has emerged as one of the leading voices on digital safety and data privacy issues.

In June 2025 the Vermont Age-Appropriate Design Code she sponsored and championed—along with Senators Wendy Harrison, Seth Bongartz, and Patrick Brennan—was signed into law by Gov. Phil Scott. This statewide “Kids Code” measure will require tech companies to implement privacy-by-default and safety-by-design protections for kids online. This means not collecting or selling their data, setting high privacy standards by default, and avoiding manipulative design. The full text of the new law is available here.

In July the bipartisan Future Caucus tapped her to lead its National Task Force on State AI Policy along with co-chair Doug Fiefia, a Republican legislator from Utah.

Priestley represents a rural Vermont district centered around the town of Bradford, where she runs The Space On Main, a nonprofit community workspace she founded in 2017. The Space On Main offers a variety of programming, including business planning and youth STEM. She’s been a tech person from a young age, “taking apart computers and building web sites from age eight or so,” and wiring her high school for Ethernet. Over her career she’s worked in database administration, web development, data migration, and other tech-related fields.

In this Q&A, Rep. Priestley spoke with Transparency Coalition Editorial Lead Bruce Barcott.

A KIDS CODE THAT FOCUSES ON PRODUCT SAFETY, NOT CONTENT

TCAI: The Vermont Kids Code you led this past year implements safety measures for minors in digital services. Can you give us an overview of what the bill actually does?

Rep. Monique Priestley: The way that it differs from some of the other age-appropriate design codes is that it really doesn't focus at all on content. It focuses on product liability and product safety.

So in the same way we regulate cars and other physical products, we’re trying to make sure that we have safe digital products as well.

The reason we didn’t focus on content is that it runs into freedom of speech concerns. It also raises a lot of Section 230 issues.

Any of the bills that focus on content can end up getting challenged in court by software companies. It’s much harder to challenge something that is inherently designed to be addictive and manipulative and basically unsafe. It’s harder to make the case for the actual features themselves being unsafe versus the content that users put into it.

Business & tech lobbies hesitate to oppose a kids’ safety bill

TCAI: What were some of the concerns your colleagues had about the bill initially, and how did you address those concerns?

Rep. Priestley: We’ve been working on a data privacy bill, and that is an issue that is generally very heavily lobbied. It gets everybody worked up in the name of business and innovation.

When it comes to a kids' safety bill, the [business and tech] lobbies aren't very public, because it's a bad look for business to basically admit that they're hurting kids, or to argue that it’s somehow good for business and innovation. So that means we don't see as much pressure on our colleagues either.

If something is in the name of protecting kids, people generally want to pass it no matter what it is. Which introduces a whole different complication: We have to make sure we’re passing something that's responsible and that people are actually paying attention.

Name the specific design features

TCAI: Let’s dive into the word "design" in the title of the bill. You’ve spoken previously about how measures like this can offer businesses the opportunity to build safer products and platforms. Can you tell us more about that?

Rep. Priestley: This bill is basically going after the harms that we were seeing perpetuated against kids—trying to make sure that we're looking at the design of those harmful features and trying to stop them.

For instance: the endless scroll, where a kid is spending hours and hours and hours just swiping through content. So in order to address that, the design feature is going after that endless scroll functionality.

Another feature was looking at things like kids being served up content that might be encouraging somebody to commit suicide or enabling an eating disorder. The design feature there is trying to go after how the algorithm serves up content.

And then there are things like unknown adults contacting kids to sell them drugs or to rape them or kidnap them or engage in sextortion. So the feature there is making it so that kids have a choice of who contacts them, making sure those contacts are approved.

There are a lot of things like that, in addition to really strict privacy protections defining what data is vulnerable and determining how it’s shared.

Parent groups provided critical support

TCAI: When you were moving this bill through the legislature, what was particularly helpful in terms of gathering support?

Rep. Priestley: There were really amazing parent groups from all over the state [expressing support]. This was also a bill that the medical societies started to get behind.

And then public education organizations like the NEA weighed in. That was helpful. We also had really great support from the Electronic Privacy Information Center and other national consumer protection orgs.

Bipartisan support led Gov. Phil Scott to sign the bill

TCAI: You were able to move the bill through the legislature. But then as with any bill in any state, you have to get it signed by the governor. How were you able to engage Gov. Phil Scott on this issue?

Rep. Priestley: This was an interesting one because of the drama that happened around the privacy bill.

[Editor’s note: Rep. Priestley and colleagues passed a state data privacy bill in 2024, but it was vetoed by Gov. Phil Scott. The veto was largely due, said Scott, to the bill’s inclusion of a private right of action. A revised privacy bill was introduced in 2025, but was ultimately tabled and will be re-introduced in 2026.]

We had a private right of action [in the Kids Code bill]. That meant that if a kid is harmed, a parent can sue directly rather than having to go through the Attorney General.

We referenced our underlying Consumer Protection Act from the 1960s, which under the product liability provisions gives people the right to sue a company.

I wasn't sure that the bill would get past the Governor’s desk because of [the inclusion of a private right of action]. But what it came down to is that this was a super bipartisan issue. A lot of Republicans were saying we have to do everything we can to protect kids. Our governor is a Republican, as is the chair of my committee. So that was really helpful.

Looking ahead: leading the Future Caucus AI Policy Taskforce

TCAI: You’re serving as co-chair of the new Future Caucus AI Policy Taskforce, serving with Republican Rep. Doug Fiefia of Utah. What do you hope to accomplish there?

Rep. Priestley: That’s one of my favorite projects right now, working with Doug. He and I could not be more different [on many issues] but what we come together on is wanting to have regulation that encourages innovation and is pro-business, but also has consumer safety and mental health protections.

It’s a group of 12 including Doug and myself. We’re going to have our first meeting in January, where we actually start to dive in. We’re working on an overall primer, an educational piece for not only our task force but for Gen Z and Millennial legislators all over the country.

We’re trying to make it an extremely bipartisan group, and we’re working to make sure we’re hearing from a wide variety of stakeholders. More recently, we’re working with the White House’s Executive Order on AI, talking through all the things that come up with that.
