Gov. Kathy Hochul wants to make New York a leader in fighting the misleading and unscrupulous use of artificial intelligence.

What You Need To Know

  • Gov. Kathy Hochul revealed detailed legislation that, if signed into law, would give New York the nation’s toughest laws against the misuse of artificial intelligence

  • The regulations would impose criminal penalties, expanding existing statutes to allow jail sentences for the unauthorized use of AI, including impersonation, identity theft and the unlawful distribution or publication of fake images, video or audio

  • Hochul’s State of the State also would put New York’s government at the forefront of AI research through the creation of a consortium of private and public universities called “Empire AI”

She promised New Yorkers that she would launch the state into a leadership role in the artificial intelligence industry.

“AI is already the single most consequential technological and commercial advancement since the invention of the internet. The global AI market was already valued at $150 billion last year and it’s projected to reach $1.3 trillion by 2030,” she said during her 2024 State of the State address on Jan. 9 in Albany. “Whoever dominates the AI industry will dominate the next era of human history!”

This week, she revealed detailed legislation that, if signed into law, would give New York the nation’s toughest laws against the misuse of AI.

The regulations would impose criminal penalties, expanding existing statutes to allow jail sentences for the unauthorized use of AI, including impersonation, identity theft and the unlawful distribution or publication of fake images, video or audio.

“It’s very clear we have a problem with deep fakes, and we’re not very good at discerning if they’re true or not. We’ve seen a number spread like wildfire,” said state Assemblyman Alex Bores of Manhattan.

Bores sponsors separate, existing legislation also aimed at regulating artificial intelligence, but said he backs the other prong of Hochul’s proposal, which would require political campaigns to follow new disclosure rules if AI is used in communications ahead of an election.

“Whether that was the Biden call in New Hampshire, county leader Keith Wright or even outside of politics, what we saw distributed about Taylor Swift,” he said. “The bounds of what is possible here are as wide as anything and the only thing that comes close is nuclear power.”

Daniel Colson heads the Artificial Intelligence Policy Institute, a nonprofit advocacy group founded in 2023.

He said it’s important that New York’s government take the lead now.

“Building government AI capacity in house is very important for government to be able to manage the risks from the technology,” he told NY1 in a Zoom interview Friday.

Colson argues it is important that New York’s laws don’t just penalize abuse resulting from AI applications, but also hold companies liable on the front end, while they are developing the technology.

“You need the models to be tested in a classified environment with compute resources at scale to actually be able to know what the risks are, especially in security and national security issues,” he said.

Colson said the government should push AI research to the forefront, because the technology’s growth is inevitable.

Hochul proposed creating a consortium of private and public universities, called “Empire AI,” aimed at leading the state’s research.

Private tech companies, including Microsoft and OpenAI, are already leading the way.

“That’s where the public is at. They overwhelmingly are concerned about this and want government to lead the way, but mostly because they don’t trust the tech companies to lead the way in a responsible way,” said Colson.

He warned that AI’s growth will soon be out of our control.

“The top three most cited AI scientists of all time over the last year have all flipped, saying we are terrified of the future and we basically regret our careers,” said Colson.

The governor wants the state Legislature to green-light her proposals before the next election cycle, billing them as a top priority.