The "Genius of Connecticut" statue under the dome of the state Capitol. Yehyun Kim / ctmirror.org

Connecticut jumped into the fast-changing world of artificial intelligence Thursday with the Senate’s unanimous passage of a bill that would place the state among the first to begin setting standards for the use of AI tools.

Sen. James Maroney, D-Milford, said the legislation is the product of a working group whose labors shifted from back burner to urgent with the release of an easy-to-use AI tool, ChatGPT.

As described by MIT Technology Review, the chatbot developed by OpenAI of San Francisco “exploded into the mainstream almost overnight,” albeit after decades of research. Microsoft is building it into its office software and its search engine, Bing. Google is rushing to compete.

“ChatGPT is the fastest adopted online service,” Maroney said. “In just two months, it garnered 100 million users. It took TikTok nine months. And then before that, it took Instagram, I think it was two and a half years to get to 100 million users.”

As co-chair of the legislature’s General Law Committee, Maroney led the successful bipartisan effort last year to create a law protecting the privacy of consumer data, filling a void created by congressional inaction.

“We knew that we wanted to look at AI. We knew it was coming,” Maroney said.

They just didn’t know how soon it would arrive.

Senate Bill 1103 sets deadlines for the state to make an inventory of all AI tools used by state agencies and set “policies and procedures concerning the development, procurement, implementation, utilization and ongoing assessment of systems that employ artificial intelligence.”

It also would create a permanent working group appointed by legislative leaders and the governor, drawn from companies that develop and use AI, academics with expertise in technology and public policy, and members of the Connecticut Academy of Science and Engineering.

The panel would be tasked with developing an artificial intelligence bill of rights and recommending best practices for “the ethical and equitable use of artificial intelligence” in state government and the regulation of AI in the private sector.

The data privacy law passed last year was the product of negotiations with industry groups initially opposed to the evolving state-by-state approach and merchants fearful of liability for the actions of web designers or payment processors.

This year, the committee received expressions of support and some concern about AI standards.

The job search company Indeed urged lawmakers to use caution in assessing AI, noting it uses algorithms to match employers and employees. The company suggested the bill target only “final decision” systems.

“This is the highest risk decision-making system, which has the power to make significant decisions without any human involvement,” Indeed wrote to the General Law Committee.

Jess Zaccagnino, the policy counsel for ACLU Connecticut, urged passage in written public hearing testimony.

“Algorithms and artificial intelligence can perpetuate racial bias and inequity and deeply change how people interact with the government,” Zaccagnino wrote.

The bill was rewritten after talks with stakeholders, Maroney told colleagues.

“We have done some negotiation on the underlying bill to make sure that we were able to actually implement the bill. And the goal is looking at the safety of artificial intelligence and more transparency,” Maroney said.

“There are lots of questions: Are we using it? Are we not using it?” he said. “And I always feel that sunlight disinfects. And so this will make it clear where we are using it and that we’re testing it before it’s used.”

After the debate, Maroney said Connecticut cannot afford to wait.

“I think the argument you’ll hear to not do something now is that, ‘It’s too new.’ We can’t prevent this technology from coming out,” Maroney said. “But then if we wait a few years, the argument’s going to be, ‘It’s too established. You can’t change the rules of the road on this now.’”

Maroney said the intent is to understand and regulate AI, not try to stop its development or use. One use of AI is to manage, analyze and use vast data sets, something likely to be adopted by health care, insurance and other Connecticut industries, he said.

“We can definitely be a leader in that space with the insurance, commercial research and the excellent hospitals that we have here,” Maroney said. “We just want to make sure when we’re doing it, we’re doing it safely. And [that] we’re doing proper testing.”

The bill now goes to the House. The administration of Gov. Ned Lamont has expressed support.

Mark is the Capitol Bureau Chief and a co-founder of CT Mirror. He is a frequent contributor to WNPR, a former state politics writer for The Hartford Courant and Journal Inquirer, and contributor for The New York Times.