California lawmakers last month advanced about 30 new AI measures aimed at protecting consumers and jobs, one of the largest efforts yet to regulate the new technology.
The bills would impose some of the nation’s toughest restrictions on artificial intelligence, a technology that some technologists warn could eliminate entire categories of jobs, throw elections into chaos with misinformation and pose national security risks. California’s proposals, many of which have won broad support, include rules to prevent AI tools from discriminating in housing and health services, as well as measures to protect intellectual property and jobs.
The California Legislature, which is expected to vote on the proposed laws by Aug. 31, has already helped shape U.S. tech consumer protections. In 2020 the state passed a privacy law that limited the collection of user data, and in 2022 it passed a child safety law that created protections for those under 18.
“As California has seen with respect to privacy, the federal government will not act, so we believe it is critical that California step up and protect our own citizens,” said Rebecca Bauer-Kahan, the Democratic assemblywoman who chairs the State Assembly Privacy and Consumer Protection Committee.
As federal lawmakers drag their feet on AI regulation, state lawmakers have filled the void with a flurry of bills poised to become de facto rules for all Americans. Tech laws like California’s often set a precedent for the nation, largely because it is difficult for companies to comply with a patchwork of laws that differ across state lines, so they tend to apply the strictest state’s standard everywhere.
State lawmakers across the country have proposed nearly 400 new AI laws in recent months, according to the advocacy group TechNet. California leads the states with a total of 50 proposed bills, though that number has shrunk as the legislative session has progressed.
Colorado recently enacted a comprehensive consumer protection law that requires AI companies to use “reasonable care” when developing the technology to avoid discrimination, among other issues. In March, the Tennessee Legislature passed the ELVIS Act (Ensuring Likeness Voice and Image Security Act), which protects musicians from having their voices and likenesses used in AI-generated content without their explicit consent.
It’s easier to pass laws in many states than at the federal level, said Matt Perault, executive director of the Center on Technology Policy at the University of North Carolina at Chapel Hill. Forty states now have “trifecta” governments, in which both houses of the legislature and the governor’s office are controlled by the same party, the most since at least 1991.
“We’re still waiting to see which proposals actually become law, but the sheer number of AI bills introduced in states like California shows how interested lawmakers are in this topic,” he said.
And the state proposals are having a ripple effect globally, said Victoria Espinel, chief executive of the Business Software Alliance, an advocacy group representing large software companies.
“Countries around the world are looking at these drafts for ideas that could influence their decisions on AI laws,” she said.