
California’s AI Law Proves Oversight and Innovation Can Coexist

Updated: October 2, 2025

Reading Time: 3 minutes

California Governor Gavin Newsom has signed SB 53 into law, marking the nation’s first comprehensive AI safety and transparency bill. 

The legislation applies to large AI developers and compels them to disclose their safety and security practices, demonstrating that regulation and innovation can coexist.

These disclosures must cover how they prevent catastrophic risks, including cyberattacks on critical infrastructure and the misuse of AI for biological threats.

Importantly, SB 53 also mandates that companies follow through on the safety measures they outline. 

The Office of Emergency Services will oversee compliance, ensuring that safety promises translate into practice. 

This structure makes the law more about accountability than about creating new burdens. Companies already engage in many of the practices SB 53 formalizes.

Regulation and Innovation

Adam Billen, vice president of public policy at Encode AI, argues that the law achieves balance.

Speaking on TechCrunch’s Equity podcast, he explained that policymakers can protect the public while still supporting progress.

He noted that most labs already run safety tests and publish model cards. Yet competition sometimes leads firms to relax standards. 

Billen warned that without oversight, companies may cut corners to outpace rivals. “Bills like this are important,” he said, “because they ensure firms live up to the commitments they make.”

Adam Billen, VP of Public Policy at Encode AI (Source: Encode AI)

Silicon Valley's Reaction

Reaction to SB 53 has been less intense than to SB 1047, a bill Governor Newsom vetoed last year. 

Still, many industry leaders continue to claim that regulation threatens U.S. competitiveness, especially against China. That fear has spurred political spending. 

Venture capital firms such as Andreessen Horowitz, companies like Meta, and high-profile figures, including OpenAI’s Greg Brockman, have invested heavily in super PACs supporting pro-AI candidates. 

Earlier this year, many of these players also backed a proposal for a 10-year moratorium on state-level AI regulation.

Encode AI and more than 200 allied groups opposed the moratorium and succeeded in blocking it. Yet Billen cautions that similar efforts are already underway in Congress.

Exemptions 

Senator Ted Cruz recently introduced the SANDBOX Act, which would allow AI firms to apply for waivers exempting them from federal rules for up to ten years.

In addition, lawmakers are discussing a new bill that would set a single federal AI standard. Supporters may present the measure as a compromise. 

However, Billen warns that it could effectively override state laws. “Narrowly scoped federal legislation could delete federalism for the most important technology of our time,” he said.

China 

Billen acknowledged that the U.S. must take China seriously in the global AI race. However, he argued that blocking state-level bills is not the right response. 

Most of those bills, he noted, focus on issues such as deepfakes, transparency, algorithmic bias, child safety, and government use of AI.

“Are bills like SB 53 the thing that will stop us from beating China? No,” he said. “It is intellectually dishonest to suggest otherwise.”

Instead, he pointed to export controls and chip access as the real tools for maintaining U.S. leadership. 

“If the thing you care about is beating China in the race on AI, and I do care about that, then the things you would push for are export controls in Congress. You would make sure that American companies have the chips. But that’s not what the industry is pushing for.”

AI Chips

Efforts like the Chip Security Act aim to prevent advanced AI chips from being diverted to China by enforcing stricter controls and tracking systems. 

The CHIPS and Science Act already seeks to expand U.S. chip manufacturing and reduce dependence on foreign supply chains.

Yet large technology companies have not always embraced these measures. Nvidia, one of the world’s most important AI chipmakers, has resisted stronger controls, in part because China remains one of its most profitable markets.

Billen suggested that OpenAI’s muted position may stem from its dependence on Nvidia’s hardware and a desire to avoid conflict with a vital supplier.

Policy has also been inconsistent at the federal level. In April 2025, the Trump administration expanded export restrictions on advanced chips to China. 

But only three months later, it allowed Nvidia and AMD to resume limited sales to China in exchange for a 15% share of that revenue.

“You see people on the Hill moving towards bills like the Chip Security Act that would put export controls on China,” Billen said. 

“In the meantime, there’s going to continue to be this propping up of the narrative to kill state bills that are actually quite light-touch.”

A Necessary Process

For Billen, SB 53 is a reminder of how policymaking is supposed to work. He described the process as “very ugly and messy,” yet also as proof that compromise between industry and lawmakers is still possible.

The bill, he argued, reflects a healthy democratic process in which competing interests negotiate, adapt, and eventually agree on safeguards that balance innovation and protection.

“I think SB 53 is one of the best proof points that that can still work,” he said.

Lolade

Contributor & AI Expert