“Veto or Not to Veto?”: Playing with Fire and Reaching into the Flames


Veto or Not to Veto?
A Metaphor of Creating Fire and Getting Burned, or Reaching Out Only to Get Bitten.

by Kevin Bihan-Poudec and ChatGPT

The Context:

This blog post addresses the ongoing debate around California Senate Bill 1047, also known as the AI Safety Act or the AI Existential Risk Act. Governor Newsom faces a pivotal decision that could shape the future of AI regulation. Throughout the post, I’ll insert my own thoughts alongside insights from a recent article on Forbes.com.

Will he, or won’t he? That is the burning question that has the tech world, political landscape, and AI experts on edge. As Governor Newsom sits poised to decide the fate of California Senate Bill 1047—the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act—it feels like we’re all reaching out toward a fire, unsure if we’ll get burned. And yet, this decision carries the weight of the future, possibly shaping the survival of humanity as we know it. To sign or veto? That’s not just a question of governance; it’s a question of existence.

Article Breakdown:

Forbes: Will he, or won’t he?

Kevin: That’s the question everyone is asking. Or are they? Is the general public even aware of what’s at stake? The media isn’t covering it enough, and while humanity’s future might hang in the balance, the masses remain largely oblivious.

Forbes: The big question right now is whether Governor Newsom will sign California Senate Bill 1047, titled the “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act.”

This bill, also referred to as the AI Safety Act or the AI Existential Risk Act, has gained significant attention due to its attempt to introduce legally binding AI governance. If passed, it would be the first legislation of its kind.

A lot is at stake.

Kevin: Absolutely. The implications go far beyond job displacement or economic concerns. If unchecked, powerful AI models costing hundreds of millions could become an existential threat to humanity. What if we reach a "point of no return"? Could AI become autonomous? Could it create systems beyond our control? Not signing SB 1047 would be similar to unleashing a beast and simply hoping for the best. It’s like playing with fire and thinking you won’t get burned. (Trust me, as a child, I tried that more than once. Lesson learned.)

Forbes: As it stands, SB 1047 is on Governor Newsom’s desk, and he must make a decision by September 30, 2024. He can either sign it, veto it, or let it pass automatically through a “pocket signature.”

What’s Being Said About SB 1047:

AI pioneers are divided—some praise it as necessary, while others argue it will hinder progress. The debate is heated, with strong arguments on both sides.

Key Points in Favor of SB 1047:

  1. Prevents catastrophic AI risks.

  2. Holds AI creators accountable.

  3. Requires a standard of care in AI development.

  4. Provides for emergency shutdowns or kill switches.

  5. Sets a model for other states and the federal government.

  6. Urgency—action must be taken before AI spirals out of control.

  7. Societal benefits outweigh costs.

  8. Failure to enact could lead to disaster.

Key Points Against SB 1047:

  1. Broad government overreach.

  2. Stifles innovation.

  3. Excessive bureaucracy and compliance burdens.

  4. Impractical and misguided approach.

  5. Legal battles will ensue.

  6. Premature action; more research is needed.

  7. Costs outweigh the benefits.

  8. Enacting it could hinder U.S. competitiveness.

Kevin: Whether you're for or against it, the stakes are unprecedented. The irony is that the decision affecting millions, if not billions, rests with one individual: Governor Newsom.

Forbes: AI Bills Aplenty In California

At a major tech conference, Dreamforce, on September 17, 2024, Governor Newsom signed several pending AI bills, including one criminalizing AI-generated blackmail and another requiring the disclosure of AI-generated content.

Kevin: These are significant steps, but why hasn’t SB 1047 been signed? Could it be that SB 1047 comes with a personal cost for Newsom? His phone must be ringing off the hook with calls and texts from the most powerful people in tech.

Forbes: Governor Newsom remarked at the same event: “What are the demonstrable risks in AI, and what are the hypothetical risks? I can’t solve for everything. What can we solve for?”

Kevin: This logic feels flawed. The whole point is that we don’t know what the hypothetical risks are, so why take the chance? Dismissing risks simply because they’re hypothetical is not a sound approach. If even AI’s creators aren’t sure where the technology will lead, how can we ignore the potential dangers? I wonder if Governor Newsom has seen Oppenheimer recently.

Forbes: Some suggest Newsom’s remarks indicate that SB 1047 might not get signed.

Kevin: Translation: "I’m taking millions from tech CEOs and putting their interests above the safety of Californians and the world."

Why SB 1047 Might Not Make It:

1. AI Luminaries Are Split:

Experts are divided on whether SB 1047 should be enacted. The debate is a toss-up.

2. Major AI Tech Firms Oppose SB 1047:

Kevin: No surprise there. This bill would hinder their plans to profit from AI that can control its users.

3. Nancy Pelosi Opposes SB 1047:

Kevin: It’s unclear if Pelosi fully understands the latest technological trends. Still, she looks great.

Forbes: Pelosi argued that while well-intentioned, SB 1047 is more harmful than helpful.

Kevin: What’s more harmful than unchecked AI itself? Isn’t long-term prosperity more important than short-term innovation?

Forbes: SB 1047 is the elephant in the room, much like the labor crisis caused by AI-driven hiring freezes. No one is talking about it.

Kevin: I’ve personally experienced this. I know firsthand the impact of unregulated AI on people’s lives.

Forbes: Many AI laws have already been enacted, but SB 1047 might not make the cut.

Kevin: It’s like ignoring a meteor hurtling toward Earth with a near-100% chance of impact. Hollywood often knows what’s coming next (Don’t Look Up, anyone?).

Forbes: Some argue Congress should take the lead on AI regulation.

Kevin: More delays only give big tech more control. AI is evolving too quickly for human legislation to keep up.

Forbes: Kicking the can down the road is another option.

Kevin: America used to lead by thinking forward. What happened?

Forbes: If SB 1047 is vetoed, it doesn’t stop future AI legislation. We might see a similar bill introduced after the November elections.

Kevin: If Trump is elected, I doubt AI regulation will be a priority. The tech CEOs will lobby even harder to avoid restrictions.

Governor Newsom’s Signals:

Forbes: Newsom’s remarks at Dreamforce seemed to soften the blow of a potential veto.

Kevin: It seems unlikely that he’ll sign it. If he were going to, why hasn’t he already? It feels like he’s getting ready to veto while counting his future tech-funded millions.

What Happens Next?

Forbes: Time is running out, and we’ll know by September 30, 2024, whether SB 1047 gets signed.

Kevin: This feels like the longest week of my life. If SB 1047 doesn’t get signed, AI might take over before I even reach retirement.

Forbes: A pocket signature seems unlikely. Governor Newsom will either sign or veto the bill.

Kevin: You can’t please everyone. At the very least, he should prioritize the safety of those he serves.

Forbes: Life is tough sometimes. Newsom has to decide between signing SB 1047 and vetoing it.

Kevin: Tough indeed, especially when the future of humanity hangs in the balance.

Forbes: Will he, or won’t he?

Kevin: Time will soon tell, but this decision could bring us closer to the technological singularity, and the future of humankind might rest on a single signature.

 

Below is a revised version generated by artificial intelligence from a prompt based on this post. Read on to experience how AI interprets and responds to it.

Veto or Not to Veto?

Kevin: But is anyone really asking that question? It feels like the general public is blissfully unaware. While Governor Newsom deliberates over the fate of AI, our lives and futures hang in the balance, yet where’s the media coverage? It’s almost as though this monumental decision is being made in a vacuum.

As Forbes recently discussed, SB 1047 has garnered a lot of attention due to its sweeping nature. It’s an AI-focused bill that could be the first of its kind, establishing legally binding governance over powerful AI models. The stakes? Everything from preventing job displacement and economic disruption to addressing the existential threat AI could pose to humanity. We’re standing at the edge of a cliff, with only Governor Newsom deciding whether we take a step forward or back.

Forbes: “A lot is at stake.”

Kevin: A lot indeed. Imagine if we don’t sign this bill and powerful AI models, costing upwards of $100 million, go unchecked. We could be unleashing a beast beyond our control, a technology that could evolve past our understanding. To not sign SB 1047 is akin to playing with fire and hoping we don’t get burned—a gamble with far more than jobs at risk. It’s like the childhood fallacy of thinking you can touch fire and not get hurt. Spoiler: you always get burned.

Forbes: The decision must be made by September 30, 2024. Time is ticking.

Governor Newsom finds himself in a particularly precarious position. He could approve SB 1047, veto it, or let it pass into law without taking action. The bill is divisive, with both staunch supporters and fervent opponents. AI luminaries are split, as are tech firms and political figures like Nancy Pelosi, who has come out against the bill.

Kevin: It’s wild to think that one person—Governor Newsom—holds the fate of this decision. AI could impact millions, maybe billions of lives, yet the choice lies with one individual. What does this say about how we handle existential threats?

Forbes: Governor Newsom recently signed several other AI-related bills, but SB 1047 remains the elephant in the room.

Kevin: Sure, SB 926 criminalizing AI-generated blackmail is great. SB 942 requiring AI content disclosure? Also, great. But why the hesitation with SB 1047? Perhaps the weight of this bill is heavier on the Governor’s shoulders. Maybe there’s more at play here—like the pressure from tech CEOs whose fortunes are tied to unchecked AI development.

Forbes: Governor Newsom himself remarked, “What are the demonstrable risks in AI and what are the hypothetical risks? I can’t solve for everything.”

Kevin: But isn’t that the whole point? We don’t know what the hypothetical risks are—that’s exactly why we should tread carefully. The Governor’s comments seem to lean toward ignoring the unknowns, yet that’s where the danger lies. If the risks are uncertain, why gamble with the future?

The irony is thick. It’s almost as though by vetoing the bill, Governor Newsom would be reaching out his hand to be bitten by the very beast he could have controlled.

Forbes: Major AI tech firms are also opposed to SB 1047.

Kevin: Of course they are. This bill stands to rein in their unchecked ambitions. They’re not worried about innovation or progress; they’re worried about profit. And as Nancy Pelosi has said, the bill is “well-intentioned but ill-informed.” But isn’t the real question what’s more harmful—stifling innovation or allowing AI to develop without restraint?

Forbes: If SB 1047 doesn’t pass, the California legislature can still claim victory for other AI-related bills.

Kevin: Victory? Is it really a victory to pass smaller, less consequential bills while ignoring the one that could protect us from existential risk? It’s like addressing a minor cut while ignoring the gaping wound that needs stitches.

Forbes: Some suggest the blame could easily be shifted to the federal government, saying AI regulation should be handled by Congress.

Kevin: Sure, Congress could take this on, if only they weren’t bogged down by partisan gridlock and election-year distractions. But let’s be real: while we wait for federal action, AI is evolving at an exponential rate. Legislators can’t keep up. And who benefits from that delay? Big tech.

Forbes: Some suggest that vetoing SB 1047 would simply be “kicking the can down the road.”

Kevin: Exactly. Procrastination on this issue could lead to irreversible consequences. This isn’t about slowing down AI; it’s about controlling a technology that even its creators don’t fully understand.

Forbes: The clock is ticking, and we’ll soon know whether Governor Newsom will sign or veto the bill.

Kevin: The world is watching. Well, at least some of us are. For those paying attention, the decision to sign or veto SB 1047 could determine whether AI becomes our greatest ally or our worst enemy. Time will tell, but the longer we wait, the more we risk.

It’s not just about making a decision; it’s about making the right decision. Governor Newsom, don’t let us reach into the flames just to get burned.
