
California State Assembly Has Approved S.B. 1047, Which Can Be Used Against Proprietary Software

posted by Roy Schestowitz on Aug 30, 2024

California Dreaming postcard

Plus, sage old advice from Capt. Grace Hopper

THE American national archives have just been augmented with a pair of epic old lectures by a scientist-celebrity (a national treasure, even) whose work interests us. This has been receiving some publicity, but what interests us most is how she speaks of the "cost of conversion", i.e. the cost of proprietary, non-standard software. She elaborates on this over the next 10 minutes or so of the lecture. The subject came up in IRC several times this week (several people brought it up). "The Grace Hopper video is quite good," one person said. "The lecture was 1 year before GNU. FWIW the GNU Manifesto turns 40 next year. Something to put in the calendar."

Here is a lecture (in two parts): "Capt. Grace Hopper on Future Possibilities: Data, Hardware, Software, and People (Part One, 1982)" and "Capt. Grace Hopper on Future Possibilities: Data, Hardware, Software, and People (Part Two, 1982)".

This is rather timely because of the growing recognition of the threat of proprietary software, even if most lawmakers allude to it using buzzwords like "hey hi" (AI). What it means is software that does something you neither understand nor control. Forget the formal scientific definitions; those have long been abandoned by the media, which seems to have been allocated a budget to hype up the bubble and feed the Ponzi scheme until the bubble "pops".

"The California State Assembly approved S.B. 1047 against "AI"," we got told. "Of course the devil is in the details, but if done right (that's a big *if*) then it could play a role in the advancement of FOSS and Open Data."

Revisit this old talk by Geer, especially the paragraphs starting with: "1. If you deliver your software with complete and buildable source code and a license that allows disabling any functionality or code the licensee decides, your liability is limited to a refund."

"Hey hi" typically means both the training set and the code are unavailable. Thus, the blackbox isn't even understood by its maker. That ought not be done.

As per press reports, "California state lawmakers attempted to introduce 65 bills touching on AI this legislative season" [1] and the "proposed law would require companies working on AI to test their technology before selling it for “catastrophic” risks" [2]. The law "would require big A.I. companies to test their systems for safety" [3].

The Verge says "Senator Scott Wiener, the bill’s main author, said SB 1047 is a highly reasonable bill" [4], and we suggest reading "hey hi" as "proprietary software" because in practice that is usually what they refer to.

Ideally, the code would be Free software, and models trained on data would make that data openly and freely available for audits (in order to prevent mischief). One way to make algorithms intentionally misbehave is to manipulate or bias the training set. So both the code and the data are strictly needed.
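To make the training-set point concrete, here is a minimal sketch of ours (not from the bill or any cited report), assuming Python with numpy and scikit-learn: quietly flipping labels for one group in the training data biases the resulting model against that group, even though the training code itself is untouched.

    # Hypothetical illustration: label-flipping ("data poisoning") in a
    # training set biases a model without any change to the code.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic data: feature 0 is the real signal; feature 1 marks a group.
    X = rng.normal(size=(1000, 2))
    X[:, 1] = rng.integers(0, 2, size=1000)   # group membership (0 or 1)
    y = (X[:, 0] > 0).astype(int)             # ground truth ignores the group

    # Poisoned copy of the labels: force "reject" on most of group 1.
    y_poisoned = y.copy()
    mask = (X[:, 1] == 1) & (rng.random(1000) < 0.8)
    y_poisoned[mask] = 0

    clean = LogisticRegression().fit(X, y)
    poisoned = LogisticRegression().fit(X, y_poisoned)

    # Two identical "applicants" differing only in group membership: the
    # poisoned model now scores group 1 far lower, for no legitimate reason.
    probe = np.array([[1.0, 0.0], [1.0, 1.0]])
    print("clean:   ", clean.predict_proba(probe)[:, 1])
    print("poisoned:", poisoned.predict_proba(probe)[:, 1])

An audit of the (identical) code would find nothing wrong here; the mischief lives entirely in the data, which is why both must be open.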

Related/contextual items from the news:

  1. Musk voices support for California bill requiring safety tests on AI models

    California state lawmakers attempted to introduce 65 bills touching on AI this legislative season, according to the state’s legislative database, including measures to ensure all algorithmic decisions are proven unbiased and protect the intellectual property of deceased individuals from exploitation by AI companies. Many of the bills are already dead.

  2. California AI bill 1047, opposed by Pelosi, passes State Assembly

    The proposed law would require companies working on AI to test their technology before selling it for “catastrophic” risks such as the ability to instruct users in how to conduct cyberattacks or build biological weapons. Under the proposed law, if companies fail to conduct the tests and their tech is used to harm people, they could be sued by California’s attorney general. The bill only applies to companies training very large and expensive AI models, and its author, Democratic state Sen. Scott Wiener, has insisted it will not impact smaller startups seeking to compete with Big Tech companies.

  3. California Legislature Approves A.I. Safety Bill

    The State Assembly approved the measure, known as S.B. 1047, which would require big A.I. companies to test their systems for safety before releasing them to the public. The bill would also give the state’s attorney general the power to sue A.I. makers for serious harms caused by their technologies, like death or property damage.

  4. California State Assembly passes sweeping AI safety bill

Senator Scott Wiener, the bill’s main author, said SB 1047 is a highly reasonable bill that asks large AI labs to do what they’ve already committed to doing: test their large models for catastrophic safety risk. “We’ve worked hard all year, with open source advocates, Anthropic, and others, to refine and improve the bill. SB 1047 is well calibrated to what we know about foreseeable AI risks, and it deserves to be enacted.”
