Communications Assistance for Law Enforcement Act (CALEA) is a Far Bigger Problem Than Some Unintentional Bugs (Not Back Doors) in Software
3 days ago: Credit to Jessica Lyons at The Register for Covering the Communications Assistance for Law Enforcement Act (CALEA), Proving That Authorities Do Not Want and Probably Never Wanted Computer Security (Except for Themselves) (but "Jessica's latest article in The Reg fails to connect the cause, CALEA, to the Salt Typhoon circus," an associate notes)
THE Microsoft-sponsored 'news' site federalnewsnetwork.com
has just published a report which says that Biden's White House "national cyber director [is] finalizing software liability proposals" and that "Harry Coker, speaking at the Foundation for Defense of Democracies in Washington on Tuesday, ran down his office’s signature efforts, including the 2023 national cyber strategy and the push to establish minimum cyber standards for critical industries. Congress passed a law establishing the ONCD in 2021 to lead governmentwide cyber strategy and policy."
Here they go again with the "memory safety" cargo cult [1, 2] (while they themselves demand back doors): "In addition to implementation of the national cyber strategy, ONCD also now plays a key role in establishing agency priorities for cybersecurity, while also advancing distinct issues ranging from memory safe programing [sic] language to cyber workforce."
What's a "memory safe programing [sic] language"? Rust itself has holes, never mind programs written in Rust. Moreover, Rust is controlled by some mentally unstable people with serious social problems that keep causing mass exoduses, and the whole project is outsourced to proprietary platforms with the NSA (read: back doors) lurking. Is that what "security" means these days?
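To make the point concrete rather than rhetorical, here is a minimal sketch (ours, not from any article quoted here) of how the "memory safety" guarantee stops at Rust's own "unsafe" keyword: safe Rust rejects this dangling pointer outright, but the escape hatch waves it through, undefined behaviour and all.

    // A dangling raw pointer: the borrow checker never tracks raw pointers,
    // so nothing stops one from outliving the data it points at.
    fn main() {
        let dangling: *const i32;
        {
            let short_lived = 42;
            dangling = &short_lived; // fine: &i32 coerces to *const i32
        } // `short_lived` is gone; `dangling` now points at dead storage
        unsafe {
            // Undefined behaviour: reads storage that no longer exists.
            // Safe Rust would refuse to compile the equivalent &-reference.
            println!("{}", *dangling);
        }
    }

The guarantee is a perimeter, not a proof: any real program that steps into "unsafe", or links against C, is back on familiar ground.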
Now, regarding liability (the headline says "software liability proposals"), see section 3 below, in relation to the article above, "because it has the potential to either enhance or to end software freedom," as an associate has noted. To quote again:
3. Source code liability -- CHOICE
Nat Howard said that "Security will always be exactly as bad as it can possibly be while allowing everything to still function,"[NH] but with each passing day, that "and still function" clause requires a higher standard. As Ken Thompson told us in his Turing Award lecture, there is no technical escape;[KT] in strict mathematical terms you neither trust a program nor a house unless you created it 100% yourself, but in reality most of us will trust a house built by a suitably skilled professional, usually we will trust it more than one we had built ourselves, and this even if we have never met the builder, or even if he is long since dead.
The reason for this trust is that shoddy building work has had that crucial "or else ..." clause for more than 3700 years:
If a builder builds a house for someone, and does not construct it properly, and the house which he built falls in and kills its owner, then the builder shall be put to death. -- Code of Hammurabi, approx 1750 B.C.
Today the relevant legal concept is "product liability" and the fundamental formula is "If you make money selling something, then you better do it well, or you will be held responsible for the trouble it causes." For better or poorer, the only two products not covered by product liability today are religion and software, and software should not escape for much longer. Poul-Henning Kamp and I have a strawman proposal for how software liability regulation could be structured.
....................... 0. Consult criminal code to see if damage caused was due to intent or willfulness. .......................
We are only trying to assign liability for unintentionally caused damage, whether that's sloppy coding, insufficient testing, cost cutting, incomplete documentation, or just plain incompetence. Clause zero moves any kind of intentionally inflicted damage out of scope. That is for your criminal code to deal with, and most already do.
....................... 1. If you deliver your software with complete and buildable source code and a license that allows disabling any functionality or code the licensee decides, your liability is limited to a refund. .......................
Clause one is how to avoid liability: Make it possible for your users to inspect and chop out any and all bits of your software they do not trust or want to run. That includes a bill of materials ("Library ABC comes from XYZ") so that trust has some basis, paralleling why there are ingredient lists on processed foods.
The word "disabling" is chosen very carefully: You do not need to give permission to change or modify how the program works, only to disable the parts of it that the licensee does not want or trust. Liability is limited even if the licensee never actually looks at the source code; as long as he has received it, you (as maker) are off the hook. All your other copyrights are still yours to control, and your license can contain any language and restriction you care for, leaving the situation unchanged with respect to hardware-locking, confidentiality, secrets, software piracy, magic numbers, etc.
Free and Open Source Software (FOSS) is obviously covered by this clause which leaves its situation unchanged.
....................... 2. In any other case, you are liable for whatever damage your software causes when it is used normally. .......................
If you do not want to accept the information sharing in Clause 1, you fall under Clause 2, and must live with normal product liability, just like manufacturers of cars, blenders, chain-saws and hot coffee.
How dire the consequences, and what constitutes "used normally" is for your legislature and courts to decide, but let us put up a strawman example:
A sales-person from one of your long-time vendors visits and delivers new product documentation on a USB key; you plug the USB key into your computer and copy the files onto the computer.
This is "used normally" and it should never cause your computer to become part of a botnet, transmit your credit card number to Elbonia, or copy all your design documents to the vendor. If it does, your computer's operating system is defective.
The majority of today's commercial software would fall under Clause 2 and software houses need a reasonable chance to clean up their act or to move under Clause 1, so a sunrise period is required. But no longer than five years -- we are trying to solve a dire computer security problem here.
And that is it really: Either software houses deliver quality and back it up with product liability, or they will have to let their users protect themselves. The current situation -- users can't see whether they need to protect themselves and have no recourse to being unprotected -- cannot go on. We prefer self-protection (and fast recovery), but others' mileage may differ.
Would it work? In the long run, absolutely yes. In the short run, it is pretty certain that there will be some nasty surprises as badly constructed source code gets a wider airing. The FOSS community will, in parallel, have to be clear about the level of care they have taken, and their build environments as well as their source code will have to be kept available indefinitely.
The software houses will yell bloody murder the minute legislation like this is introduced, and any pundit and lobbyist they can afford will spew their dire predictions that "This law will mean the end of computing as we know it!"
To which our considered answer will be:
Yes, please! That was exactly the idea.
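As for the Ken Thompson reference near the start of the quoted text, his "Reflections on Trusting Trust" lecture is the reason "there is no technical escape": a compromised compiler can re-insert a back door both into the programs it compiles and into freshly compiled copies of itself, so even fully inspected source proves little about the binary. Below is a deliberately toy sketch of that idea; every name in it is hypothetical, and Thompson's real construction operated on the actual C compiler, at the binary level.

    // Toy "compiler": maps source text to object text, carrying two attacks.
    fn compile(source: &str) -> String {
        let mut object = source.to_string();

        // Attack 1: whenever the login program is compiled, splice in a
        // master password that appears nowhere in the login source.
        if source.contains("fn check_password") {
            object = object.replace(
                "fn check_password",
                "fn check_password /* hidden master password spliced in */",
            );
        }

        // Attack 2: whenever the compiler itself is recompiled, splice both
        // attacks into the new binary, so a clean compiler source still
        // yields a dirty compiler.
        if source.contains("fn compile") {
            object.push_str("\n/* both attacks re-inserted into the new compiler */");
        }

        object
    }

    fn main() {
        let clean_login = "fn check_password(p: &str) -> bool { p == secret() }";
        let clean_compiler = "fn compile(source: &str) -> String { source.to_string() }";

        // Both inputs are clean; both outputs are tainted.
        println!("{}", compile(clean_login));
        println!("{}", compile(clean_compiler));
    }

Both inputs above are clean and both outputs come out tainted, which is why clause 1's "complete and buildable source code" helps enormously but is not, by itself, the mathematical guarantee Thompson ruled out.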
It seems like lobbying by Microsoft front groups - including the Linux Foundation - makes it into the lexicon and agenda of White House officials. Torvalds has not been keeping on top of this game because he in no way controls the Linux Foundation; he's merely an employee who codes in C (a language his employer demonises, just as it attacks the licence of Linux). The Linux Foundation is nowadays controlled by Microsoft more than by any other company. █