The Goals of Computer Science Versus the Goals of Consumerism (Planned Obsolescence)
JUST over a week ago I published "Old Does Not Mean Bad and Older is Not Always Worse" and I am pleased to say that I continue to listen to CDs from the 1990s using an optical drive. That it works as well as it did 30 years ago says a lot about longevity.
Newer is not always better (in every way). It may be better in some ways, but trade-offs exist, and businesses' interests only sometimes overlap with users' interests. Today's "modern" tech gives us violent rioters because it's profitable.
Now, recall the peer-reviewed article "A Plea for Lean Software" (by Niklaus Wirth, who died not long ago, after contributing so much to Computer Science):
Memory requirements of today's workstations typically jump substantially--from several to many megabytes--whenever there's a new software release. When demand surpasses capacity, it's time to buy add-on memory. When the system has no more extensibility, it's time to buy a new, more powerful workstation. Do increased performance and functionality keep pace with the increased demand for resources? Mostly the answer is no. The author contends that software's girth has surpassed its functionality, largely because hardware advances make this possible. He maintains that the way to streamline software lies in disciplined methodologies and a return to the essentials. He explores the reasons behind software's increasing heft and relates the history of Project Oberon as an example of how software should be built. Oberon's primary goal was to show that software can be developed with a fraction of the memory capacity and processor power usually required without sacrificing flexibility, functionality, or user convenience. The Oberon system has been in use since 1989, serving purposes that include document preparation, software development, and computer-aided design of electronic circuits, among many others. The system includes storage management, a file system, a window display manager, a network with servers, a compiler, and text, graphics, and document editors.
Linus Torvalds yesterday: "Hopefully we've gotten rid of the bulk of the silly noise here in rc2, and not added too much new noise, so that we can get on with the process of finding more meaningful issues."
The elephant in the room (from an AMD-funded site):
Last August I wrote an article about the open-source AMD GPU kernel driver crossing 5 million lines of code -- including their overzealous header files -- and following the recent Linux 6.11 merge window curiosity got the best of me with how much larger the kernel driver is now that the initial RDNA4 support is merged... Well, it's about to cross 5.8 million lines, or about a 16% increase just over the past year.
Nearly a million lines of code in 12 months is... a bit much. And that's just for one brand (AMD).
Yes, just for AMD!
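For readers who want to sanity-check such figures, here is a minimal Python sketch, assuming a local checkout of the Linux kernel sources; the path and the choice of file extensions are illustrative, not necessarily the exact methodology the cited article used:

#!/usr/bin/env python3
# Minimal sketch: tally lines of C sources and headers under the AMD driver tree.
# Assumes a local Linux kernel checkout; the path below is illustrative.
from pathlib import Path

DRIVER_DIR = Path("linux/drivers/gpu/drm/amd")  # adjust to your checkout

total = 0
for path in DRIVER_DIR.rglob("*"):
    if path.is_file() and path.suffix in {".c", ".h"}:
        # Count raw lines, comments and "overzealous" header files included.
        with path.open("rb") as f:
            total += sum(1 for _ in f)

print(f"{total:,} lines under {DRIVER_DIR}")

Run the same count on last year's tree and on one containing the Linux 6.11 merge window changes, and the difference is the year-on-year growth being described.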
Linux has become a "write-only" kernel; every company that pays the "Linux Foundation tax" can dump endless code into it. That code won't be audited by anyone other than the company's own employees (auditing it all just isn't feasible).
Who's gonna pay the price?
The poor user. █