
Permacomputing

posted by Roy Schestowitz on Nov 28, 2023

Original here (attribution and licence at the bottom, too)


This is a collection of random thoughts regarding the application of permacultural ideas to the computer world.

See also: Permacomputing update 2021

Some have tried to connect these worlds before (WikiWikiWeb's Permaculture article; Kent Beck's short-lived idea of Permaprogramming), but these have mostly concentrated on enhancing software engineering practices with some ideas from gardening. I am more interested in the aspect of cultural and ecological permanence. That is, how to give computers a meaningful and sustainable place in a human civilization that has a meaningful and sustainable place in the planetary biosphere.

1. Problem

Over the last few hundred years of human civilization, there has been a dramatic increase in the consumption of artificially produced energy. In the overarching story, this is often equated with "progress".

In the computer world, this phenomenon gets multiplied by itself: "progress" facilitates ever greater densities of data storage and digital logic, thus dramatically exploding the availability of computing resources. However, the abundance has also caused an equivalent explosion in wastefulness, which shows in things like mindblowingly ridiculous hardware requirements for even quite trivial tasks.

At the same time, computers have been failing their utopian expectations. Instead of amplifying the users' intelligence, they rather amplify their stupidity. Instead of making it possible to scale down the resource requirements of the material world, they have instead become a major part of the problem. Instead of making the world more comprehensible, they rather add to its incomprehensibility. And they often even manage to become slower despite becoming faster.

In both computing and agriculture, a major issue is that problems are too often "solved" by increasing controllability and resource use. Permaculture takes another way, advocating methods that "let nature do the work" and thus minimize the dependence on artificial energy input. Localness and decentralization are also major themes in the thought.

What makes permacultural philosophy particularly appealing (to me) is that it does not advocate "going back in time" despite advocating a dramatic decrease in use of artificial energy. Instead, it trusts in human ingenuity in finding clever hacks for turning problems into solutions, competition into co-operation, waste into resources. Very much the same kind of creative thinking I appreciate in computer hacking.

The presence of intelligent life in an ecosystem can be justified by its strengthening effect. Ideally, humans could make ecosystems more flexible and more resilient because of their ability to take leaps that are difficult or impossible for "unintelligent" natural processes. The existence of computers in a human civilization can be justified by their ability to augment this potential.

2. Physical resources

2.1. Energy

Permaculture emphasizes resource-sensitivity. Computers primarily use electricity, so to them resource-sensitivity primarily means 1) adapting to changes in energy conditions and 2) using the available energy wisely. Today's computers, even mobile ones, are surprisingly bad at this. This is partially due to their legacy as "calculation factories" that are constantly guaranteed all the resources they "need".

In permacomputing, intense non-urgent computation (such as long machine learning batches) would take place only when a lot of surplus energy is being produced or there is a need for electricity-to-heat conversion. This requires that the computer is aware of the state of the surrounding energy system.

At times of low energy, both hardware and software would prefer to scale down: background processes would freeze, user interfaces would become more rudimentary, clock frequencies would decrease, unneeded processors and memory banks would power off. At these times, people would prefer to do something else than interact with computers.
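The surplus-aware scheduling described above could be sketched roughly as follows. This is a toy illustration, not an existing system: the class and field names are invented, and a real scheduler would read the surplus figure from the surrounding energy infrastructure rather than receive it as a parameter.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Job:
    name: str
    urgent: bool
    run: Callable[[], None]

@dataclass
class EnergyAwareScheduler:
    """Defers non-urgent work until surplus energy is reported."""
    backlog: List[Job] = field(default_factory=list)

    def submit(self, job: Job, surplus_kw: float) -> bool:
        # Urgent jobs always run; non-urgent ones wait for surplus energy.
        if job.urgent or surplus_kw > 0:
            job.run()
            return True
        self.backlog.append(job)
        return False

    def on_surplus(self) -> int:
        """Drain the backlog when surplus energy becomes available."""
        ran = len(self.backlog)
        for job in self.backlog:
            job.run()
        self.backlog.clear()
        return ran
```

The same structure scales down as well as up: at low-energy times the scheduler simply stops draining the backlog.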

It is often wise to store energy for later use. Flywheels are a potential alternative to chemical batteries: their energy densities (MJ/kg) are comparable to those of some battery chemistries, they require no scarce minerals, and they last for decades or centuries instead of mere years.

2.2. Silicon

IC fabrication requires large amounts of energy, highly refined machinery and poisonous substances. Because of this sacrifice, the resulting microchips should be treasured like gems or rare exotic spices. Their active lifespans would be maximized, and they would never be reduced to their raw materials until they are thoroughly unusable.

Instead of planned obsolescence, there should be planned longevity.

Broken devices should be repaired. If the community needs a kind of device that does not exist, it should preferably be built from existing components that have fallen out of use. Chips should be designed open and flexible, so that they can be reappropriated even for purposes they were never intended for.

Complex chips should have enough redundancy and bypass mechanisms to keep them working even after some of their internals wear out. (In a multicore CPU, for instance, many partially functioning cores could combine into one fully functioning one.)

Chips that work but whose practical use cannot be justified can find artistic and other psychologically meaningful use. They may also be stored away until they are needed again (especially if the fabrication quality and the storage conditions allow for decades or centuries of "shelf life").

Use what is available. Even chips that do "evil" things are worth considering if there's a landfill full of them. Crack their DRM locks, reverse-engineer their black boxes, deconstruct their philosophies. It might even be possible to reappropriate something like Bitcoin-mining ASICs for something artistically interesting or even useful.

Minimized on-chip feature size makes it possible to do more computation with less energy but it often also means increased fragility and shorter lifespans. Therefore, the densest chips should be primarily used for purposes where more computation actually yields more. (In entertainment use, for example, a large use of resources is nothing more than a decadent esthetic preference.)

Alternatives to semiconductors should be actively researched. Living cells might be able to replace microchips in some tasks sometime in the future.

Once perfectly clean ways of producing microchip equivalents have been taken into use, the need for "junk fetishism" will probably diminish.

2.3. Miscellaneous

Whenever bright external light is available, displays should be able to use it instead of competing against it with their own backlight. (See: Transflective LCD)

Personally-owned computers are primarily for those who dedicate themselves to the technology and thus spend considerable amounts of time with it. Most other people would be perfectly happy with shared hardware. Even if the culture and society embraced computers more than anything else, requiring everyone to own one would be overkill.

3. Observation and interaction

The first item in many lists of permacultural principles is "Observe and interact." I interpret this as primarily referring to a bidirectional and co-operative relationship with natural systems: you should not expect your garden to be easily top-down controllable like an army unit but accept its quirkiness and adapt to it.

3.1. Observation

Observation is among the most important human skills computers can augment. Things that are difficult or impossible for humans to observe can be brought within human cognitive capacity by various computational processes. Gathered information can be visualized, slight changes and pattern deviances emphasized, slow processes sped up, forecasts calculated. In Bill Mollison's words, "Information is the critical potential resource. It becomes a resource only when obtained and acted upon."

Computer systems should also make their own inner workings as observable as possible. If the computer produces visual output, it would use a fraction of its resources to visualize its own intro- and extrospection. A computer that communicates with radio waves, for example, would visualize its own view of the surrounding radio landscape.

Current consumer-oriented computing systems often go to ridiculous lengths to actually prevent the user from knowing what is going on. Even error messages have become unfashionable; many websites and apps just pretend everything is fine even if it isn't. This kind of extreme unobservability is a major source of technological alienation among computer users.

The visualizations intended for casual and passive observation would be pleasant and tranquil while making it easy to see the big picture and notice the small changes. Tapping into the inborn human tendency to observe the natural environment may be a good idea when designing visualizers. When the user wants to observe something more closely, however, there is no limit in how flashy, technical and "non-natural" the presentation can be, as long as the observer prefers it that way.

3.2. Yin and yang hacking

Traditional computer hacking is often very "yang". A total understanding and control of the target system is valued. Changing a system's behavior is often an end in itself. There are predefined goals the system is pushed towards. Optimization tends to focus on a single measurable parameter. Finding a system's absolute limits is more important than finding its individual strengths or essence.

In contrast, "yin" hacking accepts the aspects that are beyond rational control and comprehension. Rationality gets supported by intuition. The relationship with the system is more bidirectional, emphasizing experimentation and observation. The "personality" that stems from system-specific peculiarities gets more attention than the measurable specs. It is also increasingly important to understand when to hack and when just to observe without hacking.

The difference between yin and yang hacking is similar to the difference between permaculture and industrial agriculture. In the latter, a piece of nature (the field) is forced (via a huge energy investment) into an oversimplified state that is as predictable and controllable as possible. Permaculture, on the other hand, emphasizes a co-operative (observing and interacting) relationship with the natural system.

Yang hacking is quite essential to computing. After all, computers are based on comprehensible and deterministic models that tiny pieces of nature are "forced" to follow. However, there are many kinds of systems where the yin way makes much more sense (e.g. the behavior of neural networks is often very difficult to analyze rationally).

Even the simplest programmable systems have a "yin" aspect that stems from the programmability itself. Also, taking the yang type of optimization to the extreme (like in the sub-kilobyte demoscene categories), one often bumps into situations where the yin way is the only way forward.

Intellectual laziness may sometimes result in computing that is too yin. An example would be trying to use a machine learning system to solve a problem before even considering it analytically.

3.2.1. Processes

There are many kinds of computational processes. Some produce a final definitive result, some improve their result gradually. Some yield results very quickly, some need more time.

The computing world still tends to prefer classic, mainframe-style processes that are one-shot and finite: no improvement over previous results, just rerunning the entire batch from scratch. Even when a process is naturalistic, slow, gradual and open-ended – as in many types of machine learning – computer people often force it into the mainframeishly control-freaky framework. Some more "yin-type" attitude would definitely be needed.
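The gradual, open-ended kind of process corresponds to what computer science calls an "anytime" computation: one that can be interrupted at any point and still yields a usable result. A small sketch, with Newton's method standing in for any iterative refinement (the function names are invented):

```python
from typing import Iterator, Optional

def anytime_sqrt(n: float, guess: float = 1.0) -> Iterator[float]:
    """An open-ended, gradually improving process: each yielded value
    is a usable approximation of sqrt(n), refined by Newton's method."""
    x = guess
    while True:
        yield x
        x = (x + n / x) / 2  # each step improves the previous result

def run_until(process: Iterator[float], steps: int) -> Optional[float]:
    """Harvest the best result available after a given budget."""
    result = None
    for _, result in zip(range(steps), process):
        pass
    return result
```

The process never demands to run to completion; the caller decides how much time or energy to spend, and stopping early yields a coarser but still meaningful result.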

4. Progress

The fossil-industrial story of linear progress has made many people believe that the main driver for computer innovation would be the constant increase of computing capacity. I strongly disagree. I actually think it would be more accurate to state that some innovation has been possible despite the stunting effect of rapid hardware growth (although this is not a particularly accurate statement either).

The space of technological possibilities is not a road or even a tree: new inventions do not require "going forward" or "branching on the top" but can often be made from even quite "primitive" elements. The search space could be better thought of as a multidimensional rhizomatic maze: undiscovered areas can be expected to be found anywhere, not merely at the "frontier". The ability to speed "forward" on a "highway of technology" tends to make people blind to the diversity of the rhizome: the same boring ideas get reinvented with ever higher specs, and genuinely new ideas get downplayed.

The linear-progressivist idea of technological obsolescence may stem from authoritarian metaphors: there may only be one king at a time. This idea easily leads to an impoverished and monocultural view of technology where there is room for only a handful of ideas at a time.

Instead of technological "progress" (that implies constant abandoning of the old), we should consider expanding the diversity and abundance of ideas. Different kinds of technology should be seen as supporting each other rather than competing against each other for domination.

In nature, everything is interdependent, and these interdependencies tend to strengthen the whole. In technology, however, large dependency networks and "diversity of options" often make the system more fragile. Civilization should therefore try to find ways of making technological dependencies work more like those in nature, as well as ways of embracing technological diversity in fruitful ways.

5. Programming

Programmability is the core of computing and the essence of computer literacy. Therefore, users must not be deliberately distanced from it. Instead, computer systems and user cultures should make programming as relevant, useful and accessible as possible.

Any community that uses computers would have the ability to create its own software. Locally produced software would address local needs better than generic "one size fits all" solutions.

Rather than huge complex "engines" that can be reconfigured for different requirements, there would be sets of building blocks that could be used to create programs that only have the features necessary to fill their given purposes.
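As a toy illustration of the building-block approach, small single-purpose functions can be assembled into a complete program with nothing more than function composition. The blocks below are invented examples for a minimal text-cleanup tool:

```python
from functools import reduce
from typing import Callable

def compose(*blocks: Callable) -> Callable:
    """Assemble a program from small single-purpose building blocks,
    applied left to right, instead of configuring a generic engine."""
    return lambda x: reduce(lambda acc, block: block(acc), blocks, x)

# Hypothetical building blocks:
strip_edges = str.strip
collapse_ws = lambda s: " ".join(s.split())
lowercase = str.lower

# The "program" contains only the features its purpose requires.
clean = compose(strip_edges, collapse_ws, lowercase)
```

A different purpose would pick a different handful of blocks; nothing unused is carried along.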

Most of today's software engineering practices and tools were invented for a "Moore's law world" where accumulation, genericity and productization are more important than simplicity and resource-sensitivity. New practices and tools will be needed for a future world that will no longer tolerate wasteful use of resources.

Optimization/refactoring is vitally important and should take place on all levels of abstraction, by both human and AI codecrafters.

Ideally, it would be possible to invent and apply esoteric tricks without endangering the clarity or correctness of the main code (by separating the problem definition from implementation details, for example). It might be wise to maintain databases for problem solutions, optimization/refactoring tricks and reduction rules and develop ways to (semi)automatically find and apply them.
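A minimal sketch of such a database of reduction rules, assuming expressions encoded as nested tuples and a hand-picked rule set (the encoding and rule names are invented for illustration):

```python
# A tiny rewrite-rule database for algebraic simplification.
# Expressions are nested tuples like ("*", ("+", "x", 0), 2);
# leaves are strings or numbers.
from typing import Any

Expr = Any  # either a leaf or a 3-tuple ("op", left, right)

def simplify(expr: Expr) -> Expr:
    """Recursively apply reduction rules from the bottom up."""
    if not isinstance(expr, tuple):
        return expr
    op, a, b = expr
    a, b = simplify(a), simplify(b)
    # Each entry pairs an (operator, identity/absorber) with a rewrite.
    rules = {
        ("+", 0): lambda other: other,  # x + 0 -> x
        ("*", 1): lambda other: other,  # x * 1 -> x
        ("*", 0): lambda other: 0,      # x * 0 -> 0
    }
    for (rop, const), rewrite in rules.items():
        if op == rop:
            if b == const:
                return rewrite(a)
            if a == const:
                return rewrite(b)
    return (op, a, b)
```

A real trick database would of course hold far richer rules (strength reduction, algebraic identities, target-specific idioms), but the shape is the same: rules stored as data, applied mechanically wherever they match.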

6. Software

There are many kinds of software, and very few principles apply to all of them. Some programs are like handheld tools, some programs are like intelligent problem-solvers, some programs are like gears in an engine, and some programs are nothing like any of those.

6.1. Dumb programs

A program that is intended to be like a tool should be understandable, predictable and wieldy. It should be simple enough that a proficient user can produce an unambiguous and complete natural-language description of what it does (and how). Ideally, the actual executable program would not be larger than this description.

The ideal wieldiness may be compared to that of a musical instrument. The user would develop a muscle-memory-level grasp of the program features, which would make the program work like an extension of the user's body (regardless of the type of input hardware). There would be very few obstacles between imagination and expression.

The absolute number of features is not as important as the flexibility of combining them. Ideally, this flexibility would greatly exceed the intentions of the original author of the program.

6.2. Smart programs

In addition to what is commonly thought of as artificial intelligence, smartness is also required in tasks such as video compression and software compilation. Anybody/anything intending to perform these tasks perfectly will need to know a large variety of tricks and techniques, some of which might be hard to discover or very specific to certain conditions.

It is always a nice bonus if a smart program is comprehensible and/or uses minimal resources, but these attributes are by no means a priority. The results are the most important.

One way to justify the large resource consumption of a smart program is to estimate how many resources its smartness saves elsewhere. The largest savings could be expected in areas such as resource and ecosystem planning, so quite large artificial brains could be justified there. Brains whose task is to optimize/refactor large brains may also be large.

Expanding a "dumb" tool-like program with smartness should never reduce the comprehensibility and wieldiness of the core tool. It should also be possible to switch off the smartness at any time.

6.2.1. Artificial intelligence

Artificial intellects should not be thought of as competing against humans in human-like terms. Their greatest value is that they are different from human minds and thus able to expand the intellectual diversity of the world. AIs may be able to come up with ideas, designs and solutions that are very difficult for human minds to conceive. They may also lessen the human burden in some intellectual tasks, especially the ones that are not particularly suitable for humans.

Since we are currently in the middle of a global environmental crisis that needs a rapid and complete redesign of civilization, we should co-operate with AI technology as much as we can.

AI may also be important as artificial otherness. In order to avoid a kind of "anthropological singularity" where all meaning is created by human minds, we should learn to embrace any non-human otherness we can find. Wild nature is the traditional source of otherness, and a contact with extraterrestrial lifeforms would provide another. Interactions with artificial intelligence would help humans enrich their relationships with otherness in general.

6.3. Automation

Permaculture wants to develop systems where nature does most of the work, and humans mostly do things like maintenance, design and building. A good place for computerized automation would therefore be somewhere between natural processes and human labor.

Mere laziness does not justify automation: modern households are full of devices that save relatively little time but waste a lot of energy. Automation is at its best at continuous and repetitive tasks that require a lot of time and/or effort from humans but only a negligible amount of resources from a programmable device.

6.4. Maintenance

Many programs require long-term maintenance due to changing requirements and environments. This is an area where gardening wisdom can be useful. A major difference is that a software program is much easier to (re)create from scratch than a garden.

Most changes to a program tend to grow its size/complexity. This effect should be balanced with refactoring (that reduces the size/complexity). The need for refactoring is often disregarded in today's "Moorean" world where software bloat is justified by constant hardware upgrades. In an ideal world, however, the constant maintenance of a program would be more likely to make it smaller and faster than to bloat it up.

Programs whose functionality does not change should not require maintenance other than preservation. In order to eliminate "platform rot" that would stop old software from working, there would be compatibility platforms that are unambiguously defined, completely static (frozen) and easy to emulate/virtualize.
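One way to read the "frozen platform" idea is as a tiny virtual machine whose instruction set is fixed forever, so any program written for it can always be re-run. The four-instruction stack machine below is a hypothetical sketch, not an existing platform:

```python
# A deliberately tiny, frozen "compatibility platform": a stack machine
# whose instruction set will never change, so programs targeting it
# never rot. Instruction names are invented for this sketch.
from typing import List, Optional, Tuple

def run(program: List[Tuple],
        stack: Optional[List[int]] = None) -> List[int]:
    stack = [] if stack is None else stack
    for instr in program:
        op = instr[0]
        if op == "PUSH":
            stack.append(instr[1])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "DUP":
            stack.append(stack[-1])
        else:
            raise ValueError(f"unknown instruction: {op}")
    return stack
```

Because the semantics fit on one page, the platform is trivial to re-implement or emulate on any future hardware, which is exactly what long-term software preservation needs.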

7. Culture

7.1. Relationship with technology

Any community that uses a technology should develop a deep relationship to it. Instead of being framed for specific applications, the technology would be allowed to freely connect and grow roots to all kinds of areas of human and non-human life. Nothing is "just a tool" or "just a toy", nobody is "just a user".

There would be local understanding of each aspect of the technology. Not merely the practical use, maintenance and production but the cultural, artistic, ecological, philosophical and historical aspects as well. Each local community would make the technology locally relevant.

Each technology would have one or more "scenes" where the related skills and traditions are maintained and developed. Focal practices are practiced, cultural artifacts are created, enthusiasm is roused and channeled, inventions are made. The "scenes" would not replace formal institutions or utilitarian practices but would rather provide an undergrowth to support them.

No technology should be framed for a specific demographic segment or a specific type of people. The "scenes" should embrace and actively extend the diversity of their participants.

Theoretical and practical understanding are equally important and support one another. Even the deepest academic theorist would sometimes get their hands dirty in order to strengthen their theory, and even the most pragmatic tinkerer would deepen their practice with some theoretical wisdom.

7.2. Telecommunication

The easiest way to send a piece of information between two computers should always be the one that uses the least energy without taking too much time. The allowable time would depend on the context: in some cases, a second would be too much, while in some others, even several days would be fine. If the computers are within the same physical area, direct peer-to-peer links would be preferred.
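The "least energy within the allowable time" rule can be sketched as a simple selection over available transports. All of the transport names and energy/latency figures below are invented for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Transport:
    name: str
    joules_per_mb: float   # energy cost of moving one megabyte
    seconds_per_mb: float  # time cost of moving one megabyte

# Hypothetical options, cheapest energy first only by coincidence:
TRANSPORTS: List[Transport] = [
    Transport("sneakernet", 0.1, 86400.0),  # physically carried storage
    Transport("lora_mesh", 5.0, 600.0),
    Transport("fiber", 50.0, 0.01),
]

def pick_transport(size_mb: float, deadline_s: float) -> Transport:
    """Least-energy transport whose transfer time fits the deadline."""
    feasible = [t for t in TRANSPORTS
                if t.seconds_per_mb * size_mb <= deadline_s]
    if not feasible:
        raise ValueError("no transport meets the deadline")
    return min(feasible, key=lambda t: t.joules_per_mb)
```

With a week-long deadline the cheapest option wins; with a one-second deadline only the fast, energy-hungry link qualifies, exactly as the context-dependent rule above prescribes.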

When there are multiple simultaneous recipients for the same data, broadcast protocols would be preferred. For high-bitrate transfers (e.g. streaming video), shared broadcasts would also be culturally encouraged: it is a better idea to join a common broadcast channel than request a separate serving of the file.

General-purpose communication platforms would not have entertainment as a design priority. The exchange of messages and information would be slow and contemplative rather than fast and reactive. In public discussion, well-thought-out and fact-based views would be the most respected and visible ones.

Communication networks may very well be global and the protocols standardized, but the individual sites (platforms, forums, interfaces, BBSes) would be primarily local. Global, proprietary social media services would not be desirable, as they enforce the same "one size fits all" monoculture everywhere.

All the most commonly needed information resources would be available at short or moderate distances. A temporary loss of intercontinental network connection would not be something most users would even notice.

It should be easy to save anything from the network into local files. "Streaming-only" or other DRM-locked media would not exist.

People would be aware of where their data is physically located and prefer to have local copies of anything they consider important.

Any computer should be usable without a network connection.

7.3. Audiovisual media

Many people prefer to consume their audiovisual media at resolutions and data rates that are as high as possible (thus consuming as much energy as possible). This is, of course, an extremely unsustainable preference.

There are countless ways, most of them still undiscovered, to make low and moderate data complexities look good — sometimes good enough that increased resolution would no longer improve them. Even compression artifacts might look so pleasant that people would actually prefer to have them.

For extreme realism, perfection, detail and sharpness, people would prefer to look at nature.

7.4. Commons

Societies should support the development of software, hardware and other technology in the same way as they support scientific research and education. The results of the public efforts would be in the public domain, freely available and freely modifiable. Black boxes, lock-ins, excessive productization and many other abominations would be marginalized.


Written by Ville-Matias "Viznut" Heikkilä.
2020-06-24: initial release
2020-06-30: license added, cosmetic changes
2021-08-12: linked to 2021 update
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
