10.26.22


How Digital Systems Fail Our Institutions

Posted in Free/Libre Software, Google, Microsoft at 8:16 am by Guest Editorial Team

By Dr. Andy Farnell


Average Joe “just wants stuff to
work”. He goes along with whatever technology is placed in
front of him. For Joe, geeks fighting religious battles over
technology are a curious spectacle. As a dispassionate
pragmatist, he mistakes fervour for pedantry. He cannot see the
serious ideological fault-lines within technology which will
determine how we live, work and build societies together.

Joe is happy to be dubbed “a user”, a term otherwise applied
to drug addicts and insincere friends, despite actually being
the person who is getting used. For him, “algorithms in the
cloud” and other nonsensical tropes stand in for meaningful
explanations of how his life is run by invisible others.

Joe once thought that things were run by the government he
voted for, based on reliable facts he read in the press,
carefully weighed in his clear mind, itself the product of an
unbiased education. He believes in these institutions, whose
functions underwrite his existence.

But Joe’s life is now determined by “digital infrastructure”
increasingly concentrated in giant data-centres, under the
control of unelected, profit-seeking organisations. Joe is a
victim of what we will simply refer to as “systems”.

A “system” can be defined as cybernetic, ecological,
biological, social, political or operational. But, to use words
as ordinary people mean them, a system is “increased work and
stress I won’t have any choice about, and won’t get paid for”.
Systems are ever-expanding, hostile impositions. Systems are a
failure of engaged, humanistic, liberal democratic life.

Systems turn bad

The words “New System” strike dread into the heart of any
employee. Big organisations make ideal testing grounds for
inhumane systems. For example, the scandal associated with
Cambridge Analytica was really no more than a failed
research project in data science, whose implications and ethics
scared the crap out of the public. It was possible only because
of a system, the “walled garden” called Facebook, fleecing 50
million people of their data. Since then nothing has changed
and the commodification of surveillance data for influence has
been normalised.

“Each September in
universities, untested systems go live as administrators and
students return to do battle over workflows and control of
resources.”
At a more mundane, everyday level,
institutions hold a captive audience of guinea-pigs. In
academia it is students and staff, on whom we can run
algorithms and experiments by decree of “policy”, thus avoiding
messy ethics and scrutiny real researchers would endure.

Each September in universities, untested systems go live as
administrators and students return to do battle over workflows
and control of resources. As timetables shift and slip into
place, students scour campus corridors for elusive lecture
rooms. Many hours of teaching will be lost as access systems,
attendance registers, login portals, classroom AV, and
assessment tools grumble and grind, then fail. Everyone will be
beaten into compliance, under veiled threats alluding to
“necessary regulation”, “best security practices” and “higher
powers” and so on. It is the will, not of any identifiable
tyrant, but of “the system”.

No door remains unprotected by card access, no classroom or
corridor free of motion detection, face-recognition, CCTV, and
no computer accessible except through a tedium of slow,
draconian, security processes. Arbitrarily, at any time and
without warning, centralised IT are free to alter systems and
“policies” that underwrite them. They can move web pages,
change login processes, block emails, remove services, target
groups or individuals within a panopticon and labyrinth that
would be the envy of B. F. Skinner, famed tormentor of
rats.

We live with this because we have been conditioned to it, as
rats who have forgotten life before the maze. Fifty years of
believing computers are “necessary” has etched its mark. Of
course systems are there “to help us”. They offer
“convenience”. And foremost, they provide “security”, that
elusive quality we are constantly told we need, but somehow
never feel we have. During thirty years of teaching, I’ve seen
many systems introduced. The chilling effect on the engagement,
openness and curious spirit of students has been palpable.
Systems inhibit. Systems disable.

However, this is all fascinating for me, as a computer
scientist and systems theorist, because I’ve had a perfect
environment to study the damaging effects of encroaching
systems on real people trying to do simple, timeless activities
like teach and learn.

The unsurprising CHAOS report of 2018 [1] tells us “most
information systems fail”. They deliver less certainty, less
reliability and less accessibility. Five minutes using any
major search engine should convince you: the game is no longer
to deliver information on request, but to extract it from you.
Search is just one example of how many technologies today are
distorted and broken, operating with perverse incentives and
hidden agendas counter to the wellbeing of their “users”.

But even the systems we pay for work against us. The
unintended consequences of the machinery built to deliver
cheap, fast, efficient, uniform, accountable, secure education
add up, in totality, to catastrophic costs for university
students and professors.

It doesn’t have to be this way of course. The promise of the
“information age” envisioned by optimistic pioneers of the 60s
and 70s still lurks beneath the surface of society, frustrated
and itching to emerge. Techrights has been holding a torch to
abusive technology for decades. Today it is joined by new
projects like The Center for Humane Technology [2] and hundreds
of prominent thinkers trying to reform technology against the
big-money interests of Microsoft, Google and the like.

How did we get so lost in counterproductive technology? It
is perplexing because we have cheaper and more powerful
computers than ever. Software is, for the most part, less
buggy. Yet each year our everyday experience of technology
worsens. We wait longer, feel more frustrated, more scrutinised
and bullied by tech, and are less productive. A new paper by
Pablo Azar of the Federal Reserve Bank of New York [3] notes
how “computer saturation” lies at the heart of the productivity
slowdown. We have too many computers for our own good now.
We’re at “peak tech”.

“To my surprise, my
experiment with teaching computer science using nothing but the
benign technologies of a whiteboard pen and £25 Raspberry Pi is
an astounding success, loved by all the students.”
As a
computer scientist I’m horrified by what I see in educational
tech. Our helpless dependency perfectly tracks de-skilling and
outsourcing to unaccountable cloud providers and “algorithms”.
As a teacher of technology, technology is now the reason I want
to leave teaching. A karmic reckoning perhaps. Each semester I
watch it harm our students’ learning experience and feel less
able to be the humane, generous, engaging mentor I’d like to
be. To my surprise, my experiment with teaching computer
science using nothing but the benign technologies of a
whiteboard pen and £25 Raspberry Pi is an astounding success,
loved by all the students. It seems ever clearer that the
university, other than as a physical meeting space, has nothing
to offer.

Browbeaten by systematic, institutional technology, I’ve
witnessed students in tears because opaque “systems” have
miscalculated grades, wrongly accused them of plagiarism,
overcharged them, cut off their internet, evicted them from
accommodation, confused them with other students, lost
assignments…

Most corrosive is the sense of helplessness. Regardless of
how willing, attuned, tactful, or experienced a professor may
be, having to say “there’s nothing I can do, the system won’t
let me”, is galling.

Obstructive as broken systems may be, it is the fervour of
their apologists that saddens me more. Edu-tech zealots simply
cannot hear that students “just want engaging in-person
teaching”. For them, ever more centralised learning systems,
omniscient portals, blended fusion centres, and AI augmented VR
technologies are the only way forward. They are enchanted.

It’s said that people don’t leave bad jobs; they leave bad
bosses. I think people leave bad systems. You can argue with a
bad boss, but not with a bad system. A perfect system retains
the calm tone and unblinking red eye of Arthur C. Clarke’s HAL
computer, even as it destroys itself and those around it. It is
the Microsoft system that, defiantly and against your will,
updates itself to a “better” Windows version, and then crashes
to a halt complaining your computer is not powerful enough.
Nobody deserves any person or thing so chaotic and insolent in
their life, and is wise to separate from it.

I firmly believe the precarious mental health of students is
directly attributable to the brutality of the systems they face
daily. We’ve driven out humiliation and the cane from schools
only to create new forms of technological violence under the
pompous auspices of “preparing them for reality” – a
technological reality that, for Jon Askonas writing in the New
Atlantis, is “just a game now”. [4]

Why we persist with bad systems

“Over-systemisation” is not news. John Gall’s “Systemantics”
[5] describes man’s struggle against himself through the folly
of systems. They are “solidified resistance to change” and, in
Nietzsche’s words, the “will to a lack of integrity”. And so we
must ask – since universities are about changing minds and
seeking a better world through truth and integrity – what place
do rigid, opaque and self-interestedly dishonest systems have
in our institutions? How did they get here, and why do we keep
building them?

“As technologists
we retain a naive view of systems as tools to help
us.”
One of the reasons is ideology. In no small way we believe in
systems. For a warning about the future we might look to
history. Despite many political and economic theories, the
sudden fall of the Soviet Union in 1991 remains mysterious.
Misery came as much from technocratic worship of centralised
bureaucracy as from communist ideology. Yet it is seldom noted
how the collapse was hastened by the introduction of computers
in 1990. Could it be that the demise of any ideology is
accelerated once augmented with AI, algorithms and automation?

We know that bureaucracies of Max Weber’s kind exhibit
compound growth of about five percent, but forget that under
Moore’s Law digital systems have grown in complexity a million
times. [6] What was banal but beneficent has been catapulted
way beyond Neil Postman’s Technocracy or even Kafka’s ludicrous
nightmares – by which I mean the imposition of other people’s
values by oblique means. Bad systems create work, push down
responsibility and suck up power.
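
A quick back-of-the-envelope comparison makes the gap concrete.
The figures below are only illustrative assumptions (roughly
five percent compound growth per year for the bureaucracy, and
a doubling of digital complexity every two years over forty
years, as footnote [6] suggests), but the shape of the result
is what matters:

    # Illustrative only: Weberian bureaucracy growing at ~5% a year
    # versus digital complexity doubling every two years (Moore's
    # Law), both compounded over forty years.
    years = 40
    bureaucracy = 1.05 ** years       # roughly 7x
    moores_law = 2 ** (years // 2)    # 2^20 = 1,048,576x
    print(f"Bureaucracy after {years} years: about {bureaucracy:.1f}x")
    print(f"Digital complexity after {years} years: {moores_law:,}x")

Seven-fold growth is a nuisance; million-fold growth is a
different order of problem altogether.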

As technologists we retain a naive view of systems as tools
to help us. In the words of Steve Jobs they are “bicycles for
our minds”. But few minds, even riding Jobs’ bicycle, can
contemplate the distance between Apple’s 1984 Super Bowl advert
and Edward Snowden’s 2013 message. It is the same distance
between Kraftwerk’s “It’s more fun to compute” and “If you’ve
nothing to hide you’ve nothing to fear”. It is no less than the
transition from computers as tools we use, into tools used to
control us.

We’ve come to think of software as Heideggerian technology:
bare utilities to amplify the whims of our mind-body. In a
competitive culture like ours, they soon become weapons ranged
against each other, primed for ideological battle and
information warfare rather than cooperation.

This bleak ‘totalising’ technology of Heidegger is all
around us today, as instrumental systems that act upon us, and
lenses through which we are forced to see the whole world. In
that digital world they are the implementation of policy set
out by power as a means of determining the behaviour of others.
Ceding control of our tools to others lets them limit our
capabilities.

“I think that what
we teach by way of computer science, software engineering,
project management, data and AI technologies, adds up to a
fantasy still rooted in the 1980s, that sees the developer and
“user” as agents creating an “experience”, not as the subjects
of systems that now control them.”
So are we
misunderstanding “systems”? Are we teaching the wrong things
about organisation, structure and planning? My duty as a
sceptical professor is to deeply question the ethics and
purpose of what I teach, lest my graduates only contribute to
world problems.

I think that what we teach by way of computer science,
software engineering, project management, data and AI
technologies, adds up to a fantasy still rooted in the 1980s,
that sees the developer and “user” as agents creating an
“experience”, not as the subjects of systems that now control
them.

That’s why I’ll be assigning the lesser-known writings of
systems theorists Norbert Wiener [7] and Donella Meadows [8] in
a class on computing systems this semester. We’ll ask things
like:

  • What technologies could we get rid of?
  • Which systems have, on balance, been a mistake?
  • If digital mass communication is leading to less truth
    and happiness, how do we gracefully switch it off?
  • What will count as “information” once AI begins to
    generate ceaseless tides of plausible but fake sound, images
    and prose?
  • As research students, how can we be brutally sceptical not
    only of sources, but of the systems we are asked to use?
  • How do we deal with the proliferation of untrustworthy
    systems designed to confuse and betray us for profit?

Questioning our worship of systems permits entrenched
ideologies to be rooted out. Why do we even have such an
obsession with systems?

One fault is that we confuse systems with solutions. Systems
are substitutes for solutions. Solutions may be ways out of
systems, but systems always beget more systems, creating more
problems and demanding maintenance and more resources.

Building new systems is profitable. We talk about a “digital
tech industry” worth trillions of dollars. In addition, the
gadgets and services that flood our planet, while fun, are
addictive, ephemeral and ultimately unsatisfying. Despite a
million-fold increase in speed, no technology is ever fast
enough. Despite dizzying advances in materials science, no
modern gadget is durable beyond several months.

A finite gamut of human activities like checking bus times
or weather, writing a letter, or drawing a picture, hasn’t
changed since the 1970s. The low-hanging “killer apps” of
electronic mail, spreadsheets and databases are long behind us.
What is touted as “new” is rehashed technology with a new spin
on extracting profit. As markets get more crowded the means of
extraction get ever more brutal and invasive.

“Systems impose another insidious effect: they are
totalitarian.”
One branch of now-problematic thinking grew out of the 1970s
project of automation and systems analysis. Coupled with the
logic of efficiency, no human action or decision may exist
where a machine could conceivably replace it.

In some sense, systems represent our unrequited desire for
finality, and a note of Fascism lies therein, as Heidegger
noted (and some claim celebrated). One does not proclaim a
Thousand Year Reich or Grand New Order as a “work in progress”
or a stop-gap project subject to review. Systems promise
certainty and reliability in an uncertain world. As well as
appealing to the authoritarian mind, they temporarily reassure
the anxious and insecure that their needs will be met.

But static structures are a poor response to a dynamic
world. Cybernetic governance and algorithmic societies are a
pale substitute for leadership and statesmanship reflecting a
loss of faith in the human mind. Systems are fleeting models of
a world as we wish it to be, and so all systems are permanently
under attack from outside reality and internally from their own
ceaseless transformation.

Add to this mix the need for economic growth and these
factors add up to systems that are ephemeral yet expansive.
Constantly in a state of turmoil, they reach out to every
corner of life, into our shops, children’s toys, cars and
kitchen appliances, as an always shifting ambivalent force
whose presence and absence we fear equally.

Systems impose another insidious effect: they are totalitarian.
The desire to create uniform, accessible services seems
laudable. But that is the function of standards. Systems
enforce the lowest common denominator of the parochial
implementation, flattening intellectual life and oppressing
difference, diversity and innovation. They represent problems
which, once systemised, are universalised and preserved.
Systems slow down actual progress.

A judge was once asked, “So, what is the best justice
system?”, and replied “There is no best. Only the least worst.
Ideally we would not have any system”. That does not mean we
would have no justice. Only a fool confuses the tool with its
purpose. In political science it is noted that “The English
have a system, which is no system. It’s also a system, only
better”.

Systems of the future (The English Way)

It is time we re-imagine digital technology as a utility
separate from the conceit of “systems”. So, how can we do
that?

It turns out we already looked at this. It happened in the
field of operating systems. These are the programs that make
computers themselves do useful work. Operating systems
underwent a series of radical evolutions in a twenty-year
period between 1960 and 1980.

Learning from the failure of many large monolithic systems
we arrived at the “Unix Philosophy”, which connects principles
of clean software engineering, devolved responsibility, peer
relations, and natural distribution.
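
To give a flavour of that philosophy, here is a toy sketch in
Python (my own caricature of the principle, not anything taken
from Unix itself, and the file name below is only a
placeholder): small, single-purpose parts, each doing one
thing, composed over a simple, transparent interface such as a
stream of text lines.

    # A toy rendering of the Unix Philosophy: each part does one
    # small job, and the parts compose over a common, transparent
    # interface (an iterable of text lines), so any part can be
    # swapped out or reused on its own.
    def read_lines(path):
        with open(path) as f:
            for line in f:
                yield line.rstrip("\n")

    def grep(lines, word):
        return (l for l in lines if word in l)

    def count(lines):
        return sum(1 for _ in lines)

    # Compose the pieces, much as a shell user might write
    # `grep failed system.log | wc -l`.
    print(count(grep(read_lines("system.log"), "failed")))

Each piece can be inspected, replaced or reused without asking
anyone’s permission, which is precisely the point.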

This returns us to an earlier, more general and benign
definition of a system, as an “interacting or interdependent
assemblage of elements organised toward common purposes”. Note
the plurality invoked.

“Their response was
to wind back the clock, to shut it down by replacing user-owned
systems by old-fashioned monolithic systems of command and
control.”
Such a philosophy tends toward small,
reconfigurable, standardised, freely exchangeable and
transparent micro-systems. Emerging in the 1980s, the
principles of Free Software – that the system is owned by, and
directly changeable by, its users – completed a broader
philosophy which sparked the “dot-com” boom, and the entirety
of the Internet, Web and Silicon Valley as we see it today.

A confluence of military budgets, brilliant academic minds
and opportunity for growth in West Coast America circa 1980
parallels the unlikely conditions precipitating the industrial
revolution in 1750s England. Mirroring the latter’s descent
into Dark Satanic Mills, our own revolution has fallen from
grace.

Like capitalism itself, a system able to create so much
wealth became dangerous to those first to amass wealth and
power as its fruits. Their response was to wind back the clock,
to shut it down by replacing user-owned systems by
old-fashioned monolithic systems of command and control.
Through “cloud” technologies we have regressed to the
Mainframes of the 1960s. These exist today in the guise of “Big
Tech” companies like Microsoft, Google, Amazon and Facebook.
Ironically, these have colonised the academic institutions that
gave birth to the very conditions of their growth, stifling the
source of fresh innovation.

Desystemisation

“De-clouding”, “on-prem repatriation”, “de-googling”, “own
clouds”, “low tech”, “digital veganism” … there are many
emerging takes on the countervailing trends, back toward more
humane and people-controlled technology.

I have written extensively, in the Times Higher and
elsewhere, on what I see as the dangers of Big Tech encroaching
into education.

The function of Higher Education is not to pander to
industry as delegated, state-subsidised training schools, but
to challenge and redefine industry, sacrificing its sacred cows
for progress.

“No good university
should impose inflexible one-size-fits-all products from
companies like Microsoft with its Office 365, or Google’s
Orwellian spyware.”
One project I would love to see is
the “zero centralised IT” school or university. It would take
extraordinary courage to create, but is a place I would send my
children in a heartbeat. My time as a computer expert has
taught me there’s much less to be learned through technology
than we are led to think, although it is important to learn
about technology. Can we create learning academies where the
rules are:

  • Technology is for teachers and researchers to
    manage.
  • They can build any internal systems they like, hardware
    or software, to meet teaching and research needs, but it will
    be ephemeral. No grand schemes, empires or impositions on
    others.
  • We will employ well paid, skilled support staff. However,
    the role of “IT” is strictly subservient to the core
    activities of teaching and research. It’s there to support
    and serve.
  • Interoperability and choice are paramount, particularly
    the choice not to partake in any technology or system.

Any such college will excel and set a lasting trend. It will
attract staff who are confident in their digital literacy and
able to work with others on a peer footing, through standards
and mutuality.

For those who value the principles of education and
research, freedom of enquiry, intellectual self-determination,
disputation, and the dialectic between alternative views, the
mission now is to push back at Big Tech and get it out of
education. No good university should impose inflexible
one-size-fits-all products from companies like Microsoft with
its Office 365, or Google’s Orwellian spyware.

The systems we use, and allow to be
used on us, set the limits of our world. Allowing
Big-Tech systems into our universities creates a deflationary
spiral. They are not just the water in which we swim but the
glass of the invisible fish-tank that contains us. Where
technology is concerned, let the English rules apply – the best
system is no system … which is not the same as “no
technology”, but better.

Acknowledgements

Thanks to Edward Nevard, Daniel James and Techrights readers
for helpful comments, corrections and suggestions.

Footnotes:

[1] https://en.wikipedia.org/wiki/Standish_Group

[2] https://www.humanetech.com/

[3] Pablo Azar, “Computer Saturation and the Productivity
Slowdown,” Federal Reserve Bank of New York Liberty Street
Economics, October 6, 2022.
https://static1.squarespace.com/static/5bb2b20316b6405766b4d8a2/t/6335bd37f804834edaa13ae3/1664466233286/MooresLawAndEconomicGrowth.pdf

[4] https://www.thenewatlantis.com/publications/reality-is-just-a-game-now

[5] https://en.wikipedia.org/wiki/Systemantics

[6] Conservatively 1,048,576 times, being twenty powers of two
in forty years.

[7] https://math.tufts.edu/people/featured-profiles/norbert-wiener

[8] https://donellameadows.org/systems-thinking-book-sale/
