11.21.22

France Shows Leadership in Banning Microsoft Office 365 and Gulag Workplace

Posted in Europe, Free/Libre Software, Google, Microsoft at 2:19 pm by Dr. Roy Schestowitz

Video download link | md5sum 03513ce64897e9b7a98e1eccc9465fbd
Protecting Children From Foreign Espionage
Creative Commons Attribution-No Derivative Works 4.0

Summary: The French ministry responsible for education has come to accept that spying operations of the United States have no place in French schools, no matter the alleged “cost”

THE very large nation of France, one of the most economically potent countries in Europe, is rejecting what should have been rejected right from the get-go. It’s a big blow to the surveillance ambitions of the United States, and one can hope France will prioritise Free software, not just something “domestic” (it can be both). Dr. Andy Farnell wrote about this weeks ago; he had witnessed the same problem firsthand in the UK.

Quite a few articles in English can be found about this, but a lot more are in French and some were included in our Daily Links several days ago. The short story is, Microsoft Office 365 and Google Workspace are banned for use in schools. Why stop there? Why only schools? The subject is discussed further in the above video.

11.02.22

Google is Banning GNU/Linux Videos and GNU/Linux Channels

Posted in GNU/Linux, Google at 3:23 pm by Dr. Roy Schestowitz

Download link (first 3m:54s; full video)

Summary: It is already widely known that mentioning “Linux” in YouTube titles can get one immediately ‘demonetised’, sometimes deranked/shadowbanned, but it is getting worse as very old videos are being retroactively used to deplatform GNU/Linux proponents (the above is a new example; we saw or heard of more examples in the recent past and sometimes the producers permanently lose the channel or have too small an audience to get noticed)

10.31.22

People Who Adopt Gmail Help Google Attack E-mail in General

Posted in Google, Protocol, Servers at 10:07 am by Dr. Roy Schestowitz

Video download link | md5sum 2f53949551a60a6d9d691f18043d1405
Gmail is Not Email But Attack on Email
Creative Commons Attribution-No Derivative Works 4.0

Summary: Google has become a big problem and Gmail is a massive liability to the global E-mail system; its market share needs to be significantly lowered (the same is true when it comes to Web browsers; therein, whatever Google does becomes a de facto ‘standard’)

THE other day we covered the way Google critics resorted to a partisan framing, basically distracting from Google’s war on independent or small mail relays, the vast majority of which are perfectly legitimate. Gmail is not a framework for delivering E-mail but for rejecting E-mail, usually based on some flimsy process with a corporate bias. Forget about politics.

All this false partisanship is a Public Relations (PR) tactic. Google prefers it that way.

Today we deal with an anecdotal story: “90-95% of the spam I receive originates from servers under Google’s control. Do you guys bother to check outgoing messages, or do you just filter and block incoming messages?”
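
Where the anecdote above mentions spam originating from Google’s servers, one can actually check a message’s Received headers to see which relays handled it. Here is a minimal sketch (not from the article) using only Python’s standard library; the raw message and host names are invented for illustration:

```python
# Sketch: find which relays handled a message by reading its Received
# headers (outermost first). The raw message below is a made-up sample.
from email import message_from_string

raw = """\
Received: from mail-sor-f41.google.com (mail-sor-f41.google.com [209.85.220.41])
        by mx.example.org with ESMTPS id abc123
Received: by mail-sor-f41.google.com with SMTP id xyz
From: someone@gmail.com
To: you@example.org
Subject: hello

Hello there.
"""

msg = message_from_string(raw)

def relay_hosts(message):
    """Return host names named in Received headers, outermost first."""
    hosts = []
    for received in message.get_all("Received", []):
        # A relay host is conventionally introduced by "from " or "by ".
        for keyword in ("from ", "by "):
            if keyword in received:
                hosts.append(received.split(keyword, 1)[1].split()[0])
                break
    return hosts

hosts = relay_hosts(msg)
came_via_google = any(h.endswith(".google.com") for h in hosts)
print(hosts, came_via_google)
```

Running this over a real spam folder (feeding each raw message to `relay_hosts`) would let one tally how much of it transited Google’s infrastructure.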

Google is subjecting everyone else to vastly higher standards than it applies to itself. It is CoC-like double-standard thinking.

There have been similar agonising stories lately.

I myself have long experienced the pain of ISPs (or big American companies) discriminating against mail relays like mine. In fact, at one point I was losing a lot of mail or was unable to respond to mail after a close relative lost a family member. It’s hard to forget the amount of damage this caused, even if that was more than 16 years ago!

E-mail is meant to reliably send mail; but the entrepreneurs behind E-mail (the real ones, not the fraud who threatened me for calling him out) did not envision companies like Google hoarding a lot of the system and then blocking loads of relays without any oversight, let alone independent scrutiny and fines. We need to encourage friends, family, colleagues and other peers to shun centralised E-mail systems; the endgame might be the end of E-mail as an open system.

10.27.22

More Problems With Google’s “Insecure Apps” Alert and SeaMonkey Mail

Posted in Google, Protocol, Security at 3:44 pm by Guest Editorial Team

Reprinted with permission from Ryan

I went to get my email yesterday using SeaMonkey Mail over IMAP.

Google logged me out of OAuth and then SeaMonkey said it failed to fetch my mail.

So I tried to log back in and it said I had an “insecure app” and to try again with another “app”.

After playing around with the User Agent again, I noticed that Firefox 106’s would work. But since Mozilla releases Firefox versions every 6 weeks, and Google will obviously make it impossible to keep logging in with the older version after another week or so, I kept trying User Agents until I found something that would last longer.

It turns out Firefox 102’s user agent doesn’t work for OAuth even though it’s an ESR.

So I decided to fake a Thunderbird “102.12” on “Windows 10” UA.

Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:102.0) Gecko/20100101 Thunderbird/102.12.0

I don’t know if Google logs you out and pops up an “insecure app” alert over minor revisions to Thunderbird, but it’s likely. The current release is actually 102.4 according to the Web site. This 102.12 bogus UA would therefore probably buy me about 8-9 months before I have to come back and bump it again.

You can use this value for a “new string” in about:config:

general.useragent.override.google.com

And that should be the last you hear about Google for a while.

You will obviously have to come back and bump it again sometime next year.

My guess is that when the next major version is out, you can use that, followed by “.12.0”, at the end of the Thunderbird part of the string, but not in the Gecko version.
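
If you prefer to keep the override out of about:config clicking, the same pref can be set from a user.js file in the SeaMonkey profile directory. This is a sketch, assuming the domain-specific general.useragent.override.* pref behaves as described above:

```js
// user.js in the SeaMonkey profile directory (a sketch; assumes the
// domain-specific general.useragent.override.* pref works as described above)
user_pref("general.useragent.override.google.com",
          "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:102.0) Gecko/20100101 Thunderbird/102.12.0");
```

The file is read at startup, so bumping the version later is a one-line edit rather than another trip through about:config.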

OAuth is turning into a major usability disaster and there’s not any guarantee that simple UA hacks will keep SeaMonkey working. Google could actually resort to testing browser features that it knows are only in the latest “supported” applications.

E-mail is Simple, Secure and Robust If Decentralised and Treated as Text, Not as Web Pages (Also, Webmail is Regressive)

Posted in Google, Protocol, Security at 5:56 am by Dr. Roy Schestowitz

Video download link | md5sum 9f1cb3590e5f61ccf065a38724f37fe0
E-mail is Not Web Pages
Creative Commons Attribution-No Derivative Works 4.0

Summary: E-mail has become a sordid mess; even though E-mail predates the Web by far, what many folks refer to as E-mail is nowadays basically some Web site, and the messages are in fact Web pages, not text

LAST night we wrote about Google's ongoing attack on E-mail, both as a protocol and as a service. Think of Gmail as an attack on E-mail, much in the same way that GitHub is an attack on Git. Gmail is not E-mail. Protocols and obstacles (for relays to please) are added all the time, and when it comes to reading E-mail, nowadays people are expected to use “apps” that can’t even cope with plain-text E-mail (which is how Git is traditionally managed; Bugzilla can be the same). There’s an effort to herd people into sites like Googlemail/Gmail and Microsoft/GitHub, presenting them with JavaScript (typically proprietary) disguised as pages instead of simple text. This is bad for usability, security, and all sorts of other basic ‘fitness-for-purpose’ criteria. Ask a blind person about the accessibility of GitLab, which became little more than a big pile of JavaScript.

The video above concerns this article about webmail problems. It offers “many reasons not to use ‘webmail’ in place of e-mail — a case for commodity services, open standards, and free choice of client software,” as our associate put it this morning.

E-mail is a simple protocol that predates the Web. Let’s not impose HTML and the Web on E-mail users. We’re ruining things.
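
The point about text versus Web pages can be made concrete with Python’s standard library: a message composed purely as plain text, with no HTML alternative part. The addresses and relay host below are hypothetical:

```python
# Sketch: an E-mail composed purely as text/plain, with no HTML part.
# Addresses and the relay host are hypothetical.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "Plain text is enough"
msg.set_content("Hello Bob,\n\nNo HTML, no remote images, no scripts.\n\n-- Alice\n")

# set_content() with a str defaults to a single text/plain part
print(msg.get_content_type())

# Sending it would look like this (left commented; the relay is made up):
# import smtplib
# with smtplib.SMTP("mail.example.org") as s:
#     s.send_message(msg)
```

Nothing more is needed for a message that any client since the 1980s, or a blind person’s screen reader, can handle.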

References:

  1. Military Intelligence? DoD finally (FINALLY) bans HTML e-mail | TechRepublic
  2. Why NOT to use HTML in e-mail
  3. HTML e-mail not worth the risk | Network World
  4. Why HTML in E-Mail is a Bad Idea
  5. The Ascii Ribbon Campaign official homepage
  6. 7 reasons why HTML email is a bad thing

10.26.22

Don’t Outsource Schools and Universities to Pentagon-Subsidised Agents of Oppression

Posted in Free/Libre Software, GNU/Linux, Google, Microsoft at 9:28 pm by Dr. Roy Schestowitz

Video download link | md5sum 8332488ce16dcd0395e8d9097324b2bb
Outsourcing of Schools
Creative Commons Attribution-No Derivative Works 4.0

Summary: Technology has turned from a tool of enablement and emancipation into a facilitator of oppression and subjugation over those “using” it (or being used, remotely, by the real owners)

AS we’ve just noted, today's main article was a guest article about education centres being outsourced and further complicated for no real purpose other than graft.

There is an asymmetric relationship in today’s computers/computing systems; the people who sit in front of the computers (e.g. “smart” phones) barely control them. They’re mostly being controlled, but they’re under the impression that’s not the case. Richard Stallman has been warning about this for a very long time and we’re meant to ignore him, based on a campaign of defamation.

Either way, the video above goes through today’s long article (in Gemini) and adds more personal thoughts. We hope to be covering those sorts of issues a lot more often in the future.

Political Bias is a Distraction From Google’s Abuse of Power Over E-mail

Posted in Deception, Google at 7:56 pm by Dr. Roy Schestowitz

Video download link | md5sum 649075c605903b8625d94f16c8093b26
Google as E-mail Cop
Creative Commons Attribution-No Derivative Works 4.0

Summary: It’s easy to get distracted by the media and think that Google’s manipulation of E-mail traffic is a political rather than a corporate issue

THE E-mail system as a decentralised system is under attack. Life is getting a lot more difficult for those wishing to use E-mail without outsourcing to companies like Microsoft and Google.

To put things in perspective, Microsoft isn’t even commanding the market anymore (Hotmail is a fossil and Outlook/Exchange are systems for losing mail and getting cracked). Here’s one recent graph (biased by demography):

Email Client Market Share

When we last checked (about a year ago), Microsoft only had about 2% market share in E-mail, so let’s focus on Google instead.

The video above explains how the media frames it, but the real issue is not that “Google’s spam filter is blocking spam,” to quote an associate. “That is a red herring. The more serious problem is that Google’s spam filter is blocking nearly all independently operated mail servers.”

“Here is one case study” (and another).

With further complications being added, older E-mail clients cannot keep up and sometimes they’re shunned completely. The centralisation of E-mail is bad for a whole bunch of reasons. “Oversight, surveillance, and (in the case of employers) micromanagement,” an associate noted, adding: “With the advent of Microsoft Outlook, it’s not only insecure but also highly unreliable and 10%-20% of messages for Outlook/Exchange go missing.”

The thing not to get distracted by is stuff like this or that, framing it in the context of political parties and orientations.

More needs to be said about — and against — E-mail consolidation and monopolies/oligopolies, because several institutions outsource their E-mail, even some governments. Our associate speaks of “the Appeal To Novelty (argumentum ad novitatem)”: the drones in purchasing and the suits in the C-suite evaluate software based on very few criteria other than the version number. Recall that in the NT vs NetWare days Microsoft jumped the version up to have a higher number than the competition, then renamed it to “2000”.

This is a separate but related issue that’ll be addressed in today’s fifth video (the above is the first of five).

How Digital Systems Fail Our Institutions

Posted in Free/Libre Software, Google, Microsoft at 8:16 am by Guest Editorial Team

By Dr. Andy Farnell

Classroom

Average Joe “just wants stuff to
work”. He goes along with whatever technology is placed in
front of him. For Joe, geeks fighting religious battles over
technology are a curious spectacle. As a dispassionate
pragmatist, he mistakes fervour for pedantry. He cannot see the
serious ideological fault-lines within technology which will
determine how we live, work and build societies together.

Joe is happy to be dubbed “a user”, a term otherwise applied
to drug addicts and insincere friends, despite actually being
the person who is getting used. For him, “algorithms in the
cloud” and other nonsensical tropes stand in for meaningful
explanations of how his life is run by invisible others.

Joe once thought that things are run by the government he
voted for, based on reliable facts he read in the press,
carefully weighed in his clear mind, itself the product of an
unbiased education. He believes in these
institutions, whose functions underwrite his
existence.

But Joe’s life is now determined by “digital infrastructure”
increasingly concentrated in giant data-centres, under the
control of unelected, profit-seeking organisations. Joe is a
victim of what we will simply refer to as “systems”.

A “system” can be defined as cybernetic, ecological,
biological, social, political or operational. But, to use words
as ordinary people mean them, a system is “increased work and
stress I won’t have any choice about, and won’t get paid for”.
Systems are ever-expanding, hostile impositions. Systems are a
failure of engaged, humanistic, liberal democratic life.

Systems turn bad

The words “New System” strike dread into the heart of any
employee. Big organisations make ideal testing grounds for
inhumane systems. For example, the scandal associated with
Cambridge Analytica was really no more than a failed
research project in data science, whose implications and ethics
scared the crap out of the public. It was possible only because
of a system, the “walled garden” called Facebook, fleecing 50
million people of their data. Since then nothing has changed
and the commodification of surveillance data for influence has
been normalised.

At a more mundane, everyday level,
institutions hold a captive audience of guinea-pigs. In
academia it is students and staff, on whom we can run
algorithms and experiments by decree of “policy”, thus avoiding
messy ethics and scrutiny real researchers would endure.

Each September in universities, untested systems go live as
administrators and students return to do battle over workflows
and control of resources. As timetables shift and slip into
place, students scour campus corridors for elusive lecture
rooms. Many hours of teaching will be lost as access systems,
attendance registers, login portals, classroom AV, and
assessment tools grumble and grind, then fail. Everyone will be
beaten into compliance, under veiled threats alluding to
“necessary regulation”, “best security practices” and “higher
powers” and so on. It is the will, not of any identifiable
tyrant, but of “the system”.

No door remains unprotected by card access, no classroom or
corridor free of motion detection, face-recognition, CCTV, and
no computer accessible except through a tedium of slow,
draconian, security processes. Arbitrarily, at any time and
without warning, centralised IT are free to alter systems and
“policies” that underwrite them. They can move web pages,
change login processes, block emails, remove services, target
groups or individuals within a panopticon and labyrinth that
would be the envy of B. F. Skinner, famed tormentor of
rats.

We live with this because we have been conditioned to it, as
rats who have forgotten life before the maze. Fifty years of
believing computers are “necessary” has etched its mark. Of
course systems are there “to help us”. They offer
“convenience”. And foremost, they provide “security”, that
elusive quality we are constantly told we need, but somehow
never feel we have. During thirty years of teaching, I’ve seen
many systems introduced. The chilling effect on the engagement,
openness and curious spirit of students has been palpable.
Systems inhibit. Systems disable.

However, this is all fascinating for me, as a computer
scientist and systems theorist, because I’ve had a perfect
environment to study the damaging effects of encroaching
systems on real people trying to do simple, timeless activities
like teach and learn.

The unsurprising CHAOS report of 2018 [1]
tells us “most information systems fail”. They deliver
less certainty, less reliably and less accessibly. Five minutes
using any major search engine should convince you, the game is
no longer to deliver information on request, but to extract it
from you. Search is just one example of how many
technologies today are distorted and broken, operating with
perverse incentives and hidden agendas counter to the wellbeing
of their “users”.

But even the systems we pay for work against us. The
unintended consequence of the machinery to deliver cheap, fast,
efficient, uniform, accountable, secure education leads, in
totality, to catastrophic cost for university students and
professors.

It doesn’t have to be this way of course. The promise of the
“information age” envisioned by optimistic pioneers of the 60s
and 70s, still lurks beneath the surface of society, frustrated
and itching to emerge.
Techrights has been holding a torch to abusive technology
for decades. Today it is joined by new projects like
The Center for Humane Technology [2]
and hundreds of prominent thinkers trying to reform
technology against the big-money interests of Microsoft, Google
and the like.

How did we get so lost in counterproductive technology? It
is perplexing because we have cheaper and more powerful
computers than ever. Software is, for the most part, less buggy.
Yet each year our every-day experience of technology worsens.
We wait longer, feel more frustrated, more scrutinised and
bullied by tech, and are less productive. A new paper by Pablo
Azar of the Federal Reserve Bank of New York [3]
notes how “computer saturation” lies at the heart of
productivity slowdown. We have too many computers for our own
good now. We’re at “peak tech”.

As a
computer scientist I’m horrified by what I see in educational
tech. Our helpless dependency perfectly tracks de-skilling and
outsourcing to unaccountable cloud providers and “algorithms”.
As a teacher of technology, technology is now the reason I want
to leave teaching. A karmic reckoning perhaps. Each semester I
watch it harm our students’ learning experience and feel less
able to be the humane, generous, engaging mentor I’d like to
be. To my surprise, my experiment with teaching computer
science using nothing but the benign technologies of a
whiteboard pen and £25 Raspberry Pi is an astounding success,
loved by all the students. It seems ever clearer that the
university, other than as a physical meeting space, has nothing
to offer.

Browbeaten by systematic, institutional technology I’ve
witnessed students in tears because opaque “systems” have
miscalculated grades, wrongly accused them of plagiarism,
overcharged them, cut-off their internet, evicted them from
accommodation, confused them with other students, lost
assignments…

Most corrosive is the sense of helplessness. Regardless of
how willing, attuned, tactful, or experienced a professor may
be, having to say “there’s nothing I can do, the system won’t
let me”, is galling.

Obstructive as broken systems may be, it is the fervour of
their apologists that saddens me more. Edu-tech zealots simply
cannot hear that students “just want engaging in-person
teaching”. For them, ever more centralised learning systems,
omniscient portals, blended fusion centres, and AI augmented VR
technologies are the only way forward. They are enchanted.

It’s said that people don’t leave bad jobs, they leave bad
bosses. I think
people leave bad systems. You can argue with a bad boss,
but not a bad system. A perfect system retains the calm tone
and unblinking red eye of Arthur C. Clarke’s HAL computer, even
as it destroys itself and those around it. It is the Microsoft
system that, defiantly and against your will, updates itself to a
“better” Windows version, and then crashes to a halt
complaining your computer is not powerful enough. Nobody
deserves any person or thing so chaotic and insolent in their
life, and is wise to separate.

I firmly believe the precarious mental health of students is
directly attributable to the brutality of systems they face
daily. We’ve driven out humiliation and the cane from schools
only to create new forms of technological violence under the
pompous auspice of “preparing them for reality” – a
technological reality that, for Jon Askonas writing in The New
Atlantis, is “just a game now”. [4]

Why we persist with bad systems

“Over-systemisation” is not news. John Gall’s “Systemantics” [5]
describes man’s struggle against himself through the
folly of systems. They are, “solidified resistance to change”
and, in Nietzsche’s words, the “will to a lack of integrity”.
And so we must ask – since universities are about changing
minds and seeking a better world through truth and integrity –
what place do rigid, opaque and self-interestedly dishonest
systems have in our institutions? How did they get here, and
why do we keep building them?

One of the reasons is ideology. In no small way we
believe in systems. For a warning about the future we
might look to history. Despite many political and economic
theories, the sudden fall of the Soviet Union in 1991 remains
mysterious. Misery came as much from technocratic worship of
centralised bureaucracy as communist ideology. Yet it is seldom
noted how collapse was hastened by the introduction of
computers in 1990. Could it be that the demise of
any ideology is accelerated once augmented with AI,
algorithms and automation?

We know that bureaucracies of Max Weber’s kind exhibit
compound growth of about five percent, but forget that under
Moore’s Law digital systems have grown in complexity a million
times. [6]
What was banal but beneficent has been catapulted way
beyond Neil Postman’s Technocracy or even Kafka’s ludicrous
nightmares – by which I mean the imposition of other people’s
values by oblique means. Bad systems create work, push-down
responsibility and suck-up power.

As technologists we retain a naive view of systems as tools
to help us. In the words of Steve Jobs they are “bicycles for
our minds”. But few minds, even riding Jobs’ bicycle, can
contemplate the distance between Apple’s 1984 Superbowl advert
and Edward Snowden’s 2013 message. It is the same distance
between Kraftwerk’s “It’s more fun to compute” and “If you’ve
nothing to hide you’ve nothing to fear”. It is no less than the
transition from computers as tools we use, into tools used to
control us.

We’ve come to think of software as Heideggerian technology;
bare utilities to amplify the whims of our mind-body. In a
competitive culture like ours, they soon become weapons ranged
against each other, primed for ideological battle and
information warfare rather than cooperation.

This bleak ‘totalising’ technology of Heidegger is all
around us today, as instrumental systems that act upon us, and
lenses through which we are forced to see the whole world. In
that digital world they are the implementation of policy set
out by power as a means of determining the behaviour of others.
Ceding control of our tools to others lets them limit our
capabilities.

So are we
misunderstanding “systems”? Are we teaching the wrong things
about organisation, structure and planning? My duty as a
sceptical professor is to deeply question the ethics and
purpose of what I teach, lest my graduates only contribute to
world problems.

I think that what we teach by way of computer science,
software engineering, project management, data and AI
technologies, adds up to a fantasy still rooted in the 1980s,
that sees the developer and “user” as agents creating an
“experience”, not as the subjects of systems that now control
them.

That’s why I’ll be assigning the lesser-known writings of
systems theorists Norbert Wiener [7] and Donella Meadows [8]
in a class on computing systems this semester. We’ll ask
things like:

  • What technologies could we get rid of?
  • Which systems have, on balance, been a mistake?
  • If digital mass communication is leading to less truth
    and happiness, how do we gracefully switch it off?
  • What will count as “information” once AI begins to
    generate ceaseless tides of plausible but fake sound, images
    and prose?
  • As research students how can we be brutally sceptical not
    only of sources, but the systems we are asked to use?
  • How do we deal with the proliferation of untrustworthy
    systems designed to confuse and betray us for profit?

Questioning our worship of systems permits entrenched
ideologies to be rooted-out. Why do we even have such an
obsession with systems?

One fault is that we confuse systems with solutions. Systems
are substitutes for solutions. Solutions may be ways out of
systems, but systems always beget more systems, create more
problems, needing maintenance and more resources.

Building new systems is profitable. We talk about a “digital
tech industry” worth trillions of dollars. In addition, the
gadgets and services that flood our planet, while fun, are
addictive, ephemeral and ultimately unsatisfying. Despite a
million-fold increase in speed, no technology is ever fast
enough. Despite dizzying advances in materials science, no
modern gadget is durable beyond several months.

A finite gamut of human activities like checking bus times
or weather, writing a letter, or drawing a picture, hasn’t
changed since the 1970s. The low-hanging “killer apps” of
electronic mail, spreadsheets and databases are long behind us.
What is touted as “new” is rehashed technology with a new spin
on extracting profit. As markets get more crowded the means of
extraction get ever more brutal and invasive.

One branch
of now problematic thinking grew out of the 1970s project of
automation and
systems analysis. Coupled with the logic of efficiency,
no human action or decision may exist where a machine could
conceivably replace it.

In some sense, systems represent our unrequited desire for
finality, and a note of Fascism lies therein, as Heidegger
noted (and some claim celebrated). One does not proclaim a
Thousand Year Reich or Grand New Order as a “work in progress”
or stop-gap project subject to review. Systems promise
certainty and reliability in an uncertain world. As well as
appealing to the authoritarian mind they temporarily assuage
the anxious and insecure that their needs will be met.

But static structures are a poor response to a dynamic
world. Cybernetic governance and algorithmic societies are a
pale substitute for leadership and statesmanship reflecting a
loss of faith in the human mind. Systems are fleeting models of
a world as we wish it to be, and so all systems are permanently
under attack from outside reality and internally from their own
ceaseless transformation.

Add to this mix the need for economic growth and these
factors add up to systems that are ephemeral yet expansive.
Constantly in a state of turmoil, they reach out to every
corner of life, into our shops, children’s toys, cars and
kitchen appliances, as an always shifting ambivalent force
whose presence and absence we fear equally.

Systems impose another insidious effect, being totalitarian.
The desire to create uniform, accessible services seems
laudable. But that is the function of standards. Systems
enforce the lowest common denominator of the parochial
implementation, flattening intellectual life, oppressing
difference, diversity and innovation. They represent problems
which once systemised are universalised and preserved. Systems
slow down actual progress.

A judge was once asked, “So, what is the best justice
system?”, and replied “There is no best. Only the least worst.
Ideally we would not have any system”. That does not mean we
would have no justice. Only a fool confuses the tool with its
purpose. In political science it is noted that “The English
have a system, which is no system. It’s also a system, only
better”.

Systems of the future (The English Way)

It is time we re-imagine digital technology as utility
separate from the conceit of “systems”. So, how can we do
that?

It turns out we already looked at this. It happened in the
field of operating systems. These are the programs that make
computers themselves do useful work. Operating systems
underwent a series of radical evolutions in a twenty year
period between 1960 and 1980.

Learning from the failure of many large monolithic systems
we arrived at the “Unix Philosophy”, which connects principles
of clean software engineering, devolved responsibility, peer
relations, and natural distribution.

This returns us to an earlier, more general and benign
definition of a system, as an “interacting but interdependent
assemblage of elements organised toward common purposes”. Note
the plurality invoked.

Such a philosophy tends toward small,
reconfigurable, standardised, freely exchangeable and
transparent micro-systems. Emerging in the 1980s, principles of
Free Software – that the system is owned, and is directly
changeable by its users – completed a broader philosophy which
sparked the “dot-com” boom, and the entirety of the Internet,
Web and Silicon Valley as we see it today.

A confluence of military budgets, brilliant academic minds
and opportunity for growth in West coast America circa 1980,
parallels the unlikely conditions precipitating the industrial
revolution in 1750s England. Mirroring the latter’s descent
into Dark Satanic Mills, our own revolution has fallen from
grace.

Like capitalism itself, a system able to create so much
wealth became dangerous to those first to amass wealth and
power as its fruits. Their response was to wind back the clock,
to shut it down by replacing user-owned systems by old
fashioned monolithic systems of command and control. Through
“cloud” technologies we have regressed to the Mainframes of the
1960s. These exist today in the guise of “Big Tech” companies
like Microsoft, Google, Amazon and Facebook. Ironically, these
have colonised the academic institutions that gave birth to the
very conditions of their growth, stifling the source of fresh
innovation.

Desystematisation

“De-clouding”, “on prem repatriation”, “de-googling”, “own
clouds”, “low tech”, “digital veganism” … there are many
emerging takes on the countervailing trends, back toward more
humane and people-controlled technology.

I have written extensively, in the Times Higher and
elsewhere, on what I see as the dangers of Big Tech encroaching
into education.

The function of Higher Education is not to pander to
industry as delegated, state-subsidised training schools, but
to challenge and redefine industry, sacrificing its sacred cows
for progress.

One project I would love to see is
the “zero centralised IT” school or university. It would take
extraordinary courage to create, but is a place I would send my
children in a heartbeat. My time as a computer expert has
taught me there’s much less to be learned
through technology than we are led to think, although it
is important to learn
about technology. Can we create learning academies where
the rules are:

  • Technology is for teachers and researchers to
    manage.
  • They can build any internal systems they like, hardware
    or software, to meet teaching and research needs, but it will
    be ephemeral. No grand schemes, empires or impositions on
    others.
  • We will employ well paid, skilled support staff. However,
    the role of “IT” is strictly subservient to the core
    activities of teaching and research. It’s there to support
    and serve.
  • Interoperability and choice are paramount, particularly
    the choice
    not to partake in any technology or system.

Any such college will excel and set a lasting trend. It will
attract staff that are confident in their digital literacy and
able to work with others on a peer footing, through standards
and mutuality.

For those that value the principles of education and
research, freedom of enquiry, intellectual self-determination,
disputation, and the dialectic between alternative views, the
mission now is to push back at Big Tech and get it out of
education. No good university should impose inflexible
one-size-fits-all products from companies like Microsoft with
its Office 365, or Google’s Orwellian spyware.

The systems we use, and allow to be
used on us, set the limits of our world. Allowing
Big-Tech systems into our universities creates a deflationary
spiral. They are not just the water in which we swim but the
glass of the invisible fish-tank that contains us. Where
technology is concerned let the English rules apply – the best
system is no system … which is not the same as “no
technology”, but better.

Acknowledgements

Thanks to Edward Nevard, Daniel James and
Techrights readers for helpful comments, corrections
and suggestions.

Footnotes:

1. https://en.wikipedia.org/wiki/Standish_Group
2. https://www.humanetech.com/
3. Pablo Azar, “Computer Saturation and the Productivity Slowdown,” Federal Reserve Bank of New York Liberty Street Economics, October 6, 2022: https://static1.squarespace.com/static/5bb2b20316b6405766b4d8a2/t/6335bd37f804834edaa13ae3/1664466233286/MooresLawAndEconomicGrowth.pdf
4. https://www.thenewatlantis.com/publications/reality-is-just-a-game-now
5. https://en.wikipedia.org/wiki/Systemantics
6. Conservatively 1,048,576 times, being twenty powers of two in forty years.
7. https://math.tufts.edu/people/featured-profiles/norbert-wiener
8. https://donellameadows.org/systems-thinking-book-sale/
