Guest post by Dr. Andy Farnell
Summary: Dr. Farnell recounts tales from his time in academia, where infrastructure and software are increasingly being outsourced to companies such as Microsoft.
To avoid parroting the earlier article, I'll summarise these areas briefly and then move on to discuss what has changed.
Disempowering technologies take away legitimate control from their operator. For example, not being able to stop a Windows upgrade at a critical moment. I've quit counting the classroom hours lost to inefficient and dysfunctional Microsoft products that ran amok out of my control while students stared out the window.
A graduate of mine started work on an NHS IT and security support team. I made a joke about Windows running an update during a life-or-death moment in the emergency ward. He looked at me with deadly seriousness and said, "You think that doesn't happen?"
Entitled seizure of the owner’s operational sovereignty is one aspect of disempowerment. Another is discontinuation. The sudden removal of support for a product one has placed at the centre of one's workflow can be devastating. Google notoriously axe services in their prime of life. Weep at the headstones in the Google Graveyard.
Students' education is suddenly "not supported". They bear the risks of software with poor long-term stability - something large corporations seem unable to deliver. By contrast, my go-to editor and production suite, Emacs, is almost 50 years old.
Dehumanising devices that silence discourse operate at the mundane level of issue ticketing and "no-reply emails". But more generally, dehumanising technology ignores or minimises individual identity. It imposes uniformity, devalues interpersonal relations and empathy.
When unaccountable algorithms exclude people from services - because their behaviour is deemed "suspicious" - it is not the validity of the choices that is in question, but the very conceit of abdicating responsibility to machines in order to make an administrator's job more "convenient".
So-called "AI" systems in this context are undisciplined junkyard dogs whose owners are too afraid to chain them. Is it even debatable that anyone deploying algorithms ought to face personal responsibility for harms they cause, as they would for a dog that bites?
In other dehumanising ways, enforced speech or identity is as problematic as censorship or disavowal. So technology fails equally as a drop-down form forcing an approved gender pronoun, or as automatic "correction" of messages to enforce a corporate "speech code". Sorry, computer, but you do not get to "correct" what I am.
Systems of exclusion proliferate on university campuses, which are often treated as private experimental testing-grounds for "progressive" tech. Software developers can be cavalier, over-familiar and folksy in their presumptions. A growing arrogance is the assumption that everyone will choose, or be forced, to switch to their system. Yet if they are anything, universities ought to be a cradle of possibility, innovation and difference. They are supposed to be the place where the next generation of pioneers will grow and go on to overturn the status quo.
That fragile creativity and diversity evaporates the moment anyone assumes students carry a smartphone, or a contactless payment card for the "cashless canteen". Assumptions flatten possibility. Instruments of exclusion always begin as "opportunity". Callous "progressives" insist that students "have a choice" while silently transforming that choice into assumptions, and then assumptions into mandates.
Destroying "inefficient" modes of interaction, like cash and library cards that have served us for centuries, gives administrators permission to lock their hungry students out of the refectory and library in the name of "convenience". They are aided by interloping tech monopolies, who can now limit access to educational services when administrators set up "single sign-on" via Facebook, Microsoft, Google or LinkedIn accounts. Allowing these private companies to become arbiters of identity and access is cultural suicide.
Systems of extraction and rent-seeking are also flourishing in education. Whether that's Turnitin feasting on the copyright in student essays, or Google tracking and monitoring attention via "approved" browsers, then serving targeted advertising. Students are now the product, not the customers, of campus life.
The more we automate our students' experience, the more brutal it gets. Systems of coercion attached to UKVI Tier-4 attendance monitoring seem more like the fate of electronically tagged prisoners on parole. How anyone can learn in that environment of anxiety, where a plane to Rwanda awaits anyone who misses a lecture, is hard to fathom [1].
Gaslighting and discombobulation is psychological warfare in which conflicting and deliberately non-sequitur messages are used to sap morale, undermine confidence and sow feelings of fear, uncertainty and doubt.
That could hardly be a more fitting description of university administrators and ICT services whose constant mixed messages and contradictory policies disrupt teaching and learning.
We must inform all students by email - except where that violates the "bulk mail" or "appropriate use" policies. Staff should be readily available to students, except where it suits ICT to disable mail forwarding. We are to maintain inclusive and open research opportunities, except where blunt web censorship based on common keywords thwarts researchers of inconvenient subjects like terrorism, rape, hacking or even birth control.
Time-wasting technologies are those that force performative make-work and bullshitting activities. They offer what Richard Stallman calls "digital disservices". For example, copying data, row by row, from one spreadsheet to another might be justified in an air-gapped top-secret facility. It is unacceptable where administrators, following a brain-dead "policy", have stupidly disabled copying via some dreadful Microsoft feelgood security "feature". This is the kind of poorly thought-out, "fine-grained", drudge-making security that Microsoft systems seem to celebrate, and the kind of feature that power-hungry, controlling bosses get moist over. It is anti-work. Such a lack of trust is grossly insulting to workers toiling on mundane admin tasks where the security stakes are so low.
Technologies that distract are pernicious in education. Nothing saps learning more than tussling for the attention of students and staff as they try to work. Yet my university-approved Microsoft Office365 running on a Google Chrome browser seems designed to arrest my focus and frustrate all attempts to concentrate. Advertisements and corporate spam have no place in my teaching workflow, so I refuse to use these tools which are unfit for purpose.
Finally, only the military is guilty of more gratuitous waste than academia. To see garbage skips filled to the brim with "obsolete" computers, discarded because they will not run Windows 11, is heartbreaking. Crippled at the BIOS level, they are designated e-waste because IT staff cannot take a simple screwdriver to the machines and remove the hard drives containing potential PII. Meanwhile, students beg me for a Raspberry Pi because they cannot afford the extra hardware needed for their studies.
[1] Except for those overseas students who might appreciate a free flight back home for Christmas.