Bonum Certa Men Certa

Cybersecurity is a structural, not a behavioural, problem.

posted by Roy Schestowitz on Jun 01, 2024,
updated Jun 01, 2024


Reprinted with permission from Cyber|Show.

Author: Dr. Andy Farnell

Figure 1: "Trickle down insecurity"

There's a bad idea at the heart of corporate models of cybersecurity. It leads to an endless, and mostly pointless, cycle of poor-quality remedial or "naughty step" training. This puts workers, who ought to need no operational knowledge of system security, onto a merry-go-round of failure and re-training. It is costly, and wrong.

It's the belief that systems are essentially correct, but that behavioural problems lie with operators. Where have we seen this most prominently? In the Post Office Horizon scandal, of course.

Some of you may already be familiar with phishing simulations carried out by employers against staff. Those who fail get sent on a training programme, and are often deliberately humiliated or even fired.

Reverse psychology

There are a number of things very wrong with this:

Firstly, and most shockingly, there's no actual evidence that putting cohorts through anti-phishing training really improves things. Or at least, there is a lower bound: in any phishing attack a small but seemingly fixed proportion of people will click. That's because the human factors are not purely rational or controllable.

For example, the real reason an employee keeps hitting phish emails may be that they are under extreme pressure to clear an inbox with thousands of outstanding items and only twenty allocated minutes per day to deal with communication backlog. There is simply not enough cognitive space to deal with that problem. It is a problem of working conditions and load.

After returning from "naughty step training" they go back to the same inbox - now with even more outstanding work - and immediately make the same mistakes. What should really happen when such an employee fails a phish test is a full workload review, rate limiting, and a declaration of "email bankruptcy" - where the inbox is simply cleared.

Entrapment by a trusted party is certain to destroy positive psychological relationships. It leads to abusive environments that set employees up to fail in order to send them on ineffective training before being thrown back into the same environment without any effective tools to change their behaviour.

This in turn harms security because it erodes trust in the IT team, who become a source of fear rather than support. In the absence of any better security tactics these tests become entrenched in the security culture of a company, which starts to rely on them as "bad employee honeypots".

Let's look more closely and see what the problem really is:

As we can see, the employee training is only one part of the picture. And, as we shall shortly see, that's not really their fault at all.

Crappy code

To a good approximation most commercial software is rubbish. You don't need to take only my opinion on it. Ian Sommerville, the world expert in Software Engineering who literally wrote the book, recently said after 40 years leading the field that quality software was a failed project. Ross Anderson, the leading light in Security Engineering and Security Economics, has pointed out the multiple ways the software industry runs on negative externalities, has massive principal-agent problems, and has a necessary interest in placing time to market and network lock-in above security in every strategic analysis.

As Anderson put it, "When Alice relies on Bob's software for her security, but Alice pays the cost for Bob's failure, Bob has no incentive to fix any problems."

What makes it much worse is that individuals and companies rely on a small number of monopolists (Google, Apple, Microsoft, Amazon) who offer seemingly "free" services. In reality their software is not free but takes your data to sell. In order to do that it is deliberately insecure. Indeed, the incentives to write secure commercial software are so bad that governments around the world are having to draft far-reaching regulation to force companies to do it. And even that may not work, because, as we have seen with all these companies, Big Tech considers itself to be above the law.

The problems really break down into technical, economic and policy:

Amongst the technical problems are:

Broken economies

From an economic point of view, a major cause is the skills shortage. Education is a positive public externality whose cost is avoided by giant companies who pay little or no tax. It is a threat to their monopoly.

It seems to make more sense for businesses to use low quality products from big vendors like Microsoft than to invest in more expensive, high quality - but difficult to configure - solutions that are secure. This has side effects. The real, emerging skills gap in cybersecurity is not in front-line employee training but a dearth of capable system administrators and policy makers.

Cloud computing encouraged companies to outsource trust and responsibility for security. Basic skills like system configuration, maintenance, auditing, on-prem customisation and support have declined in favour of outsourced one-size-fits-all monoliths that are externally managed. Fewer companies are capable of even simple things like setting up and running their own email server now.

Put simply: we no longer have the smart people who know about computers. They all went to work for Google and Microsoft. This is perhaps a hidden danger of monopolies that politicians, focused only on the money side of "markets", do not see or understand.

Potty policies

Lastly, let's pick an example from the many policy problems.

Just because someone decides on an "IT security policy" doesn't mean it is 'correct', or, more to the point, even workable. Many IT policies contain contradictions, poor reasoning, or simply stop employees from doing their jobs. They represent internal power divides within firms, and the tendency of ICT services to suffer scope creep and become totalitarian.

A big problem starts with hiring policies. The assumption of prior training is pernicious. Everyone learns to use Microsoft Word at school, right? Wrong! What we call "Basic IT literacy" began in the 1980s as a way to boost the competitiveness of the Western workforce. Kids learned BASIC and how computers work as part of primary and secondary education. It was cool. It was the future. Engagement was high and the skills enduring.

After the mid 90s and into this century the quality of that education plummeted. Microsoft and Google infiltrated the school system and IT education became dumbed-down classes in Word and Excel without any appeal to young minds.

Today most employers wrongly assume that people have "Basic IT skills" on which they can rely. For employers this assumption is an invisible externality. In fact most 20 year-olds arrive at their first job having forgotten anything useful they picked up at school - and it is almost certainly out of date anyway.

Millennial generations (Y-Z) learn new technologies on the fly as needed. These technologies are ever changing. No version of, for example, Microsoft "365" looks anything like the last, and the functional behaviour is constantly moving. Why invest personal time and effort in learning something that will change next week?

Besides, it benefits Big Tech and the education system to keep system interfaces in constant turmoil. The tech companies get to appear to be offering something new, and the training sector gets an ever-fresh demand for re-training and issuing low level competency certificates. And who are the biggest players in that educational market now? Why, Google and Microsoft of course. Standard, durable IT skills in generic principles rather than products are eschewed to keep this circus running.

Not safe for work

In many cases the software chosen by companies is inappropriate for the workflow and company security. We say "chosen", but in fact it is just an arbitrary default from a BigTech supplier. For example the average web browser is a dumpster fire when it comes to security. Google Chrome browsers leak confidential information, and most browsers run dangerous JavaScript - which administrators wrongly assume is "necessary" - and have poor privacy settings out of the box. Browser companies have been caught breaking privacy promises, fingerprinting and tracking users.

In many cases an employee does not need a full browser or even full access to the Internet. A "captive portal" built around a kiosk mode browser that runs a single web application would suffice. In many cases they do not even need to read email as part of work, yet are issued an email account by default "for administrative reasons". Instead, an internally secure pull rather than push system of inter-departmental communication would work much better.
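As a minimal sketch of the kiosk approach: the launcher below locks a browser to a single internal web application. The intranet URL is a placeholder, and the flags shown are standard Chromium command-line switches; this is an illustration, not a complete lockdown.

```shell
#!/bin/sh
# Sketch of a "captive portal" style browser session: full screen,
# no tabs or address bar, no persistent profile. The URL below is a
# placeholder for whatever single web application the role needs.
APP_URL="https://intranet.example.internal/app"

# --kiosk        : full-screen mode with no browser chrome to wander off with
# --incognito    : no persistent history, cookies or cache between sessions
# --no-first-run : suppress first-run setup dialogs
exec chromium --kiosk --incognito --no-first-run "$APP_URL"
```

A real deployment would pair this with OS-level restrictions (a dedicated user account, no shell access, firewall rules limiting egress), since a browser flag alone is not a security boundary.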

Browsers are some of the most bloated and unpredictable pieces of software. They are extensible via plugins, which can bring all kinds of gains but risks too. Integrated applications such as Jira, Office-365 and GoogleDocs are packed with features - so many, in fact, that they are overwhelming, unnecessary and a security risk. What we get with these flexible 'standardised tools' is a poor alignment of user capabilities with job descriptions. Indeed, jobs are often ill-defined and suffer from scope creep and make-work pressures that are the root causes of cybersecurity problems. Clearly these are issues that lie with management.

Terrible training

Finally, let's make some not so flattering observations on the quality of remedial cyber-training itself.

Most courses are bulk-purchased by large employers at a standardised rate per seat. To minimise productivity impact they are finely chunked, video-based training with form-based quizzes designed to be digested "during the lunch hour" - that is, completed on top of an existing workload. Students are distracted, not fully present, and just resentfully going through the motions to get the punishment over with. These are the worst possible psychological conditions for learning, and we can realistically expect none of it to stick at all.

Online training videos are mostly space-fillers. In order to make money for the training company they are padded with endless introductions stating over and over what this video is going to teach you, how, and in what order. By the time a student gets to the first chunk of actual knowledge, usually in the second or third video, they're dispirited and tired. Scenes of expensive-looking stock footage of city skylines accompany tedious, puffed-up credentialising explaining how the video series is better than others, because it's from "internationally recognised" institutions and experts.

After throwing in some bold claims about the "total coverage" of the course, and how this is the "only video you'll ever need" (despite the subject being enormous and ever-changing), we begin with meaningless diagrams made of random clip-art, graphs without labels or axes, and AI-generated cartoons accompanying an incongruous robotic voice-over. These videos serve platitudes and gushing enthusiasm for ubiquitous technology, bleating learned helplessness about technological dependency and theatrical fear-mongering about cyber threats. They are justifications for poor cybersecurity, not authentic attempts to mitigate it. They are "all fur coat and no knickers".

Computer generated voices are in fashion again (because AI reasons), but these so-called amazing advances in "lifelike AI voices" only make cheap production values seem excusable. I find myself grateful for the rude punctuation of gauche, jarring edits and mispronunciations, as they are the only things that keep me awake. Even the worst human narrator does not send you to sleep in 30 seconds with an irritating monotone of cheap corporate dirge read flatly from a script.

Where there is synthetic expression it is disorienting and cartoonish. I feel like a child being talked down to by an over-enthusiastic special needs teacher fresh from the empathy training course. Yes, I know that the black hoodie and balaclava-clad figure set against a Matrix backdrop of random green-screen symbols is supposed to be a "bad actor" - and that the cowering Penelope Pitstop character is the "victim" - without two octaves of pitch variance to emphasise that point. Infantilising cybersecurity narratives serve nobody.

Recommendations

Let's stop with the idea that "cybersecurity" can be bolted on as an afterthought for ordinary employees, and that adopting punitive, remedial attitudes is any way to accomplish that.

We're sending the wrong people on the training courses; it isn't helping security and it isn't going to. Those attending training courses should be senior IT managers and policy makers. They should be getting a proper university-level education in the complexities of cybersecurity ecosystems, security engineering and economics.

We need them making better, and bolder choices about the IT structure of our companies, and not taking their cues from BigTech sales reps.

At present we have what I'll call trickle down insecurity. BigTech companies make a profit by pushing insecurity down onto smaller businesses. Those firms who make poor IT decisions push that pain down on to their employees. And the employees, in turn, transfer loss and misery to the general public or other business customers they serve.

In order to make workplaces safe for employees, for the companies that employ them and for the economy of our country we need a radical shake-up of how cybersecurity education is provisioned and delivered, and what its aims are.
