Dr. Andy Farnell on Marketing Bad Things Like Slop Using FOMO (Fear of "Being Left Behind")
He took the bait just one month after his employer took bribes from Microsoft to promote slop ("AI") [1, 2].
2-3 hours ago Dr. Andy Farnell published another long essay that contains interesting thoughts on many of the same themes we often cover here. It's well worth reading. Some portions:
Corporate control is now the art of personally avoiding it. I believe this rampant abdication of human power and responsibility is an important aspect of (2026:) current "AI" automation ambitions, and the Digital Self Defence needed to counter it.
[...]
Great harm is often done by trusted systems simply when they fail. Disaster does not require that the trusted cybernetic systems we build are the sentient malevolent AIs of science fiction. We just need to helplessly depend on them and never question them.
Neil Postman constantly asks the question of why we choose to externalise trust in systems, and hints at a misanthropic principle. I have heard Silicon Valley techno-utopianism called out as 'species treachery'. There is a self-loathing at the bottom of many people's deification of technology.
Our narrative, as technologists, is that we're on a road to hell paved with good intentions, we are just innocent travellers. At worst, to the Behemoth machinery, we are handing over the keys to our lives out of tiredness, weakness and fear in the face of overpopulation and climate threats.
But as Postman points out, the solution to all these challenges is not with technology, but human endeavour, education, creativity and community. Technology may help, but only if the necessary human conditions are the foundation. If we go against Nietzsche's warning, along the path of raising technology to our new god, we need to accept that, like the West's Christian God, technology will only save us if we are worthy of saving. Some will be chosen, some will not.
[...]
Most technology needs "selling" because it is just a tool. Beyond things like hammers and bicycles, it is rare to encounter any technology so obviously useful and intuitive that a user immediately recognises its affordances and takes naturally to it. Users need, at the very least, training, and in most cases convincing, to take it up.
Digital "smart" technology is a solution looking for a problem, a way to persist growth in a tech industry that would otherwise collapse since we've passed three-sigma of utility and hit fundamental resource limits around rare metals (and some would argue we've passed "peak technology" overall). Instead of perfecting the technology we have, we're pouring money into rich kid's sci-fi toys.
So the "markets" need bullying through peer pressure, threats of being "left behind", and while technology advances in a technical sense, a further social project of advancing technology uptake as a fantasy occurs in parallel. It is, as Chomsky would say, simply manufacturing consent (frivolous demand). Science fiction films are useful to create a benign Star Trek universe. But when that fails, because people are broadly satisfied, another method is not to sell or promote it, but rather to insinuate it quietly, poisonously, into our lives as "necessity".
The footnotes point out the impact on the planet too: "Today as developers with valuable, hard-earned skills, we toil pointlessly making better ways to sell shampoo or trick people into clicking on malware. Meanwhile climate catastrophe and huge social problems that we could solve present no incentive and our stupid governments pour money into "AI"." █

