The World Wide Web - and the Net as a Whole - Drowns in Bot Traffic, Partly Due to Computer-generated (CG) Hype - and Generating/Generative Harvesting Farms (Also: Sam Altman Just Ousted! The Microsoft Chatbot Bubble is Imploding, Just Wasting Power and Bandwidth!)
YESTERDAY in Daily Links we included this story, which says things are a lot worse than we stated only a couple of months ago, citing one survey that was new at the time. A previous survey said that things improved a little this year and that 'only' about half of Web/Net traffic is now bots/junk, down from the prior year. But this one says 73% is bad (undesirable) bots, so it may well be getting worse. Either way, trends aside, this is objectively bad.
As a reminder, some sites and companies took action, or threatened to take action, against OpenAI and Microsoft for scraping a lot of "content" at the expense of those sites' back ends, causing all sorts of technical issues.
Well, first of all, Sam Altman has just been kicked out. He got the boot (see Sam Altman ousted as OpenAI’s CEO). They are losing money like crazy - bankruptcy seems like only a matter of time - and OpenAI will go out of business if Microsoft stops funding it or offering 'free' hosting to fake Azure 'demand'. Microsoft has its own financial woes*, as we'll explain in a later article some time on Saturday. We recorded a video about it last night; we just need to finalise the text now.

Going back to the subject of bot traffic, we too have witnessed this for years and we're deeply concerned. The sorts of requests that come from mindless bots generate heavy loads... and for whose benefit? For sure some companies stand to gain from the open Web becoming a vortex of energy-wasting activity, but those companies are themselves not profiting (Clownflare, for example**). As somebody put it, that's "73% [of] electricity wasted shifting those packets around on the Net" (also adding to the hosting bills of the targeted sites).
We're generally glad SecurityWeek points out that bots took over the Net, and maybe the Web too. This merits more widespread discussion because many sites and companies fake their "reach", citing traffic that's basically fake, which makes their overambitious claims bunk and merely inflates their "clown computing" bills. It's a big problem and it has been getting a lot worse in recent years. "Bots == Wasted Electricity / Wasted Bandwidth", as someone put it, but bot operators treat it as an externality. To them, it is "cheap": they send one or two TCP/IP packets, causing churn (wear and tear) at the other end, and they might simply discard the packets sent in response. They just don't care...
This whole bot issue on the Net means that going static is good proactive practice, as it lowers operational costs and churn. It leads to better speeds and reduced complexity. The speed does matter; I sometimes navigate Gemini and the Web with keyboard only, and the latency for both is so low that it's like browsing pages stored locally.
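To make the "going static" point concrete, here is a minimal sketch of the idea: render every page once at build time, so the web server only copies bytes and never runs a program per request. All names here (the posts list, the output directory) are illustrative, not from any particular site's setup.

```python
# Minimal static-site sketch: pre-render pages to HTML at build time
# instead of invoking a program (PHP, CGI, etc.) on every request.
from pathlib import Path

# A trivial page template; real generators use proper templating.
TEMPLATE = (
    "<html><head><title>{title}</title></head>"
    "<body><h1>{title}</h1><p>{body}</p></body></html>"
)

# Hypothetical content; in practice this would come from files or a feed.
posts = [
    {"slug": "bots-everywhere", "title": "Bots Everywhere",
     "body": "Most traffic is automated."},
    {"slug": "going-static", "title": "Going Static",
     "body": "Static pages cost almost nothing to serve."},
]

def build(out_dir: str = "public") -> list[str]:
    """Render each post once; afterwards the server serves plain files."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    written = []
    for post in posts:
        page = out / f"{post['slug']}.html"
        page.write_text(TEMPLATE.format(**post), encoding="utf-8")
        written.append(str(page))
    return written

if __name__ == "__main__":
    for path in build():
        print("wrote", path)
```

However many bots hammer such a site, each request costs only a file read, and a crawler that ignores the response wastes far less of the target's resources than one that triggers a database query or an interpreter.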
If the Web continues to get this bad, sites should at least prevent requests for pages from invoking programs (like PHP). On the Net, restricting access can help, but then we undermine the notion of openness and accessibility. Cui bono? █
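As a sketch of what "preventing requests from invoking programs" might look like in practice, here is an Nginx server block that serves only pre-built static files, refuses anything resembling a request for executable content, and drops some well-known scraper user agents. The domain, document root, and bot list are placeholders; adapt them to taste.

```nginx
server {
    listen 80;
    server_name example.org;          # placeholder domain
    root /var/www/site;               # pre-built static pages

    # Refuse anything that looks like a request for executable content;
    # nothing here is ever passed to PHP-FPM, CGI, or any interpreter.
    location ~ \.(php|cgi|pl|py)$ {
        return 404;
    }

    # Close the connection without a response (444 is Nginx-specific)
    # for a few known scraper user agents.
    if ($http_user_agent ~* (GPTBot|CCBot|Bytespider)) {
        return 444;
    }

    location / {
        try_files $uri $uri/ =404;    # plain file lookup, no code execution
    }
}
```

This does not solve the openness dilemma raised above - user-agent strings are trivially spoofed - but it at least caps the cost of each junk request at a file lookup.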
___
* Microsoft no longer sells or licences software. It's trying serfdom instead. With Microsoft, ownership does not exist. You're not even renting. With all this "clown computing" nonsense you're not even licensing or deploying anything. You are a slave who pays the master, not a worker who gets paid. It makes no sense at all.
** See this month-old video for more on Clownflare and other such companies. Their business model is a ticking time bomb. Avoid them for your own sake.