LLMs Destroy the Web Not Just by Filling It With Cruft and Chaff (Slop)
Stochastic parrots, built by staging de facto 'DDoS' attacks (indistinguishable from real attacks), will destroy many sites or increase the cost of running them.
This morning we mentioned problems that Phoronix was still having due to some proprietary PHP software it has used for many years. Phoronix complained about "Phoronix.com site performance [deteriorating due to] the database server being hammered recently from the forums." (The forums are still proprietary software; the data is locked into a vendor's schema and "licence".)
There are already more than 100 comments about this. The "upgrade" of the proprietary software broke some things; the first comment said: "Is it just me or are the 'number of likes' completely gone with the new forum software?"
"Same for me," said the next comment. "And the 'Quote' button takes a lot of time for me."
Welcome to software you cannot fix, even if you wish to. That's proprietary crap in a nutshell.
A better approach might be to turn all the old forum pages into static pages. That would help deal with scrapers, many of which are part of the overambitious LLM "Ponzi scheme" (companies faking perceived "value" or "size" by swallowing anything they come across on the Web). With proprietary crap it might be harder to convert the database into static pages, but who knows...
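As a rough illustration of the static-page idea (not specific to Phoronix's actual setup), old threads pulled out of a database could be written once as plain HTML files and served with no PHP at all. The thread data and file layout below are entirely hypothetical:

```python
import html
from pathlib import Path

# Hypothetical forum threads, as they might be pulled from a database
threads = [
    {"id": 101, "title": "GPU benchmarks", "posts": ["First post", "A reply"]},
    {"id": 102, "title": "Kernel news", "posts": ["Another thread"]},
]

def render_thread(thread):
    """Render one thread as a minimal, self-contained static HTML page."""
    body = "\n".join(f"<p>{html.escape(post)}</p>" for post in thread["posts"])
    title = html.escape(thread["title"])
    return (
        f"<!DOCTYPE html><html><head><title>{title}</title></head>"
        f"<body><h1>{title}</h1>{body}</body></html>"
    )

def export_static(threads, outdir="static_forum"):
    """Write each thread to its own .html file; serving them needs no database."""
    out = Path(outdir)
    out.mkdir(exist_ok=True)
    for t in threads:
        (out / f"thread-{t['id']}.html").write_text(render_thread(t))
    return sorted(p.name for p in out.iterdir())

print(export_static(threads))  # → ['thread-101.html', 'thread-102.html']
```

Once exported like this, scrapers only ever hit flat files, so the database server is never touched by the crawl.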
As an associate explained today, "I guess the one good thing about letting the bots scrape TR [Techrights] is that they train on TR content which they will then regurgitate later. I wonder how many times each bot has looped through the entire site."
A few days ago about a million requests were made to the site, many of which came from scrapers (hitting it many times per second). LLM companies are, in effect, staging 'DDoS' attacks on small sites. Guess who pays the most for it: not the scrapers! █
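Small sites can at least blunt this kind of traffic with per-client rate limiting. A minimal token-bucket sketch (the rate, burst size, and client address below are illustrative, not anything this site actually uses):

```python
import time

class TokenBucket:
    """Allow up to `rate` requests per second, with bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per client address; a scraper hammering the site soon runs dry
buckets = {}
def handle_request(client_ip, rate=2, capacity=5):
    bucket = buckets.setdefault(client_ip, TokenBucket(rate, capacity))
    return bucket.allow()

# A rapid burst of 10 requests from one 'scraper': only the burst allowance passes
results = [handle_request("203.0.113.7") for _ in range(10)]
print(results.count(True))  # → 5
```

Real deployments would do this at the web server or firewall rather than in application code, but the principle is the same: the scraper, not every ordinary reader, absorbs the throttling.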
