Permutation in LLMs Does, Inevitably, Change Meanings and Therefore LLMs Cannot Properly Rephrase or Summarise Texts
LLMs lack actual grasp or comprehension of what they spew out
We recently examined what plagiarism programs do to text, code, images, etc. They don't really offer anything of value; they just replicate other people's work whilst attempting to pass that off as "fair use", lowering the quality of the aggregated material.
"Tragedy of the commons and fallacy of composition" come to mind here.
"In your recent Techrights article titled 'LLMs Breaking Everything'," a reader said, "you mention the 'tragedy of the commons'. There is a related concept known as the 'fallacy of composition'. It is quite easy to understand. You should take a look at the Wikipedia page."
To quote Wikipedia: "The fallacy of composition is an informal fallacy that arises when one infers that something is true of the whole from the fact that it is true of some part of the whole. A trivial example might be: "This tire is made of rubber; therefore, the vehicle of which it is a part is also made of rubber." That is fallacious, because vehicles are made with a variety of parts, most of which are not made of rubber. The fallacy of composition can apply even when a fact is true of every proper part of a greater entity, though. A more complicated example might be: "No atoms are alive. Therefore, nothing made of atoms is alive." This is a statement most people would consider incorrect, due to emergence, where the whole possesses properties not present in any of the parts. The fallacy of composition is related to the fallacy of hasty generalization, in which an unwarranted inference is made from a statement about a sample to a statement about the population from which the sample is drawn. The fallacy of composition is the converse of the fallacy of division."
Therein lies a problem for LLMs. Picking or cherry-picking many facts out there on the Web (assuming it's not an LLM training on its own flawed output) and throwing them in the "blender" won't guarantee that the "juice" will taste good.
There's yet another problem and analogy here.
The reader added: "I also notice that the tragedy of the commons can be caused by bloated Web pages. Each author or blogger uses sophisticated Web-authoring tools which implant elements that are barely necessary."
"Another, quite obvious example is security vulnerabilities caused by OS monoculture. There is economy in using standard computer software, so nearly everybody buys PCs with, say, MS Windows installed. Crackers exploit the bugs of the ubiquitous Windows machines and cause widespread theft or erasure of data, outages, etc."
"There are several notable examples of the tragedy of the commons in the field of digital communications alone."
The bottom line is that there are tasks which cannot be automated without actual intelligence and intellect. Parsing words and throwing them into LLMs as mere tokens isn't the same as inference. A recent, much-publicised paper from Apple argued that LLMs have severe limitations. Apple is a company based mostly on hype, but the LLM hype didn't attract Apple because, unlike Microsoft, it still sells actual things; it doesn't need to sell an empty promise. That's why Apple, unlike Microsoft, didn't have some 8-10 waves of mass layoffs in 2025.
Trying to rephrase/rebrand "plagiarism" as "AI" and "bad coding" (throwing together sloppy things) as "vibe coding" is dishonest marketing [1, 2]. That's all it is. █