Richard Stallman Explains Stochastic Parrots (LLMs)
From his latest talk:
There are various kinds of artificial intelligence programs that, within specific domains, can understand a problem and can understand how to get the correct answer for that kind of problem, at least usually as often as humans can do it, or more. But one kind of program which is not intelligence is the Large Language Model, because intelligence implies understanding, and those programs generate output but have no idea what the output means. They deal with word usage only. And so we shouldn't be surprised that they generate statements that are false very often, or even statements that are almost nonsensical. They're grammatical, but they don't mean anything. They present imaginary, fictitious events as if they were real, and yet they are being called artificial intelligence, and most people, on seeing that, assume that the output of these programs can be believed. But it can't be. They have no idea of what's true. They don't understand the statements they generate, and so we shouldn't call them AI, and I never do. Sometimes I call them bullshit generators. Bullshit is defined as producing statements with indifference to their truth or falsehood. Of course, if you can't understand truth and falsehood, you can't be anything but indifferent to it. And that's what those programs are like. There are also humans that output bullshit, who are presumably capable of understanding whether their statements are true or not but don't care. For instance, Trump. [applause]
But I suppose, as a human being, he would be capable of caring about the truth of his statements if it ever occurred to him to care. You know, Trump has no heart, but he still needs a defibrillator. But a program that can't have any idea of what is true certainly can't care. So we know that those bullshit generators are not intelligence; they can't understand. Moving on from that, one thing we can see is that a web site should never use a bullshit generator to do any job that depends on accuracy or validity or truth, because it's going to go wrong, and more often than you might think. So it's a very bad thing to change a web site to be so-called smarter by having it pass what you tell it through a bullshit generator, or by having it pass what other people have published through a bullshit generator and giving you a summary that might be total nonsense.
It'll be written in good English, though.