Google Spreading Misinformation and Lies
Even about GNU/Linux
Yesterday: Glue Inside Your Pizza (or Why People Will Get Fed Up With Slop)
A few minutes ago, while working on this very short article, I wanted to add one particular photo which I had uploaded years ago. I searched to no avail. While looking for a 1990s photo from Switzerland, Google "search" kept pushing LLM slop that is TOTALLY false and unrelated. This is beta-grade nonsense! Why does Google insist on foisting/imposing falsehoods on people, even when they just look for an old PHOTO? Google is in the propaganda business. As explained yesterday afternoon (see above), this is becoming part of a disturbing pattern, and Akira Urushibata posted to libreplanet-discuss yesterday at around noon about the same sort of thing:
The Osaka Expo opened on April 13. In search of news of the event I consulted Google search (in Japanese). Google replied in the ordinary way, with the URL of the official Expo site on top followed by news items, images and such. Among them, on the first page was a rather new field which listed "related questions". The first of them was a great surprise. It read:
"For what reason(s) was the 2025 Osaka Expo canceled?"
Other items on the list discussed ticket prices and access to the grounds. Obviously the above text was generated by AI, for it is unlikely for a human being to err in this manner, but Google does not clarify. (Confirmed April 28 Japan time) So one part of Google (probably AI) treats the exposition as canceled while another part understands that it is in progress. There is a failure to notice an obvious contradiction.
Can AI distinguish between "free" as in "free drinks" and "free" as in "freedom"? We should watch out for this. The above makes me think that we should not expect much. Even if we observe AI getting the distinction right in a number of cases we should not expect it to be always correct.
For one thing, many people are confused and much has been written based on the mistake. It is unreasonable to expect that none of the erroneous material has been fed to the large language model (LLM) neural networks for training. In addition, the above example indicates that AI is not good at coping with a lack of integrity. Perhaps a contradiction which is obvious to a human being is not so for AI.
---
Related article: Researchers have found methods to investigate what is going on inside the neural networks that power AI. They have discovered that the process differs greatly from human reasoning.
We Now Know How AI 'Thinks'--and It's Barely Thinking at All https://www.yahoo.com/finance/news/now-know-ai-thinks-barely-010000603.html
Though I don't follow developments in this field closely, what is written in this article confirms my suspicions.
Another way to see this is that the AI developers now have the equivalent of a debugger which allows them to probe the internal process. This is likely to affect development.
There's no thinking at all. Those things just spew out nonsense and mislead people. There are now many videos on YouTube (flooding the platform) which are slop: LLM nonsense read out by synthetic voices. They're full of errors, but listeners/viewers have no idea and might simply absorb what's coming out as "facts" because of the authoritative voice.
Google is just a malicious propaganda spreader; it doesn't care as long as it can make money out of it; it's just "flooding the zone" with "s***" ("bulls*** generators", as Richard Stallman calls them).
My theory is that Google is trying to fake "demand" and is thus pushing slop or feeding lies to people who didn't ask for any; similarly, Microsoft was sticking LLM things into every piece of software and every service it had in order to artificially "manufacture" or fake "growing interest" in something that was failing and waning. Weeks ago Microsoft openly admitted that usage was declining. People get sick and tired of LLMs. Many businesses tried those (due to the delirious hype) and then abandoned them. The bubble is bursting. It's dying. What might shareholders say? It was money down the drain. It left us worse off, in a society (and Web) full of falsehoods. █