An LLM Inside a 'Search' Engine Means That Companies Tell You What They Want, Not What Web Pages to Visit
The future of 'googling' things might be as unreliable as using Social Control Media as a source of information.
THIS is a classic "test case":
Can search engines be trusted when they "speak" instead of listing relevant results from authoritative pages, ranked by how often those pages are cited (PageRank, which began as BackRub)? Is Google gradually abandoning Web search in pursuit of greater power (bypassing or ripping off the originals)? If so, can Google be trusted? A rough sketch of what's being abandoned follows below.
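For readers unfamiliar with the contrast being drawn: PageRank scores a page by the links pointing at it, so the ranking is grounded in the structure of the Web itself rather than in generated prose. Here is a minimal sketch of the idea (power iteration over a made-up toy graph; the damping factor, iteration count, and link data are illustrative assumptions, not Google's actual parameters):

```python
# Minimal PageRank sketch: power iteration over a toy link graph.
# All values here (graph, damping=0.85, 50 iterations) are illustrative
# assumptions, not Google's real configuration.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page passes its rank, split evenly, to pages it cites.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

if __name__ == "__main__":
    toy_web = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")
```

The point of the sketch: a page's score comes from what the rest of the Web says about it. An LLM's answer is anchored to no such citation structure; it emits text whether or not any source backs it.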
Some people already seek "medical advice" this way. They see grammatically correct sentences containing plausible text, but the facts are wrong (or right only by coincidence). The LLM frenzy is going to needlessly kill a lot of people. █