Hallucinations won’t go away

Gary Marcus explains why LLMs/Generative AI will never do away with hallucinations, no matter how much data they slurp up. These AIs have no “world model”: they don’t organize what they learn according to explicit rules and parameters. ChatGPT has the rules of chess somewhere in its training data, but it doesn’t keep them in a box labeled “Rules of Chess.” They’re smeared across everything else it has absorbed. #artificial-intelligence #gary-marcus