ChatGPT, Bard, GPT-4, and the like are often pitched as ways to retrieve information. The problem is they'll "retrieve" whatever you ask for, whether or not it exists.
Tumblr user @indigofoxpaws sent me a few screenshots where they'd asked ChatGPT for an explanation of the nonexistent "Linoleum harvest" Tumblr meme.
Here's the thing: the LLM isn't recalling and presenting pieces of information. It's generating human-like strings of words. It will give you a plausible-sounding answer based on whatever you tell it. Chatbots like ChatGPT are fine-tuned to filter their responses toward being more helpful and truthful, but at its core the model just takes your prompt and produces human-like text to match.
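You can see the basic dynamic with a toy model. This is a deliberately tiny sketch (a bigram word model, nothing like ChatGPT's actual architecture or scale), but it shows the core behavior: the model learns which words tend to follow which, and will happily continue any prompt with statistically plausible words, with no notion of whether the thing it's describing exists. The training sentences here are made up for illustration.

```python
import random

# Toy "meme explanation" corpus -- the model will learn word-to-word
# statistics from these sentences and nothing else.
corpus = (
    "the meme started on tumblr and users began posting jokes about it . "
    "the meme spread quickly and users began adding their own variations . "
    "the joke started as a post about a harvest and spread across tumblr ."
).split()

# Count which words follow which (a bigram transition table).
transitions = {}
for a, b in zip(corpus, corpus[1:]):
    transitions.setdefault(a, []).append(b)

def generate(start, n_words, seed=0):
    """Continue from `start` by repeatedly sampling a likely next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_words):
        options = transitions.get(out[-1])
        if not options:  # no known continuation; stop
            break
        out.append(rng.choice(options))
    return " ".join(out)

# Ask it to "explain" something: it produces a fluent-looking
# continuation whether or not the underlying meme is real.
print(generate("the", 12))
```

The output reads like a meme explanation because that's what the statistics were built from, not because the model checked any facts. Real LLMs are vastly larger and predict tokens with neural networks rather than lookup tables, but the failure mode in the screenshots is the same shape: plausible continuation, no retrieval.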