ChatGPT, Bard, GPT-4, and the like are often pitched as ways to retrieve information. The problem is they'll "retrieve" whatever you ask for, whether or not it exists.
Tumblr user @indigofoxpaws sent me a few screenshots where they'd asked ChatGPT for an explanation of the nonexistent "Linoleum harvest" Tumblr meme.
Even AI-generated fiction can be reckless if it contains themes that are false, harmful, or destructive. If it writes a story that depicts genocide positively, masked through metaphor, allegory, parable, whatever, then yes, it's just "a made-up story," but it's no less dangerous than if it were an op-ed in a major news outlet.