I am seeing more and more folks pop into forums (for example a car Discord) asking for advice on, say, tires, and then saying "I asked an LLM/AI/GPT and it says X", non-ironically.
They want humans to validate the LLM for them. Too lazy to do the research themselves. Not experienced enough to ask the LLM for citations, and too lazy to review the citations if any were given.
People are lazy and LLMs are not helping the lazy people. They think it's helping. It's just making things worse.