

Took that in a slightly different direction than I was expecting. My point is that we have to be on the lookout for bullshit when getting info from other people, so it’s really no different when getting info from an LLM.
However, you took it as the LLM can’t distinguish between what’s true and false, which is obviously true, but an interesting point to make nonetheless.
I don’t see how that’s different, honestly. Then again, I’m not usually asking an LLM for absolute truth; it’s more about explaining concepts I can’t fully grasp by restating them another way, or small coding stuff where I can check essentially immediately whether it works or not lol.
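
For context, by "small coding stuff" I mean something like this (a made-up example, not from an actual chat), where whatever the LLM hands back either passes a quick check or it doesn’t:

```python
# Hypothetical example of an instantly-checkable ask: a tiny helper
# that reverses the order of words in a sentence.
def reverse_words(sentence: str) -> str:
    """Reverse the word order in a sentence."""
    return " ".join(reversed(sentence.split()))

# Quick sanity check, run it and you know right away if it works.
assert reverse_words("hello world foo") == "foo world hello"
print(reverse_words("trust but verify"))  # "verify but trust"
```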