Gemini on the Google search page will confidently answer a question yes or no… and then the evidence it cites indicates the opposite of its answer.
I think the core issue is that, in the end, LLMs are just word math and they don’t “know” that they don’t “know”… they just string words together and hope for the best.
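To sketch what I mean by “word math” (a toy illustration, with made-up words and probabilities, not how any real model is actually implemented): at its core, the generation step is just sampling the next word from a probability table. Nothing in that step represents whether the answer is true, only which word is statistically likely.

```python
import random

# Hypothetical probabilities for the next word after a yes/no question.
# The model has no notion of "knowing" -- just relative likelihoods.
next_word_probs = {
    "yes": 0.55,  # slightly more likely, so it often leads
    "no": 0.45,   # but "no" still comes out plenty of the time
}

def next_word(probs):
    """Sample one word, weighted by probability -- the 'word math' step."""
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights)[0]

print(next_word(next_word_probs))  # "yes" or "no" -- confidence, not knowledge
```

A 55/45 split like this would still get verbalized as a flat “yes,” which is exactly how you end up with an answer whose own supporting evidence points the other way.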