LLMs have difficulty representing out-of-distribution viewpoints, lines of thinking, and conclusions. Original thought is out-of-distribution by definition.
I wrote up why I think this matters more than most people realize: I believe the boring, routine, daily application of generative AI will be enough to deteriorate society as a whole. Self-government on autopilot, basically, with strong attractors pulling everything toward default outcomes.