2 points by sichengo 3 hours ago | 2 comments
  • 1970-01-01 | 3 hours ago
    >If hallucination is structural, then the real adjustment is to update our expectations.

    I can't think of a harder job than changing public expectations on how to use AI. Landing a man on the surface of Mars and returning him to Earth would be easier.

  • rekabis | 3 hours ago
    The primary example given here is not hallucination, but a simple failure to understand, at a conceptual level, the question being asked. Real hallucination involves making stuff up. So when you ask it about a legal issue, and it brings up case law that doesn’t even exist, that is an example of hallucination.

    In the end, real hallucination comes down to the incentives built into the AI: giving no answer is weighted just as negatively as giving a wrong answer. The AI is therefore incentivized to produce some kind of answer, even one with no bearing in reality.
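    A toy sketch of that incentive (my own illustration, with a hypothetical scoring scheme, not anything from the article): if abstaining scores 0, the same as a wrong answer, then guessing has an expected score at least as high as saying "I don't know", so a model tuned to maximize that score is pushed to answer no matter how unsure it is.

      # Hypothetical binary grading: 1 point for a correct answer, 0 otherwise.
      def expected_score(p_correct, abstain):
          """Expected score for one question given the chance of guessing right."""
          if abstain:
              return 0.0        # abstaining is graded like a wrong answer
          return p_correct      # guessing pays off whenever the guess lands

      for p in (0.0, 0.1, 0.3):
          print(f"p_correct={p:.1f}  guess={expected_score(p, False):.2f}  abstain={expected_score(p, True):.2f}")

    Under that scheme guessing never scores worse than abstaining, which is the incentive problem in a nutshell.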