10 points by MohskiBroskiAI a month ago | 2 comments
  • verdverm a month ago
    > zero-hallucination guarantees

    This is impossible; either your testing is wrong or incomplete, or you are the one hallucinating.

    • MohskiBroskiAI a month ago
      [flagged]
      • verdverm a month ago
        Did the AI tell you this is all legit?

        I'm not going to waste time verifying some random-on-the-internet's idea that they solved P=NP or hallucinations in LLMs.

        If you had, you'd be able to get the results published in a peer-reviewed forum.

        Start there instead of "I'm right, prove me wrong."

        Have you built the thing to know it actually works, or is this all theory without practice?

        Show us you are right with an implementation and an evaluation.

      • verdverm a month ago
        > Don't be a troll. Prove me wrong. Run the code.

        There is no code in the repo you linked to. What code am I supposed to run?

        This just looks like stateful agents and context engineering. Explain how it is different.

      • MohskiBroskiAI a month ago
        [flagged]
  • MohskiBroskiAI a month ago
    [flagged]