2 points by ClearwayLaw 2 hours ago | 2 comments
  • ClearwayLaw 2 hours ago
    Ventures are increasingly building AI agents on retrieval-augmented generation (RAG) systems that containerize their data into small, curated data sets.
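    For context, the core RAG idea is simple: retrieve a few relevant documents from a small, controlled corpus and ground the model's prompt in them, instead of relying on the model's parametric memory. A minimal sketch, using a toy bag-of-words cosine similarity for retrieval (the document names and helpers here are hypothetical, not from the comment):

    ```python
    # Minimal RAG sketch: retrieve top-k documents from a small,
    # containerized corpus and build a grounded prompt from them.
    # Scoring uses simple bag-of-words cosine similarity for illustration.
    from collections import Counter
    import math

    DOCS = [
        "RAG systems retrieve documents before generating an answer.",
        "Containerized datasets keep each venture's data isolated.",
        "Grounded prompts reduce reliance on parametric memory.",
    ]

    def _vector(text):
        # Bag-of-words term counts.
        return Counter(text.lower().split())

    def _cosine(a, b):
        num = sum(a[t] * b[t] for t in set(a) & set(b))
        den = math.sqrt(sum(v * v for v in a.values())) * \
              math.sqrt(sum(v * v for v in b.values()))
        return num / den if den else 0.0

    def retrieve(query, k=2):
        # Rank the corpus by similarity to the query; keep the top k.
        q = _vector(query)
        ranked = sorted(DOCS, key=lambda d: _cosine(q, _vector(d)), reverse=True)
        return ranked[:k]

    def build_prompt(query):
        # Ground the model's prompt in the retrieved context only.
        context = "\n".join(retrieve(query))
        return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

    prompt = build_prompt("How does RAG work?")
    ```

    In a real system the bag-of-words scorer would be replaced by dense embeddings and a vector index, but the retrieve-then-ground loop is the same.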
  • reify an hour ago
    It's not that difficult, really.

    Considering that AI never hallucinated in the first place.

    It basically fucks up and squirts out shit.

    It's like putting too much animal feed in a cow's mouth and waiting at the other end with a bucket.

    "Hallucinate" is a made-up word for the stuff you eventually get in your bucket.

    Excuse me for two minutes while I pop to my toilet to hallucinate a big turd.