3 points by byandrev 3 hours ago | 2 comments
  • byandrev 3 hours ago
    Unlike large models such as Gemini or ChatGPT, which draw information from numerous web sources that may contain “hallucinations,” NotebookLM relies 100% on the sources you provide, such as PDFs, audio files, YouTube videos, Google Docs, or even articles. Because it works exclusively with your sources, the risk of hallucinations is much lower.
    • knollimar 3 hours ago
      Huh, don't most hallucinations come from the model's internal knowledge rather than the RAG?
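      The "grounding" idea the comments above describe can be sketched as a prompt-construction step: the model is told to answer only from user-supplied sources, which reduces (but does not eliminate) hallucinations drawn from its internal parametric knowledge. This is a toy illustration, not NotebookLM's actual implementation; the function name and prompt wording are hypothetical.

      ```python
      # Hypothetical sketch of source-grounded prompting, as discussed in the
      # thread: the model is restricted to the user's own documents instead of
      # relying on its internal (parametric) knowledge.

      def build_grounded_prompt(question: str, sources: list[str]) -> str:
          """Assemble a prompt that restricts the model to the given sources."""
          numbered = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(sources))
          return (
              "Answer using ONLY the numbered sources below. "
              "If the answer is not in the sources, say you don't know.\n\n"
              f"Sources:\n{numbered}\n\n"
              f"Question: {question}"
          )

      prompt = build_grounded_prompt(
          "What is NotebookLM?",
          ["NotebookLM is a Google tool for working with your own documents."],
      )
      print(prompt)
      ```

      Even with this restriction, a model can still misread or misattribute a source, which is why grounding lowers the hallucination rate rather than eliminating it.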
  • burnerToBetOut 2 hours ago
    Please clarify the Google connection.

    I'm guessing that it's an official Google-built product. [1]

    [1] http://support.google.com/notebooklm/answer/16179536?sjid=62...