28 points by zdw 6 days ago | 3 comments
  • llIIllIIllIIl 2 days ago
    Is Murat suggesting that thinking equals predicting the next token, or is this just fiction to read?
  • tensor 2 days ago
    Except that something like half of people don't have any internal monologue. It's tempting to pretend that LLMs are doing similar things to our brains, but the reality is that they are extremely different and only very, very superficially appear to be doing similar things.
    • tdeck 2 days ago
      Even those of us with an internal monologue aren't using it most of the time. I doubt I've ever thought "now let me scroll down to the next comment", for example.
      • Ancapistani 3 hours ago
        ... is this true?

        I'm literally always thinking in that way. If I focus a bit harder on it, it's apparent that there's another, faster cognitive layer beyond my background monologue as well.

  • blibble 2 days ago
    > One of the most profound pieces of advice I ever read as a PhD student came from Prof. Manuel Blum, a Turing Award winner. In his essay "Advice to a Beginning Graduate Student", he wrote: "Without writing, you are reduced to a finite automaton. With writing you have the extraordinary power of a Turing machine."

    only if you have an unlimited amount of paper!

    otherwise you're still a finite state machine (technically); a small sketch of that gap follows at the end of the thread

    • whattheheckheck 2 days ago
      There feels like there's something special about reducing the computation to self-reflection, or just being in the tao/dao

      Maybe it's still within the state machine, but it feels like one escapes the trajectory of a Turing machine running its program

    • measurablefunc 2 days ago
      And also only if you know how to do Boolean algebra.
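
A minimal sketch of the finite-automaton vs. Turing-machine gap blibble points at (illustrative only; the function names and the 8-state bound below are assumptions made up for this example, not anything from Blum's essay). The language a^n b^n can be checked with an unbounded counter, the "paper" in Blum's quote, but any machine limited to a fixed number of states eventually runs out of distinct states to count with:

    # Illustrative sketch only: the names with_paper / without_paper and the
    # 8-state bound are made up for this example. "Paper" here is an unbounded
    # counter; without it the machine has only finitely many states and cannot
    # recognize the language a^n b^n.

    def with_paper(s: str) -> bool:
        """Recognize a^n b^n (n >= 0) using an unbounded counter (the 'paper')."""
        count = 0
        seen_b = False
        for ch in s:
            if ch == "a":
                if seen_b:          # an 'a' after a 'b' can never be valid
                    return False
                count += 1
            elif ch == "b":
                seen_b = True
                count -= 1
                if count < 0:       # more b's than a's so far
                    return False
            else:
                return False
        return count == 0

    def without_paper(s: str, max_count: int = 8) -> bool:
        """Simulate a finite-state machine: the counter saturates at max_count,
        so the machine has a bounded number of states and loses track of long inputs."""
        count = 0
        seen_b = False
        for ch in s:
            if ch == "a":
                if seen_b:
                    return False
                count = min(count + 1, max_count)   # finite memory: counter saturates
            elif ch == "b":
                seen_b = True
                count = max(count - 1, 0)
            else:
                return False
        return count == 0

    if __name__ == "__main__":
        n = 20                                   # longer than the 8 states can track
        balanced = "a" * n + "b" * n
        unbalanced = "a" * n + "b" * (n - 5)
        print(with_paper(balanced), with_paper(unbalanced))        # True False
        print(without_paper(balanced), without_paper(unbalanced))  # True True -- fooled

Any fixed bound plays the same role as the 8 here: pick an input longer than the bound and the finite machine can no longer tell balanced strings from unbalanced ones, which is exactly the power that unlimited writing adds.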