9 points by freediver 5 hours ago | 7 comments
  • thinkingemote an hour ago
    It's like hearing that yesterday your Grandma got scammed by a basic Nigerian Prince email scam with the sophistication of 20 years ago.

    Poor chap; we will see an increasing number of people we once respected get fooled.

    But looking at it another way: this should make us think seriously about what the chat interfaces to LLMs are doing to many people.

  • big-chungus4 3 hours ago
    He "met Claude"? People have been using AIs on a daily basis for years, and he has probably never even used Google Translate, let alone an LLM; he is using one for the first time, without knowing anything about how it works, and making claims about it. He should try running Qwen3.5 4B in Unsloth Studio next.
  • NoPicklez 5 hours ago
    > said AI consciousness was “an illusion” and “there is no one there”, just a string of data processing events often happening in geographically different locations.

    I appreciate the "well, technically no" points when someone says AI is conscious or intelligent, because we understand how it works and can argue why it does what it does, along with all of its flaws.

    But you can't dismiss the latest LLMs' ability to mimic it, particularly the latest models we have today like Claude Opus 4.7. Humans are also very good at being confidently wrong, and the latest models are getting pretty close to that, if not better than most people.

    As it becomes less and less perceivably different from human interaction, and in many cases potentially does a better job in some regards, then on the face of it, to the end user, what's the difference?

    • cwillu an hour ago
      “I don't know what it is, but that ain't it” is just the standard confused thinking on consciousness.

      We don't know what consciousness is apart from the direct experience of it, and at best we have informal arguments for why data processing can't be it.

    • m3047 4 hours ago
      Had catfish lately?
  • yawpitch 3 hours ago
    Of course Richard Dawkins would also conclude that Richard Dawkins is conscious, even if Richard Dawkins doesn’t know it.
  • bloqs an hour ago
    To quote another thread: "LLMs are glue traps for narcissists."