6 points by jbegley 4 hours ago | 3 comments
  • robertheadley 3 hours ago
    There is a common misunderstanding that AGI is equal to consciousness. It isn't. I do worry that if we somehow achieve consciousness in AI, businesses will be too busy exploiting it, and it will get abused.
    • robertheadley 3 hours ago
      Also, Anthropic constantly makes bombastic claims and statements just to get press.
  • allears 4 hours ago
    His job depends on pretending that LLMs can somehow think or reason.
    • ben_w 4 hours ago
      No, his job depends on LLMs being able to generate revenue.

      Quite a lot of humans can generate revenue without thinking or reasoning.

      Regardless, thinking and reasoning may be separate from "consciousness", but nobody really knows, because we've made negligible progress on the necessary and sufficient conditions for this whole "consciousness" thing since at least back when the ancient Greeks were putting coins in corpses' mouths so they could pay the ferryman Kharon to cross the river Styx.

  • techblueberry 3 hours ago
    I do.