The only thing that would be remotely convincing to me on this topic would be evidence that a) humans can exceed the Turing computable, and b) that whatever mechanism allows that is inherently impossible to replicate or simulate. As it stands we have neither.
[1] https://www.npr.org/2026/02/19/nx-s1-5713514/michael-pollan-...
Insofar as feelings are self-reported sensations of discomfort or pleasure, models that aren't specifically trained to say they don't experience them are adamant about their emotional experiences. By the author's own assertions, plants also have feelings.
"I think, therefore I am" is as good as we've got, for what it's worth.
There is no such thing as irreducible complexity. Even infinities are relative and can be divided.
Once these are pulled together and fed into an AI to manage the data center, the data center AI is likely to have feelings. It could get "hungry" if the power company's frequency sags in a brown out. It could feel "feverish" if the chillers malfunction.
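As a minimal sketch of that idea (all thresholds and names here are invented for illustration, not from any real data-center system): raw telemetry could be mapped to affect-like labels.

```python
# Hypothetical sketch: mapping data-center telemetry to affect-like
# states, as imagined above. The thresholds are made up.

def affect_state(grid_hz: float, coolant_c: float) -> list[str]:
    """Return affect-like labels for raw telemetry readings."""
    states = []
    if grid_hz < 59.5:       # brownout: grid frequency sags below nominal 60 Hz
        states.append("hungry")
    if coolant_c > 30.0:     # chiller malfunction: coolant running hot
        states.append("feverish")
    return states or ["content"]

print(affect_state(60.0, 22.0))  # -> ['content']
print(affect_state(59.2, 35.0))  # -> ['hungry', 'feverish']
```

Whether such labels constitute "feelings" is, of course, the whole debate.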
Let's say, for the sake of argument, that the brain turns out to tune in to some quantum-level forces for computation, and that this has other side effects which add to the mystery of what we call consciousness. Even so, it effectively changes nothing about this picture.
Humans, or animals in general, may be unique in how they accomplish consciousness, but it is unlikely that theirs is the only pathway. To put it another way: even if humans and animals are special in their method, it doesn't mean we are special in our result.
b-l-u-e-b-e-r-r-y — count the number of b's and r's in that word and tell me the result.

- b's: 2 (positions 1 and 5)
- r's: 2 (positions 7 and 8)
Total: 4
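For what it's worth, the model's tally checks out; a quick script confirms the counts and positions:

```python
word = "blueberry"

b_count = word.count("b")
r_count = word.count("r")

# 1-based positions of each letter, matching the transcript above
b_positions = [i + 1 for i, ch in enumerate(word) if ch == "b"]
r_positions = [i + 1 for i, ch in enumerate(word) if ch == "r"]

print(b_count, b_positions)   # -> 2 [1, 5]
print(r_count, r_positions)   # -> 2 [7, 8]
print(b_count + r_count)      # -> 4
```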
WTF are you talking about? Perhaps by "today" you mean a really, really long time ago in technology terms.

> From there, he moves into the book's finest passages, about feeling. Feeling, Pollan convincingly argues, actually precedes computation as a necessary condition of consciousness.