To get a reality check, open up 3-4 different models (ChatGPT, Claude, Gemini, etc.) and ask them about topics you know really well, questions you already know the answers to. You'll see that maybe a quarter of the answers fail in some way. Some topics are of course easier for these models than others.
This video is about people who believe ChatGPT (or another LLM) is a sentient being sent to us by aliens or from the future to save us. Belief in an LLM saviour is pretty close to a religious belief. A pretty weird one, but still.
> To get a reality check, open up 3-4 different models (ChatGPT, Claude, Gemini, etc.) and ask them about topics you know really well, questions you already know the answers to. You'll see that maybe a quarter of the answers fail in some way.
I have tried this a bit with ChatGPT, and yes, there are a lot of issues: literally true but misleading answers, incomplete information, and a lack of common sense.
People place plenty of trust in astrology, tarot, and the I Ching without requiring that they have a subjective experience.
If anything, technologists have a blind spot when it comes to recognizing AI as exactly this kind of divination. The dismissal, and sometimes contempt, held for divination makes it genuinely difficult to recognize when it's not decked out in stars and moons.
It's interesting, if anything, that the Barnum effect applies in both cases.
The internet is full of pure nonsense, quack theories and deliberate fake news.
Humans created those.
The LLMs essentially regurgitate that, and on top of it they hallucinate the most random stuff.
But in essence the sort of information hygiene practices needed are the same.
I guess the issue is the delivery method. Conversation intrinsically feels more "trustworthy".
Also, AI is for all intents and purposes already indistinguishable from magic. So in that context it's hard for non-technical people to keep their guard up.
Functionally, it’s similar to why LLMs hallucinate.
Good video essay. Learned the origins of the term "cargo cult", and to my surprise, it has nothing to do with Rust...
Could you recommend some further reading to dig into this insight?
Also, I'm curious why you created such a topic-specific user; for privacy, I guess?
Chinua Achebe, Arturo Escobar, Ashis Nandy, Dipesh Chakrabarty, Edward W. Said, Frantz Fanon, Gloria E. Anzaldúa, Jasbir K. Puar, Jodi A. Byrd, Michel-Rolph Trouillot, Ngũgĩ wa Thiong'o, Robin D. G. Kelley, Silvia Federici, Sundhya Pahuja, Leanne Betasamosake Simpson
> at the root we're all living in fractured relationship with each other
Indeed, and technology plays an increasing role in mediating and shaping those social relations. That's very relevant in the context of ChatGPT becoming a kind of oracle and object of worship.
Previously, activating this population required a high-IQ/EQ psychopath to collect followers, or schizophrenics who believed they were talking to a superior being ("my leader talks directly to me via his writings").
Now, however, people can hypnotize themselves into a kind of self-cult. It might be the most effective form of this phenomenon if it's highly attuned to the individual's own idiosyncratic interests.
In a typical cult, people fall into or out of the cult based on their internal alignment with the leader and on failed enlightenment. But if every one of these people can have their own highly tailored cult leader, it might be a very hard spell to break.
When you believe in things that you don't understand, then you suffer.
--
I can, however, elaborate on the subject separately from that quote.
The video talks about the more extreme cases of AI cultism. This behavior follows the same formula as previous cults (some of which are mentioned).
In 2018 or so, I noticed the rise of flat-earth narratives (bear with me for a while; it will connect back to the subject).
The scariest thing, though, was _the non-flat-earthers_: people who insisted that the earth is round but couldn't explain why. Some of them tried, but they had all sorts of misconceptions about how satellites work, about the history of science, and about much else. When confronted, very few people _actually_ understood what it takes to prove the earth is round. They were just as clueless as the flat-earthers, just with a different opinion.
I believe something similar is happening with AI. There are extreme cases of cult behavior which are obvious (as obvious as flat-earthers), and there are subtle cases of cluelessness similar to what I experienced with both flat-earthers and "clueless round-earthers" back in 2018. These, especially the clueless supporters, are very dangerous.
By dangerous, I mean "as dangerous as people who believe the earth is round but can't explain why". I recognize most people don't see this as a problem. What is the issue with people repeating a narrative that is correct? Well, the issue is that they don't understand why the narrative they are parroting is correct.
Having a large mass of "reasonable but clueless supporters" can quickly derail into a mass of ignorance. Similar things happened when people were swayed to support certain narratives due to political alignment. Flat-earthism and anti-vaccine pseudoscientific nonsense are tightly connected to that. Those people were "reasonable" just a few years prior, then became an issue when certain ideas got into their heads.
I'm not perfect, and I probably have a lot of biases too: narratives I support without fully understanding why, probably without even noticing. But I'm damn focused on understanding them and on making that understanding the central point of the issue.
When you and your allies have all the tech, but the enemy still finds cheap and easy ways to render it ineffective (Vietnam War). It makes one question whether all the gizmos are worth it, and it really shakes up morale.
I was not talking about a confrontational situation, though. Most cults and pseudoscience are just plain scams.
I philosophically don't think AGI as described is achievable, because I don't think humans can build a machine more capable than themselves ¯\_(ツ)_/¯ But continuing to insinuate it'll be here in a few months sure helps put some dollars in CEOs' pockets!