To the extent that people still need to critically assess what AI delivers in order to achieve their goals, they will still pick up those skills, or fail to. They will either invest the time to learn, or they'll fail to find employment, or fail in other aspects of life.
When we see people lamenting lost skills like this, it is usually a result of them overestimating the continued necessity of certain skills in the face of new technology.
You won't suddenly have a generation of software developers (for example) who don't know the necessary skills to do their work, but you may get a generation of software developers who don't have the skills you think are necessary to do their work.
This sounds like a great future! Nothing worrying here at all.
Or, the people who evaluate them will be suffering from the exact same self-inflicted cognitive limitations, and promote them, or at least not fire them.
Perhaps the quality of the firm's product suffers, but it doesn't matter. The consumer will, in all likelihood, be limited in the same way.
Everyone's happy.
When you see an error like "error: expected ',' or ';' before 'include'", you know what happened and where to look, because you've seen it a hundred times before.
AI takes that away. That's not inherently bad; it's great that it can solve those sorts of things for you. But the second-order effects are terrible: you end up never developing that experience. Is this simply the evolution of the craft? Is that experience no longer necessary?
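For readers who haven't hit it, here is one hypothetical way (among several) that a C file can produce that kind of message under GCC; the file name and contents are illustrative, not from the article:

```c
/* bug.c (hypothetical): a dropped ';' plus a dropped '#' on the next line.
   With the '#' missing, the preprocessor leaves the line alone, so the
   compiler sees the bare identifier 'include' right after the initializer
   and reports something like:
       error: expected ',' or ';' before 'include'                        */
int answer = 42        /* missing ';' */
include <stdio.h>      /* missing '#' */
```

The experienced eye jumps straight to the missing punctuation; that reflex is exactly the accumulated pattern-matching being described.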
I could be wrong, but I believe that experience is necessary and losing it will be a net negative. Furthermore, the reduction of experience will increase dependency on these tools and the companies that provide them.
AI is a tool to help you see the forest for the trees.
Reading articles the old-fashioned way can be akin to seeing the trees but not the forest.
Young minds tend to learn. Whether they do it the old-fashioned way or the new AI way, they will learn.
Many blank out in school on various subjects, and that byproduct of cognitive overload follows them all their lives, making them wary of new things.
And finally, maybe you, personally, are reaching a limit in your comprehension of the modern world, and you show it by fighting the wrong battle with the wrong arguments.
Or maybe you are onto something.
I don't know where you get the 'cognitive overload' term from (it's not in the article). But in general, cognitive effort is what drives our brains to learn in the first place.
As an organ in an organism, the brain is very averse to using energy, because the organism might need it later to run from or fight some danger. Learning costs energy, and the brain would rather not spend it if it doesn't need to. The only reason the brain will ever learn anything is that you repeatedly expose it to 'cognitive effort'; in that case the effort of learning saves energy in the long run.
If you use AI so that most things don't require cognitive effort, your brain will not use those learned neural pathways, and they will atrophy over time.
The only thing that the brain learns from using AI is that the most efficient way of doing anything is having the AI do it for you.
One, if 'cognitive overload' is not in the article, that's good; it means I actually put some thought into my response.
Two, AI expands possibilities for those who want that, and offers shortcuts for those who want that. It's no different from any other learning process: you could actually learn something, or you could just get it done. That makes sense; not all humans seek learning, but most look for results and answers.
You mean the state of affairs humans have enjoyed for the last four millennia? The status quo that led to all of the technology you seem to think we now can't live without?
> Many blank out in school on different subjects and the cognitive overload byproduct follows them all their life making them wary of new things.
They should try putting their phones down before we double down on solving tech problems with more tech.
There was, and is, constant progress that demands ever more tech. Without more tech, that progress would have stalled, and would stall now.
dark app patterns, gambling, etc. like seriously, i know we all want to hate on llms or whatever stealing our jobs or making us stupider but has this been any different from the past in that regard?
whether it be radio, tv, computers, the internet, video games, etc., all of these were claimed to be doing something "to the children", but i agree with what another comment said: kids will figure out a way to learn and utilize the tools given to them.
did me "offloading" my thinking to google or some computer instead of cracking open a library book or doing calculations by hand damage my thinking at the time? no... a sufficiently motivated person will learn regardless and figure out why things work the way they do; if anything, better access to that information helps.
we should be fixing the motivation problem rather than the tools, which is what we've been trying to do for decades: teach people a framework for solving problems and critical thinking. kids nowadays have way more things demanding their attention, and it's been on the decline since well before this AI wave (cough, social media). we literally sound like old farts lol.
> In study hall, I watched a kid use Snapchat to take pictures of his computer screen. He was working on IXL skills. His Snap A.I. friend sent an immediate reply. He then clicked the answer on his screen. The next question popped up, he took a picture and got an answer. He swiftly went through the whole session this way. His right hand held the phone, he tapped the camera button, glanced at the reply, and his left hand entered the answers on his laptop. He didn’t know I was watching, but I saw the gold medal of 100 percent mastery bloom on his screen. I told the teacher who assigned the IXL. She didn’t realize Snapchat had an A.I. that would do her homework. It can answer all the questions.
... Now, can you use AI to learn things? Sure. But what the article is talking about is critical thinking:
> Adults using AI mostly just sound generic. But for a child who never formed independent reasoning, "generic" is a major identity problem. The model’s reasoning doesn’t compete with the child’s reasoning but becomes the child’s reasoning. For children still building out the cognitive skills for evaluating the world, the effect will not be temporary but have a foundation impact on their thinking.
Americans' performance on critical thinking is already mixed at best. A new generation with even lower independent-thinking ability, combined with AI painstakingly engineered to suffer from severe bias, is a powerful recipe for (even more) horrors beyond human comprehension. Paid for by our tax dollars.
0 - https://www.nytimes.com/2026/02/26/learning/teachers-on-how-...
If you go into AI as a way to get your school work done more quickly, you won't experience the friction you need to. AI should be used to make the work longer and deeper. More engaging and adapted to the individual. Not quicker and easier.
The problem is that AI is the most effective dual use technology we have ever created with regard to education and cheating at education. The monkey brain doesn't like to suffer, so on average I think we find most people tend toward the shittier use case.
I wasn't ever able to really develop deep intuition about/understanding of a calculation until I did it by hand once or twice. I often just plugged in new models and algos just to see if performance was above a threshold, but when I wanted to productionize a new winner, I'd have to run through the algo by hand for a few steps to understand and tune it. And through doing it by hand, the complex became the simple.
like the logic sticks deeper in your head that way... using a computer is fast, but sometimes it just goes in one ear and out the other
Then again, we are not that far off from the time when your AI glasses will read the price label and automatically add up the total for you. Hopefully you then ask, each time, what that total means in the context of your finances...
One effect of widespread books is that we don't have poets like Homer. We don't develop the memorization skills they had in the past.
And that’s ok.
We can use the bandwidth for other stuff.
Socrates would have been against LLMs, and for good reason. Writing isn't unequivocally bad, but it is simply not a substitute for real dialogue and thought. We use books as a means to have more things to discuss with humans. LLMs can supplant the desire to even have dialogue with others, which is perhaps the more insidious thing.
It's something we all learn in freshman english class. But it comes up over and over again because the general idea is true. You have to temper the unbridled optimism that comes with any new technology by contemplation of what may be lost. Otherwise we're spinning in circles.
Like fighting on social media...
Seriously, what was the other stuff we used our bandwidth for when books caused the loss of those skills?
We have lost Homer, but what have we gained? A million social-media warriors?
Now the future is to replace those with machines. No more human input. Just an endless number of machines fighting other machines...
Books encode skill.
I'm not a hater. An LLM layered on search is the best research tool I've ever used, because it has read everything and can find minutiae buried in places it would take me a long time to reach.
But there’s a huge difference between using it to assist focus, or as a study aide, and offloading the whole act of thinking itself.