> (..) it implies a product manager, a designer, two engineers (one senior) and four to six months of design, coding and testing. Plus maintenance.
What I find interesting is that ten years ago the author could have written: our intern spent the weekend at the office and ended up creating 350k of billable work.
Such a statement would have been instantly ridiculed as unprofessional, but somehow with AI it works.
It shows Jaron Lanier is right when he says that AI makes people adjust themselves to a lower level of intelligence (or something along those lines).
> People don’t judge A.I. code the same way they judge slop articles or glazed videos. They’re not looking for the human connection of art. They’re looking to achieve a goal. Code just has to work.
Yes indeed. And what the author leaves aside is that for code to work, it requires a level of conceptual integrity beyond what vibe coding can offer.
He also forgets to mention that the future is not a binary choice between keeping on building slow software by hand and going full vibe code: there is also the option of (a new kind of?) professionals using AI to be more productive, while ensuring the output is up to standard.
> It might fail a company’s quality test, but it would meet every deadline.
This brings us back to the intern frantically coding over the weekend. The problem is as old as software itself; AI merely compounds it by being even faster than a frantic human coder, but it's not new. The industry could have thrown quality standards out of the window long before LLMs came around.
I also dislike how the author seems to imply that quality software requires all this bureaucracy. I mean, what about open source, for example?
> the direct descendant of NeXT’s software is what’s running on Macs and iPhones in 2026. In software, sharp change is to be avoided at all costs. The risk is just too high.
I think he got it backwards here again. At least half of the reason people are willing to pay a premium for Apple products is that the software just works, and that is because it is built on a strong foundation going all the way back to when Jobs decided to spend all his cash on building the best computer.
Some additional thoughts at https://medium.com/@polyglot_factotum/on-what-ai-does-not-di...