It is easy to build "home cooked" (by his definition) software when a small number of people are building for a small number of people. Old Visual Basic programs built for a single version of Windows will do just fine, and that works well for some use cases. But if you are building software that has to work on thousands to millions of machines, and fulfill those users' needs, you will need to build a factory to build that software.
The factory, in this case, is the infrastructure necessary to turn the R&D (you and me) into the repeatable product (that is, your app). Factories need a lot of maintenance. If nobody is buying your product, you need to change the factory to produce outputs that people will buy. If people have bought your product (let's say... on subscription), then you need to keep the factory maintained well enough to keep producing the product that people are happy to buy.
Some things will be like Tarsnap and change relatively little. That's great. A small shop needs less maintenance. A bigger shop, a factory serving more users with wider needs, will need more complicated products, and that means more maintenance.
But yes, if you don't need a return on investment, you don't need to react to customers' needs, and therefore you can build a spec first and let it change little. (Or you build something as niche and as good as Tarsnap!)
> Fast Food
>
> Here, we develop in "agile" sprints. Working software is developed at the fastest pace possible, and all bugs are to be fixed later.
> Home Cooked
>
> Here, things are slower, more thoughtful. More waterfall-y.
While "sprint" is a term that sounds like "fastest pace possible," that is not what it means. And a key part of waterfall vs. agile is that waterfall is NOT more thoughtful; it is all planned up front.
Both methodologies can create bugs, and both can deliver features faster than scaling can be thought through or testing can keep up.
If we remove the quotes from "agile," we actually get slower and more thoughtful. A key part of that is measuring (training, interviewing, analyzing). An agile process should build a feature, release it, interview users, analyze system behavior, and then iterate: improving support for users' goals, adding appropriate scale, and removing unexpected errors or behavior.
I feel like there's this no-true-Scotsman thing going on with agile. Whenever someone describes their actual experiences with agile, there's always at least one person who speaks up, decries it as not real agile, and explains what agile should be.
At this point I don't care what agile should be. I just don't want management shoving agile down my throat anymore. I've yet to see it actually improve productivity for any team I've been on. Real agile must be exceedingly rare.
It's not mysterious or confusing. The original definition is at https://agilemanifesto.org/
I've heard that one of the benefits of agile is identifying blockers and encouraging collaboration, but I saw much better results from assuming you've hired intelligent adults with work ethic and letting them reach out and collaborate as needed. Daily standups, sprints, boards, planning, etc. are great in a low-trust environment where you can't be sure people are doing the right things. But if you've hired self-directed people, that stuff just gets in the way.
I've known talented devs who are great people yet still need more oversight than you describe. Usually they are ~5 years off from being full-trust, yet still valuable team members. Yes, they benefit from daily standups.
This is often how a functional safety project works - specification is everything, no point writing any code until you know exactly what you plan to build.
Due to traceability requirements it becomes a very waterfall approach.
However, one way of improving the methodology and its outcomes is an executable specification that can be converted to code and documentation at the very last minute, meaning you can keep specifying right up until you need to deploy.
The executable specification concept comes from IEC 61499.
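A minimal sketch of what that can look like, assuming Rust rather than IEC 61499's actual function-block notation (the states, events, and names here are all hypothetical): the specification is a plain data table, and "converting it to code" is just interpreting that table, so the executable behavior can never drift from the documented spec.

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum State { Closed, Open, Faulted }

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum Event { OpenCmd, CloseCmd, Fault }

// The specification itself: every legal (state, event) -> state transition.
// Anything not listed here is, by definition, out of spec.
const SPEC: &[(State, Event, State)] = &[
    (State::Closed, Event::OpenCmd,  State::Open),
    (State::Open,   Event::CloseCmd, State::Closed),
    (State::Closed, Event::Fault,    State::Faulted),
    (State::Open,   Event::Fault,    State::Faulted),
];

// "Converting the spec to code" is interpreting the table at the last
// minute; the same table can also be dumped as traceability documentation.
fn step(current: State, event: Event) -> Option<State> {
    SPEC.iter()
        .find(|(s, e, _)| *s == current && *e == event)
        .map(|(_, _, next)| *next)
}

fn main() {
    let mut state = State::Closed;
    for event in [Event::OpenCmd, Event::Fault, Event::CloseCmd] {
        match step(state, event) {
            Some(next) => {
                println!("{:?} --{:?}--> {:?}", state, event, next);
                state = next;
            }
            // An unspecified transition is rejected, not guessed at.
            None => println!("{:?} --{:?}--> rejected by spec", state, event),
        }
    }
}
```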
Just for fun, let me pose this: where would a Michelin-star chef perform better? What's the equivalent of fast casual or fine dining? Does the type of cuisine influence the outcomes?
Any analogy looks like a bad analogy if you stretch it beyond the author's intent and then act like that's somehow their fault or problem.
I think it's interesting to try to think about including the chef in the analogy, but I don't want to entertain you because I feel like the tone of your post was unnecessarily rude, essentially: "duuuh, did you even read your stuff before pressing post, idiot."
I love to cook and have fed many, yet I've also come to accept that most people never meet the chef who made their meal.
Stay Healthy!
> the pyramids have had 100% up-time with no human maintenance
It helps that there is hardly any rain in the desert. Water would foul up the structure in a matter of years.
> dependencies are added like seasoning. Hundreds of packages. Thousands of foreign lines of code make their way onto your software routinely
True, and I think the npm/JavaScript ecosystem exhibits the worst of this behavior.
> Problems are expected and fixed on the fly, somewhat haphazardly: ... push a patch ... code scanning ... dependa-bot ... DevOps
True, and I remember when software releases were treated with much more care and quality, because shipping a CD or game cartridge costs a lot more than a download.
> More waterfall-y.
That's not a good thing. Having a short feedback cycle from implementation back to design is a big win for software development. Waterfall is from the dark ages, when we didn't know better.
> This is where minimalist software is built.
Agreed.
> No build process
That doesn't make any sense, unless you're writing machine code directly in hexadecimal.
> no outside dependencies
I agree with minimizing dependencies, but there's no such thing as no dependencies. I dare you to avoid any of these libraries: HTTP (especially HTTP/2 and 3 which are much harder than HTTP/1), TLS/SSL, TCP/IP, hardware drivers, compilers, data codecs like DEFLATE, multimedia codecs like AVC and AAC.
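To underline the point, below is roughly the most dependency-free network client you can write in Rust, and it still leans on the standard library, the system resolver, and the OS TCP/IP stack. This is a sketch, with example.com as a placeholder host; real HTTP (let alone HTTP/2 or TLS) is far beyond a hand-rolled client.

```rust
use std::io::{Read, Write};
use std::net::TcpStream;

fn main() -> std::io::Result<()> {
    // DNS resolution and the TCP handshake both happen here, in code we
    // didn't write: the standard library and the operating system.
    let mut stream = TcpStream::connect("example.com:80")?;

    // Hand-rolled HTTP/1.0 request; the server closes the connection
    // when it's done, which is what lets read_to_string terminate.
    stream.write_all(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")?;

    let mut response = String::new();
    stream.read_to_string(&mut response)?;
    println!("{}", response.lines().next().unwrap_or(""));
    Ok(())
}
```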
> There's no need to be constantly refactoring things, since everything was designed up to spec beforehand. ... The catch? Writing such a spec costs you over 80% of your engineering time, and you'll have nothing to show for it until day 100.
This is a fallacy. I agree with and like this talk, in which the speaker Glenn Vanderburg argues that the software is the specification, that construction is done by compilers, and that most analogies to physical engineering are completely wrong. https://www.youtube.com/watch?v=NP9AIUT9nos
> The thing is, most humans are laughably bad at architecting software without actually writing it first.
There's no shame in discovering the software architecture as you go along. If you already knew the architecture beforehand, that means you're very familiar with the problem space already, and you probably should've written a framework to avoid repetitive work. In a sense, software development biases toward novel exploratory work rather than routine work, and that's why it's challenging.
> this all stems from a certain greed software developers have ... Meanwhile other engineering fields are far more humble.
Nonsense; greed is human. In all fields of engineering, the general principle is to do more with less. As the saying goes, any fool can build a bridge, but only an engineer can build a bridge that barely stands. All engineers want more features for less cost, and software is no different. The difference is that in most engineering, there are more physical constraints, more templates to apply, more repetitive work. And because of that, the norms are well-established in traditional engineering. You don't look at a big McMansion and call it "greedy" because it's just the norm. You don't look at a sprawling highway interchange with 4 levels of ramps and call it "greedy" because it's socially acceptable.
> Even your measly human body doesn't need weekly patches
Have you looked at the list of bugs for humans? Allergies, back pain, appendix, aging, various birth defects, cancer, etc. If anything, life is the ultimate example of spaghetti coding and monkey-patching. Look at how vertebrate embryos (human, chicken, fish) all look the same in the first few weeks of life, then they diverge as various body parts and limbs are grown or shrunken.
> Sadly, a good home-cooked meal is hard to find nowadays. Fast food is just too good to beat.
It's weird when people praise home-cooked meals, because I've found restaurants that have great food. Heck, I've been to various traditional sit-down restaurants where they bring you the food in one minute, which is faster than standing in line at McDonald's.
> Most software today just feels bloated and trashy, even if the experience is drug-like.
Most software does feel bloated and trashy to me too. I especially find that the more popular a piece of software is, the trashier it is. A few decades ago, for example, I found MSN Messenger popular but insufferable (big program size, laggy UI, lots of attention-grabbing features), whereas IRC was an underground community whose software was very well-behaved (small, not attention-grabbing).
Overall, I agree with the ideas that simpler software is better, that there's too much cargo-culting in the industry, and that piling on complexity and dependencies is bad. But your article wanders all over the place and doesn't hit the right points.
It's also weird when people praise home-cooked meals and even restaurants but diss McDonald's. As I'm fond of saying, the difference between cooking and process engineering is that the latter actually cares about quality of outcome.
There's a lot of stuff you simply can't make at home, because the necessary equipment is beyond what's allowed in residential conditions (think pizza ovens, which run 100-200°C hotter than consumer ovens max out). On top of that, standard cooking appliances have little to no means of quality and consistency control; it's only industry that bothers with such things.
1. The API isn't quite what you need, so you add so much extra logic to integrate it that you might as well have reinvented the wheel and had a proper solution. Plus, you get all the bugs and performance problems that come with bolting extra code onto a domain mismatch.
2. The API covers cases you don't care about, opening you up to bugs and performance problems from the project's complexity, while not actually saving much time, since the cases you do care about can be handled more simply.
3. Both (1) and (2) are amplified as you change the project going forward. If you wrote the data structure, you can easily tack on the modifications you need. If not, you end up re-implementing it anyway once you need modifications (or bolting on even more hacks), and the 3rd-party code isn't a good starting point because of (2). A sketch of (1) follows this list.
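As a hypothetical illustration of (1): say the actual need is "keep the last N samples and average them." A general-purpose cache or collections API would want adapter code for eviction order, capacity policy, and iteration; the home-cooked version is smaller than the adapter would be.

```rust
// Home-cooked, exactly-what-we-need version: a fixed-capacity ring
// buffer that keeps the last N samples. Assumes capacity > 0.
struct LastN {
    buf: Vec<f64>, // written circularly
    next: usize,   // slot to overwrite next
    len: usize,    // occupied slots, saturates at capacity
}

impl LastN {
    fn new(capacity: usize) -> Self {
        LastN { buf: vec![0.0; capacity], next: 0, len: 0 }
    }

    fn push(&mut self, sample: f64) {
        self.buf[self.next] = sample;
        self.next = (self.next + 1) % self.buf.len();
        self.len = (self.len + 1).min(self.buf.len());
    }

    fn average(&self) -> f64 {
        if self.len == 0 {
            return 0.0;
        }
        // Order doesn't matter for an average, so no unwrapping needed.
        self.buf[..self.len].iter().sum::<f64>() / self.len as f64
    }
}

fn main() {
    let mut window = LastN::new(3);
    for sample in [1.0, 2.0, 3.0, 4.0] {
        window.push(sample); // 1.0 gets overwritten once full
    }
    println!("{}", window.average()); // 3.0, the mean of 2.0, 3.0, 4.0
}
```

And when requirements change (point 3), say to a weighted average, it's a three-line edit instead of a fight with someone else's eviction policy.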
For something like cryptography, with all the ways constant-time execution and whatnot can bite me in the ass, I'm happy to use a 3rd-party dependency. For almost everything else I'll home-cook it. Networking starts at io_uring. CPU float-intensive software starts with (depending on the domain) a Tensor type capable of lazy, fused operations [0].
That's just a heuristic I use, and from time to time I'll definitely take a shortcut. More often than not, though, I'll be replacing the shortcut in less than a year, and the value of having the feature sooner is only sometimes worth the lost productivity. The driving factor is just that, as I learn more, it's easier and easier to home-cook those sorts of things, whereas the costs induced by 3rd parties haven't really diminished.
[0] I'm curious what a language designed around cache obliviousness and reducing data dependencies might look like. There are some patterns I've used which compose nicely, but they only solve part of the problem. Even custom vector languages like ISPC require a fair bit of work from the programmer.
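On [0], here is a minimal sketch of the lazy, fused idea, assuming Rust and entirely hypothetical names (this is not any particular library's API): operations build an expression tree, and materializing walks the tree once per element, so a * b + c runs as a single loop with no intermediate buffers.

```rust
// Expressions over f64 slices; nothing computes until eval().
enum Expr<'a> {
    Leaf(&'a [f64]),
    Add(Box<Expr<'a>>, Box<Expr<'a>>),
    Mul(Box<Expr<'a>>, Box<Expr<'a>>),
}

impl<'a> Expr<'a> {
    // Compute one element; the recursion fuses the whole chain of ops.
    fn at(&self, i: usize) -> f64 {
        match self {
            Expr::Leaf(data) => data[i],
            Expr::Add(l, r) => l.at(i) + r.at(i),
            Expr::Mul(l, r) => l.at(i) * r.at(i),
        }
    }

    // Materialize at the end: one pass, one output allocation.
    fn eval(&self, len: usize) -> Vec<f64> {
        (0..len).map(|i| self.at(i)).collect()
    }
}

fn main() {
    let (a, b, c) = ([1.0, 2.0], [3.0, 4.0], [10.0, 20.0]);
    // a * b + c, built lazily as a tree.
    let expr = Expr::Add(
        Box::new(Expr::Mul(
            Box::new(Expr::Leaf(&a)),
            Box::new(Expr::Leaf(&b)),
        )),
        Box::new(Expr::Leaf(&c)),
    );
    println!("{:?}", expr.eval(2)); // [13.0, 28.0]
}
```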
If you are a good programmer/independent thinker, you will tend to just write software from first principles with limited tooling. It's leaner and faster to build that way, and it's usually more effective software. 3rd-party libraries, though easy to drop in, bring their own interfaces and paradigms. They require maintenance and security updates. They are often written by individuals who care little about performance.
There are TONS of exceptions to what I am saying above, and tons of great packages that I use frequently. But if your default is to try to solve a problem by installing a package, now you have two problems.
For me, that's the best use of time without reinventing the wheel, though this is specifically for network apps and for addressing pain points of existing SaaS products.