93 points by r4um 5 hours ago | 17 comments
  • zkmon 2 hours ago
    > There should be some balanced path in the middle somewhere, but I haven’t stumbled across a formal version of it after all these decades.

    That's very simple. The balanced path depends directly on how much of the requirements and assumptions are going to change during the lifetime of the thing you are building.

    Engineering is helpful only to the extent you can foresee the future changes. Anything beyond that requires evolution.

    You are able to comment on the complexity of that large company only because you are standing 50 years in the future from when those things started to take shape. If you were designing it 50 years back, you would end up with the same complexity.

    Nature's answer to it is to consolidate and compact. Everything that falls onto earth gets compacted into solid rock over time, by a huge pressure of weight. All complexity and features are flattened out. Companies undergo similar dynamics driven by pressures over time, not by big-bang engineering design upfront.

  • bestham 5 hours ago
    “A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.” Gall’s Law
    • nasretdinov 2 hours ago
      I think the important part here is "from scratch". Typically when you're designing a new (second, third, whatever) system to replace the old one you actually take the good and the bad parts of the previous design into account, so it's no longer from scratch. That's what allows it to succeed (at least in my experience it usually did).
    • codeflo 2 hours ago
      This is often quoted, but I wonder whether it's actually strictly true, at least if you keep to a reasonable definition of "works". It's certainly not true in mechanical engineering.
      • bestham 37 minutes ago
        The definition of a complex system is the qualifier for the quote. Many systems that are designed, implemented and found working are not complex systems. They may be complicated systems. To paraphrase Dr. Richard I. Cook’s “How Complex Systems Fail”: complex systems are inherently hazardous, operate near the edge of failure, and cannot be understood by analyzing individual components. These systems are not just complicated (like a machine with fixed parts) but dynamic, constantly evolving, and prone to multiple, coincidental failures.

        A system of services that interact, where many of them depend on each other in informal ways, may be a complex system. Especially if humans are also involved.

        Such a system is not something you design. You just happen to find yourself in it. Like the road to hell, the road to a complex system is paved with good intentions.

        • codeflo 23 minutes ago
          Then what precisely is the definition of complex? If "complex" just means "not designed", then the original quote that complex systems can't be designed is true but circular.

          If the definition of "complex" is instead something more like "a system of services that interact", "prone to multiple, coincidental failures", then I don't think it's impossible to design them. It's just very hard. Manufacturing lines would be examples; they are certainly designed.

    • jackblemming 2 hours ago
      People misinterpret this and think they can incrementally build a skyscraper out of a shed.
    • smitty1e 2 minutes ago
    • McGlockenshire 4 hours ago
      Ah, the Second System Effect, and the lesson learned from it.
      • jeffreygoesto 4 hours ago
        But this is about the first systems? I tend to tell people, the fourth try usually sticks.

        The first is too ambitious and ends in an unmaintainable pile around a good core idea.

        The second tries to "get everything right" and suffers second system syndrome.

        The third gets it right but now for a bunch of central business needs. You learned after all. It is good exactly because it does not try to get _everything_ right like the second did.

        The fourth patches up some more features to scoop up B and C prios and calls it a day.

        Sometimes, often in BigCorp: creators move on and it will slowly deteriorate from being maintained...

    • YZF 4 hours ago
      So true.
  • jonathanlydall an hour ago
    Lots of wisdom in this post about some of the realities of software development.

    The core point they're trying to make is that agile (or similar) practices are the incorrect way to approach consolidation of smaller systems into bigger ones when the overall system already works and is very large.

    I agree with their assertion that being forced to address difficult problems earlier in the process results in ultimately better outcomes, but I think it ignores the reality that properly planning a rewrite of a monumentally sized and already-in-use system is practically impossible.

    It takes a long time (years?) to understand and plan all the essential details, but in the interim the systems you're wanting to rewrite are evolving and some parts of the plan you thought you had completed are no longer correct. In essence, the goal posts keep shifting.

    In this light, the strangler fig pattern is probably the pragmatic approach for many of these rewrites. It's impossible to understand everything up front, so understand what you reasonably can for now, act on that, deliver something that works and adds value, then rinse and repeat. The problem is that for a sufficiently large system this will take decades, and few software architects stick around at a single company long enough to see it through.
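
    For what it's worth, here is a minimal sketch of the routing idea behind the pattern (the facade, handlers and routes are invented purely for illustration, not taken from the article): every request keeps entering through one front door, and routes are peeled off to the new system one at a time.

      #include <functional>
      #include <iostream>
      #include <string>
      #include <unordered_map>

      // Hypothetical strangler-fig facade: one front door, with already-rewritten
      // routes handled by the new system and everything else falling through to
      // the legacy monolith. Routes migrate one at a time.
      using Handler = std::function<std::string(const std::string&)>;

      struct StranglerFacade {
          Handler legacy;                                      // the old system
          std::unordered_map<std::string, Handler> rewritten;  // routes migrated so far

          std::string handle(const std::string& route, const std::string& payload) const {
              auto it = rewritten.find(route);
              return it != rewritten.end() ? it->second(payload) : legacy(payload);
          }
      };

      int main() {
          StranglerFacade facade;
          facade.legacy = [](const std::string& p) { return "legacy: " + p; };
          facade.rewritten["/invoices"] = [](const std::string& p) { return "new service: " + p; };

          std::cout << facade.handle("/invoices", "id=42") << "\n";  // already migrated
          std::cout << facade.handle("/reports", "q=2024") << "\n";  // still legacy
      }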

    A final remark I want to make is that, after only a few years of being a full-time software developer, I've found that "writing code" is one of the easiest parts of the job. The hard part is knowing what code needs to be written; this requires skills in effective communication with various people, including other software developers and (probably more importantly) non-technical people who understand how the business processes actually need to work. If you want to be a great software developer, learn how to be good at this.

  • iafan 4 hours ago
    > There are two main schools of thought in software development about how to build really big, complicated stuff.

    > The most prevalent one, these days, is that you gradually evolve the complexity over time. You start small and keep adding to it.

    > The other school is that you lay out a huge specification that would fully work through all of the complexity in advance, then build it.

    I think AI will drive an interesting shift in how people build software. We'll see a move toward creating and iterating on specifications rather than implementations themselves.

    In a sense, a specification is the most compact definition of your software possible. The knowledge density per "line" is much higher than in any programming language. This makes specifications easier to read, reason about, and iterate on—whether with AI or with peers.

    I can imagine open source projects that will revolve entirely around specifications, not implementations. These specs could be discussed, with people contributing thoughts instead of pull requests. The more articulated the idea, the higher its chance of being "merged" into the working specification. For maintainers, reviewing "idea merge requests" and discussing them with AI assistants before updating the spec would be easier than reviewing code.

    Specifications could be versioned just like software implementations, with running versions and stable releases. They could include addendums listing platform-specific caveats or library recommendations. With a good spec, developers could build their own tools in any language. One would be able to get a new version of the spec, diff it with the current one, and ask AI to implement the difference, or discuss what is needed for you personally and what is not. Similarly, it would be easier to "patch" the specification with your own requirements than to modify ready-made software.

    Interesting times.

    • _dark_matter_ 3 hours ago
      Iceberg is, primarily, a spec [0]. It defines exactly what data is stored and how it is interacted with. The community debates broadly on spec changes first; see a recent one on cross-platform SQL UDFs [1].

      We have yet to see a largely LLM-driven language implementation, but it is surely possible. I imagine it would be easier to tell the LLM to instead translate the Java implementation to whatever language you need. A vibe-coded implementation could do major damage to a company's data.

      [0] https://iceberg.apache.org/spec/ [1] https://lists.apache.org/thread/whbgoc325o99vm4b599f0g1owhgw...

      • iafan 2 hours ago
        If I had a spec for something non-trivial, I probably would ask AI to create a test suite first. Or port tests from an existing system since each test is typically orders of magnitude easier to rewrite in any language, and then run AI in a loop until the tests pass.
    • simianwords 3 hours ago
      > I can imagine open source projects that will revolve entirely around specifications

      This is a really good observation and I predict you will be correct.

      There is a consequence of this for SaaS. You can imagine an example SaaS that one might need to vibecode to save money. The reason it's not possible now is not because Claude can't do it, it's because getting the right specs (like you suggested) is hard work. A well-written spec will not only contain the best practices for that domain of software but also all the legal compliance BS that comes along with it.

      With a proper specification that is also modular, I imagine we will be able to see more vibecoded SaaS.

      Overall I think your prediction is really strong.

    • energy123 3 hours ago
      Interested in ideas for this. I've mulled over different compact DSLs for specs, but unstructured (beyond file-specific ownership boundaries) has served me better.
      • simianwords 3 hours ago
        I think it has to be modular and reusable. Like a GDPR compliance spec should be open-sourced and reused by all SaaS specs.
    • fc417fc802 2 hours ago
      There are parallels of thought here to template and macro libraries.

      One issue is that a spec without a working reference implementation is essentially the same as a pull request that's never been successfully compiled. Generalization is good but you can't get away from actually doing the thing at the end of the day.

      I've run into this issue with C++ templates before. Throw a type at a template that it hasn't previously been tested with and it can fall apart in new and exciting ways.
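
      A minimal illustration of that kind of breakage (the function and the types are invented for the example): a template that was only ever exercised with int has its real requirements left implicit, and a previously untested type either fails to compile or quietly changes the semantics.

        #include <iostream>
        #include <string>
        #include <vector>

        // A "generic" average that was only ever tested with int. Its real
        // requirements (a zero-like T{}, an accumulating operator+, division
        // by a count) are implicit rather than stated anywhere.
        template <typename T>
        T average(const std::vector<T>& xs) {
            T sum{};
            for (const auto& x : xs) sum = sum + x;
            return sum / xs.size();
        }

        int main() {
            std::vector<int> ints{2, 4, 6};
            std::cout << average(ints) << "\n";  // fine: prints 4

            // std::vector<std::string> words{"ab", "cd"};
            // average(words);
            // operator+ still compiles (it now means concatenation!), but the
            // division by xs.size() does not -- the template falls apart only
            // when this previously untested type is thrown at it.
        }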

    • polyglotfacto 2 hours ago
      You can look at the Web as a starter: https://html.spec.whatwg.org/#history-2

      > The WHATWG was based on several core principles, (..) and that specifications need to be detailed enough that implementations can achieve complete interoperability without reverse-engineering each other.

      But in my experience you need more than a spec, because an implementation is not just something that implements a spec, it is also the result of making many architectural choices in how the spec is implemented.

      Also, even with detailed specs AI still needs additional guidance. For example, a couple of weeks ago Cursor unleashed thousands of agents with access to web standards and the shared WPT test suite: the result was total nonsense.

      So the future might rather be like a Russian doll of specs: start with a high-level system description, and then support it with finer-grained specs of parts of the system. This could go down all the way to the code itself: existing architectural patterns provide a spec for how to code a feature that is just a variation of such a pattern. Then whenever your system needs to do something new, you have to provide the code patterns for it. The AI is then relegated to its strength: applying existing patterns.

      TLA+ has a concept of refinement, which is kind of what I described above as Russian dolls but only applied to TLA+ specs.

      Here is a quote that describes the idea:

      There is no fundamental distinction between specifications and implementations. We simply have specifications, some of which implement other specifications. A Java program can be viewed as a specification of a JVM (Java Virtual Machine) program, which can be viewed as a specification of an assembly language program, which can be viewed as a specification of an execution of the computer's machine instructions, which can be viewed as a specification of an execution of its register-transfer level design, and so on.

      Source: https://cseweb.ucsd.edu/classes/sp05/cse128/ (chapter 1, last page)

  • qznc an hour ago
    I find the title misleading. While the author is "thinking about systems", this is not about https://en.wikipedia.org/wiki/Systems_thinking
  • leoff 24 minutes ago
    >In a sense, it is the difference between the way an entrepreneur might approach doing a startup versus how we build modern skyscrapers. Evolution versus Engineering.

    There are core differences in software engineering compared to construction work:

    - making changes is often cheaper

    - we might not know beforehand everything that needs to be built, especially the unknown unknowns

    I would still agree that the truth is somewhere in between, but I would argue that, for software, it's closer to the evolutionary approach.

  • itay-maman 34 minutes ago
    Wow that's a good piece. It propelled me into writing this: https://dev.to/itay-maman/the-elephant-in-the-room-systems-t...

    In short: the tension described in "systems thinking" is the same as the tension between "spec-driven" and "iterative prompting".

  • repelsteeltje an hour ago
    > Also, the other side of it is that evolutionary projects are just more fun. I’ve preferred them. You’re not loaded down with all those messy dependencies. Way fewer meetings, so you can just get into the work and see how it goes. Endlessly arguing about fiddly details in a giant spec is draining, made worse if the experience around you is weak.

    IMO the problem isn't discussing the spec per se. It's that the spec doesn't talk back the way actual working code does. On a "big upfront design" project, there is a high chance you're spending a lot of time on moot issues and irrelevant features.

    Making a good spec is much harder than making working software, because the spec may not be right AND the spec may not describe the right thing.

    • Aditya_Garg an hour ago
      Yeah I’ve noticed the same problem. Do you know any resources for writing good specs?
      • repelsteeltje an hour ago
        Nope, sorry. I know I'm not good at it.

        I suppose it's primarily a matter of experience. And as the article alludes, it's very important to deeply understand the subject matter. I highly value some of my non-programmer colleagues responsible for documentation, but I can't put my finger on what exactly they brought to the table that made their prose exceptionally good (clear, concise, spot on)...

  • praptak 2 hours ago
    Show me an example of a large complex software system built from spec rather than evolved.
    • iafan 2 hours ago
      Everything that touches hardware, for example: the Bluetooth stack, HDMI, you name it.

      Everything W3C does. Go is evolving through specs first. Probably every other programming language these days.

      People already do that for humankind-scale projects where there have to be multiple implementations that can talk to each other. Iteration is inevitable for anything that gains traction, but it still can be iteration on specs first rather than on code.

  • Aldipower an hour ago
    Is this philosophising about software or does it have something to do with real life?
  • ArchieScrivener 4 hours ago
    The Evolution method outlined also seems born from the Continuous Delivery paradigm that was required for subscription business models. I would argue Engineering is the superior approach, as the Lean/Agile methods of production were born from physical engineering projects whose end result was complete. Evolution seems to be even more chaotic because an improper paradigm of 'dev ops' was imposed rather than emerging organically, as one would expect with an evolving method.

    AI assistance would seem to favor the engineering approach, as the friction of teams and personalities is reduced in favor of quick feasibility testing and complete planning.

  • qznc an hour ago
    > There are two main schools of thought in software development about how to build really big, complicated stuff.

    That feels like a straw man to me. This is not a binary question. For each small design decision you have a choice about how much uncertainty you accept.

    There are no "two schools". There is at least a spectrum between two extremes, and no real project was ever at either of the very ends of it. Actually, I don't even think spectrum is the proper word, because this is not just a single dimension. For example, speed and risk often correlate, but they are also somewhat independent and sometimes they anti-correlate.

  • readthenotes1 5 hours ago
    A major factor supporting evolution over big up-front design is the drift in system requirements over time. Even on large military-like projects, apparently there's "discovery"--and the more years that pass, the more requirements change.
    • zppln 2 hours ago
      This isn't my experience. Requirements tend to settle over time (unless they're stupidly written). Users tend to like things to stay the same, with perhaps some improvement to performance here and there.

      But if anything, all development is the search for the requirements. Some just value writing them down.

    • YZF 4 hours ago
      Even if the requirements are indeed fixed your understanding of the problem domain evolves.
  • qwertyuiop_ 4 hours ago
    Software cannot be built like skyscrapers, because the sponsors know about the malleability of the medium and treat it like a lump of clay that, by adding water, can be shaped into something else.
    • ako 4 hours ago
      You're mixing up design and manufacturing. A skyscraper is first completely designed (on paper, cad systems, prototypes), before it is manufactured. In software engineering, coding is often more a design phase than a manufacturing phase.

      Designers need malleability, that is why they all want digital design systems.

    • baxtr 3 hours ago
      Funny you bring up the clay analogy.

      It was discussed here intensively just 2 days ago.

      https://news.ycombinator.com/item?id=46881543

    • YZF 4 hours ago
      But software is in fact not very malleable at all. It's true the medium supports change (it's just a bunch of bits), but change is actually hard and expensive, perhaps more so than in other mediums.
      • jbl0ndie 3 hours ago
        I'd argue it's more malleable than a skyscraper.

        How rapidly has business software changed since COVID? Yet how many skyscrapers remain partially unoccupied in big cities like London, because of the recent arrival of widespread hybrid working?

        The buildings are structurally unchanged and haven't been demolished to make way for buildings that better support hybrid working. Sure, office fit-outs are more oriented towards smaller simultaneous attendance, with more hot desking. A new industry boom around team-building socials has also arrived. Virtual skeet shooting or golf, for example.

        On the whole, engineered cities are unchanged, their ancient and rigid specifications lacking the foresight to include the requirements that accommodate hybrid working. Software meanwhile has adapted and as the OP says, evolved.

      • ako 4 hours ago
        With LLMs it's becoming very malleable.
  • nobodywillobsrv 35 minutes ago
    This misses the point: they are ignoring that evolution is literally the way you build things. There is no other way. You don't really know what is actually needed or what might work. You try things and then compress later. If you can try bigger things, bigger leaps, great.