From that perspective the BWT implementation isn't very interesting because the BWT is deterministic, so the effect on compression ratio doesn't depend on which implementation was used.
It seemed like it should have been the standard for a lot of desktop applications in the '90s and 2000s.
Its syntax and concepts were considered solid enough that it was selected as the basis of the VHDL hardware description language, which was also successful in its domain.
Vanishingly little was written in Ada as waivers were too easy to get.
So, wherever these waivers were, they certainly weren't in those spaces.
1 - https://www.stroustrup.com/JSF-AV-rules.pdf
2 - https://www.defenseone.com/technology/2025/02/f-35-programs-...
Not just software, entire systems. They built their own datalinks instead of aiming for compatibility with existing datalinks or pushing those existing datalinks to a newer version that could meet their needs.
In principle, that should have been their first (flight test ready) version. Just get the planes talking to an existing datalink network even if they had some next gen datalink target. But this is also a classic DOD project management problem. They aim for the full system instead of staging out a series of iterative versions.
And LM, in their infinite wisdom, hired more developers when they were late and ran their development in shifts. Because everyone knows the best way to get a late project back on track is to hire more people...
That doesn't say anything about the quality of the language, just that the goal for it was unrealistic. If the DoD had pressured Microsoft to support Ada in the same way they supported C++, Visual Basic, and vbscript, they might have been able to pull it off. But with the COTS directive and the utter lack of Ada support from commercial vendors, it was just never going to happen.
C and C++ are still used pretty frequently. I wouldn't say that they failed, but if someone wrote an application in Ada in 2025, I would find that a bit anachronistic.
I suppose that Nvidia's use of Ada and SPARK for self-driving vehicles is anachronistic:
https://www.eenewseurope.com/en/nvidia-drives-ada-and-spark-...
https://nvidia.github.io/spark-process/process/introduction....
Many languages have their great qualities. Whether or not they're outdated is a determination full of biases. Measure the language choice against resources and potential revenue. I'd be happy to write an app in Ada to proclaim its advantages as a sales pitch.
I just don't see Ada used a lot anymore. This isn't a value judgement on it being "good" or "bad": lots of bad languages (like PHP) end up getting very popular, and lots of cool languages (like Idris) kind of languish in obscurity. Don't mistake me as saying popularity is a proper metric for how "good" something is.
When I say "anachronistic", I don't mean it as a bad thing either, just that it's not used a lot anymore. I've literally never heard of anyone writing an Ada application in the last twenty years outside blogs on HN.
One thing I do believe: the quality of software from MSFT has gone down, in part because their business model has gone from providing products to monetizing the users. Their products are just stagnant honeypots to collect data. This is opening a door for the small time dev to try new things, maybe with unpopular toolchains. I've got something that would be great for highlighting Ada's mission critical rep. Price and value discovery aren't dead (yet).
It's still a very modern language which is missing very little in that niche. It's only missing adoption.
I have high hopes for Rust in this space. Using C is fine, using C++ is madness, using Ada is good but fewer available devs.
Rust also solves a lot of issues with C++, and generally, once you get past the "fighting the borrow checker" phase and into the "working with the borrow checker" phase, it has insanely good ergonomics, safety mechanisms, and features. Additionally, Rust has momentum right now.
Also, the main advantage it has over C is safety, and nobody really took that seriously until recently (and it doesn't really have an answer for use-after-free, which is a huge class of bugs).
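To make the use-after-free point concrete: this is exactly the class of bug Rust's borrow checker rules out at compile time, where C leaves it entirely to the programmer. A minimal sketch, for illustration only:

```rust
fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0]; // shared borrow of v's storage

    // v.push(4); // rejected at compile time: pushing may reallocate the
    //            // vector, which would leave `first` dangling -- exactly
    //            // the use-after-free pattern a C compiler accepts silently

    println!("first element: {}", first);
    v.push(4); // fine here: the borrow held by `first` has already ended
    println!("len after push: {}", v.len());
}
```

The same discipline covers freed heap memory, iterator invalidation, and returning references to locals, with no runtime cost.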
I have my own thoughts that I'd be happy to share, but I don't want to spoil your opinion with them.
(This goes for basically anyone who's interested in talking about it, I think it's a fascinating bit of history.)
I wouldn't disagree with either of those things at a high level, but I think in practice, the details matter, and it's more of those details that led to the present state than the high level goals, which certainly were laudable.
At the same time, while major compiler vendors couldn't even get their Ada compilers out the door, with project overruns measured in years, Mitch Kapor's 1-2-3 was written and took over the spreadsheet market (dethroning the incumbent VisiCalc), and it was written in assembler, not even C, and barely had memory to spare on user machines (memory? there was barely address space left to spare). The tradeoffs were authentic, and there was zero enthusiasm for Ada among programmers or entrepreneurs looking to get products launched. The mainframe world could have been different, but the personal computer software market was exploding in a way larger systems never did.
(oh and let me add, all the undefined behaviors back then were very clearly defined (perhaps over a small field of options) and extremely useful, and should have been left that way; shoot out a compiler warning or error if they get your panties in a bunch)
Do you have sources you could point to? I once read that it's a great language mired in a design-by-committee ecosystem. I really liked the language when I tinkered with it a bunch around 2010, but moved on after work pushed me to other languages like C# for GUI stuff. It was the first language I used with simple built-in concurrency.
Ferrous and AdaCore were originally collaborating, but then they parted ways. In my understanding they both largely ship the upstream codebase; I know that the Ferrous folks religiously upstream almost everything, no clue if AdaCore does as well.
https://rust-lang.github.io/fls/
This is effectively a fork of the Rust Reference, made by Ferrous, and laid out in a way that allowed the compiler to be qualified. It now lives at this URL, because it's being adopted by upstream as the spec.
I'm not following what you meant by this though, it seems like a contradiction:
> A language specification is not required to be qualified. The behavior of the compiler needs to be described.
But they're putting work into reviving the language spec, to enable certification? Also, if the source language hasn't been described, then surely the compiler's behaviour hasn't been described.
Or did you mean that their documentation is for the Ferrous flavour of Rust and might not reflect the latest version of the Rust language?
It has already been qualified. Upstream has always wanted a spec. It’s being worked on because it’s desirable, not because it’s blocking safety critical cases.
You’re always going to need to have more than a language spec because you qualify compilers not languages.
> Also, if the source language hasn't been described, then surely the compiler's behaviour hasn't been described.
It has. At least to the degree that regulators find it acceptable.
> Or did you mean that their documentation is for the Ferrous flavour of Rust and might not reflect the latest version of the Rust language?
There is no difference in flavors, but it is true that each version of the spec is for a specific version of the compiler, and so sometimes that will lag a bit. But that’s just how this process works.
Can you point to a production language today that doesn't have a committee leading its development?
C, C++, Rust, Javascript, Python, etc. All have committees leading their development. The only difference with Ada was that, for a long time, that committee was in the DoD (which has plenty of fine engineers, given its practical achievements) instead of ISO/ANSI. And instead of being focused on general purpose, they had a clear domain they prioritized. That's different now, but it's hard to erase a few decades of heritage.
Specifically, I think these three paragraphs near the end are critical:
> I'm reading a great book now called Why People Believe Weird Things, by Michael Shermer, in which the author explains what rational thinking is, and how skepticism is a process. Basically, people believe something because they want to, not because of any scientific arguments you make.
> There are guys out there who dislike Ada, but they do so because they want to, not because of any rational analysis of its merits or flaws. Sometimes even their arguments are factually incorrect, like saying that "Ada was designed by committee," ignoring the fact that Jean vetoed language design arguments that were 12-to-1 against him. It's not unlike creationists who explain the "fact" that evolution violates the 2nd law of thermodynamics. (No, it does not, as any book on freshman physics will tell you.)
> I've explained the reasons why I think Ada is not as popular as C++, and I'd like to hope that it will convince Ada's detractors that Ada isn't so bad after all. But as Robert Dewar pointed out, a person who has made an irrational decision probably isn't going to be swayed by rational arguments!
That is, people aren't really rational. A choice was made to dislike it, it entered into the culture and to this day people dislike it because they think they should dislike it. They don't even spend 5 minutes studying it to see that half of what they've heard (if not more) is flat out wrong. In several Ada discussions on HN people claim its syntax is like COBOL's, for instance. Not just similar in being keyword heavy, but practically the same. Sometimes they even provide Ada "examples" that won't even compile. That's the kind of nonsense that happens when people turn off their brains or refuse to turn on their brains. You see it in many Lisp discussions as well.
Other languages survive being called designed by committee or having ugly syntax. People talk shit about C++ all the time. PHP is still alive despite getting so much hate. However, there are rational reasons why these languages are used, they're just more complicated than beauty of the language itself, and are due to complex market forces, ecosystems, unique capabilities, etc.
I'm not qualified to answer why Ada isn't more popular, but an explanation implying there was nothing wrong with it, only everyone out of the blue decided to irrationally dislike it, seems shallow to me.
This argument eats itself. It's just an accusation that people who disagree with you are irrational, and that their arguments are in bad faith. It's not a valid argument because it doesn't even depend on any context or facts of the actual discussion in which he's using it. It's the definition of cope.
In the end, even if we can't be sure why Ada failed, it failed spectacularly. It had massive institutional backing and never made it past obscurity. I don't know exactly why people disliked it so much, but consider the landscape: everyone already knew C, C was well supported, every single OS was written in C, and so on. Trying to bring in an incompatible Algol-like language (always a popular lineage, hahaha) with sparse-to-nonexistent tooling and very theoretical advantages, with a huge performance disadvantage on the highly constrained computers of the time, was not likely to succeed on its face.
> every single OS was written in C.
No, they weren't. In fact, some were written in Algol-like languages such as Pascal.
So Pascal had one mainstream OS for about 10 years, most of which time it was being phased out.
Your first claim: Every OS was written in C. Your new claim: Most were written in assembly.
Pick a position. If most were written in assembly then it would not have had any impact on the adoption of Ada so why make the original claim?
I would respond to your question but you substantially edited your comment and removed the question. I also notice you removed the claim in your edit about most OSes being written in assembly in the 80s. Obnoxious way to communicate with people, altering your comments while they're replying so their replies look like random comments.
Nonsense. MPW Pascal and Think Pascal were well-supported developer tools, and a lot of third-party developer code was written in them during the '80s. Photoshop (1987) was originally written in Pascal! Apple's Pascal dialect had object extensions that made OOP simpler than with C or standard Pascal.
Pascal started to leave the building circa 1991, when C++ became viable for OOP. Even then, Metrowerks CodeWarrior supported native Pascal compilation for PowerPC in 1993/4.
This report is of interest to you:
https://nap.nationalacademies.org/read/5463/chapter/1#vii
> It is in this context that Assistant Secretary of Defense (Command, Control, Communications, and Intelligence) Emmett Paige, Jr., requested that the National Research Council's Computer Science and Telecommunications Board (CSTB) review DOD's current programming language policy. Convened by CSTB, the Committee on the Past and Present Contexts for the Use of Ada in the Department of Defense was asked to:
> * Review DOD's original (mid-1970s) goals and strategy for the Ada program;
> * Compare and contrast the past and present environments for DOD software development; and
> * Consider alternatives and propose a refined set of goals, objectives, and approaches better suited to meeting DOD's software needs in the face of ongoing technological change.
https://www.militaryaerospace.com/communications/article/167... is an article from 1997 about this, and here's what it has to say:
> Paige says he believes industry engineers will be more likely to accept the benefits of using Ada if DOD leaders recommend, not require, the language. Software engineers, who would rather choose a language based on its merits rather than because of a governmental mandate, historically have resisted the Ada mandate on principle.
and
> Chief complaints about Ada since it first became a military-wide standard in 1983 centered on the perception among industry software engineers that DOD officials were "shoving Ada down our throats."
This is basically the story: The DoD tried to mandate it, people resisted, and made liberal use of the ability to be granted an exception, and so they eventually gave up.
The first link contains much more nuance, some excerpts:
> In decisions affecting adoption of programming languages, non-technical factors often dominate specific technical features. These factors include the broad availability of inexpensive compilers and related tools for a wide variety of computing environments, as well as the availability of texts and related training materials. In addition, grass-roots advocacy by an enthusiastic group of early users, especially in educational and research institutions, often has broad influence on adoption of programming languages. These advantages were largely absent when Ada was introduced in 1980. In contrast, C++ and Java both have achieved widespread acceptance and use. The strong military orientation of the publicity generated for Ada also may have served to alienate significant portions of the academic and research communities.
> Ada has not been widely taught in colleges and universities, particularly compared with Pascal, C, and C++; until recently, even the military academies taught programming in languages other than Ada
> Historically, compilers and other language-specific tools for Ada have been significantly more costly and slower in coming to market than those for C and C++.
> Software engineers are likely to be interested in enhancing skills that they expect to be most valuable in the software engineering marketplace, which is now dominated by commercial opportunities. Thus, programmers have moved quickly to learn Java and Hypertext Markup Language (HTML; used to develop pages for the World Wide Web) because they see these as the next wave, which can carry them to new career opportunities. Similarly, software engineers might avoid using Ada if they see it as limiting their careers.
I did some consulting at a major US car manufacturer, and helped with a coding seminar, mostly in Java. A fair chunk of those developers struggled with a fizzbuzz exercise. All I can say is this: don't leave your baby in the back seat of an autonomous car just to get out and recharge unless the manufacturer faces consequences tantamount to being shut down if anything tragic happens. Of course, even that price is too low.
I don’t think I’ve ever heard Graydon comment on Ada specifically, but early Rust was very different than today’s Rust. Funny enough, I’d argue that early Rust was much closer to Ada than the Rust we ended up with.
One thing that makes Ada and current Rust different is that Ada does a lot more checks at runtime than Rust does, and is more okay with features that have more of a runtime than Rust is.
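A concrete example of the runtime-checks difference: Ada range types (e.g. `type Percent is range 0 .. 100;`) are checked automatically on every assignment, while in Rust you'd typically encode the same constraint by hand. A rough sketch, with all names here illustrative:

```rust
// Hand-rolled stand-in for Ada's `type Percent is range 0 .. 100;`.
// Ada inserts the range check on every assignment; in Rust we write
// the check ourselves at the construction boundary.
#[derive(Debug, Clone, Copy)]
struct Percent(u8);

impl Percent {
    fn new(v: u8) -> Option<Percent> {
        if v <= 100 { Some(Percent(v)) } else { None }
    }
}

fn main() {
    assert!(Percent::new(42).is_some());
    assert!(Percent::new(150).is_none()); // out of range, rejected at runtime

    // Rust does keep some runtime checks of its own: plain indexing
    // panics on out-of-bounds, and `get` returns None instead.
    let xs = [1, 2, 3];
    assert!(xs.get(5).is_none());
}
```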
Another is that Ada doesn't guarantee memory safety at compile time, or at least, it did not at that time. I believe the new SPARK features inspired by Rust that recently landed may bring parity here, but I haven't had time to investigate them yet.
But really, they're just very different languages from each other. There's no reason they can't coexist.
As someone who does not use Ada, I had simply hoped to come away a bit more informed about it.