We take this for granted now, but at the time it was revolutionary. In part that's because we've standardized things like Unicode and IEEE 754, but nowadays most of our languages also encourage portability. We think very little of moving an application from Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess), but back when Cobol was being created, you normally threw your programs away (“reprogramming”) when you went to a new machine.
I haven't used Cobol in anger in 50 years (40 years since I even taught it), but for that emphasis on portability, I am very grateful.
Rather than rejecting such features, COBOL was just slower to adopt them, because of conservatism, inertia, and its use in legacy systems. But there are 20+ year old COBOL compilers that support full OO (classes, methods, inheritance, etc.).
You need special custom numeric types to come even close in, say, Java or C++ or any other language.
I guess you mean:
>digest -> digits
>loosing -> losing
Is that the same as BCD? Binary Coded Decimal. IIRC, Turbo Pascal had that as an option, or maybe I am thinking of something else, sorry, it's many years ago.
Sounds interesting. Is there anywhere you know I can read about it, or is there something specific I can search for? All results I'm getting are unrelated.
1100 in “regular” binary is 12 in decimal.
0001 0010 in BCD is 12 in decimal.
i.e., BCD is an encoding.
High precision numbers are more akin to the decimal data type in SQL or maybe bignum in some popular languages. It is different from (say) float in that you are not losing information in the least significant digits.
You could represent high precision numbers in BCD or regular binary… or little endian binary… or trinary, I suppose.
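To make the float comparison concrete, here is a quick Python sketch (using the standard decimal module as a stand-in for SQL's DECIMAL or COBOL's fixed-point picture clauses):

    from decimal import Decimal, getcontext

    # A binary float loses information in the least significant digits:
    print(0.10 + 0.20)                        # 0.30000000000000004

    # A decimal type keeps every digit exactly, the way COBOL's fixed-point
    # fields or SQL's DECIMAL do:
    print(Decimal("0.10") + Decimal("0.20"))  # 0.30

    # And the working precision can be widened when the numbers get big:
    getcontext().prec = 50
    print(Decimal("12345678901234567890.12") * Decimal("1.19"))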
IBM z Systems Processor Optimization Primer
indeed, it's exactly BCD arithmetic that is part of the standard, with a fixed number of digits and a fixed decimal point position
and yes, Turbo Pascal had some limited support for them.
you needed them in the 1990s for data exchange with banks in Germany ("Datenträgeraustauschformat", data medium exchange format). one of my first coding gigs was automatic collection of membership fees. the checksum for the data file was the sum of all bank account numbers and the sum of all bank ID numbers (and the sum of all transferred amounts)... trivial in Cobol, not so much in Turbo C++ :-)
I wasn't aware of the BCD support in Turbo Pascal... those were the days :-D
The newest attempt seems to be revolving around WASM, which should make language interoperability across many languages possible if they finally get the Component Model (I think that's what they are calling it) ready.
And many of them can target WASM now too.
I am not really an expert but here is my best shot at explaining it based on a 5 minute web search.
Cobol/Fortran were designed to run off punch cards; specific columns of the punch card were reserved for specific tasks, things like the sequence number, whether the line is a comment, and continuation lines.
https://en.wikipedia.org/wiki/COBOL#Code_format
https://web.ics.purdue.edu/~cs154/lectures/lecture024.htm
In Python the indentation is a sort of enforced code style guide (sort of like Go's refusal to let you import unused modules): by making the indentation you would normally write anyway part of the block syntax, all Python programs have to have high-quality indentation, whether the author wants to or not.
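A tiny example of what "indentation is the block syntax" means in practice:

    def describe(n):
        if n % 2 == 0:
            print("even")   # part of the if-block purely because of the indent
        print("done")       # dedented, so it runs on every call

    describe(3)             # prints only "done"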
Columns seven through 72 were for your code
My main beef, however, is that the last sentence in the section seems to suggest that the birth of Haskell killed SML on the vine because suddenly everybody only wanted pure, lazy FP. That's just wrong. The reality is that these two branches of Functional Programming (strict/impure and lazy/pure) have continued to evolve together to the present day.
F# was – from the start – a functional language designed specifically for the .NET Framework Common Language Runtime. Whenever OCaml and CLR diverged in how they did things, they went the CLR route.
(See e.g. https://entropicthoughts.com/dotnet-on-non-windows-platforms... for more information, or the Don Syme history of F#.)
Every Xmas when people are picking up new languages for Advent of Code (which, like me, he does most years) I wonder if he'll pick Rust and go "Oh, that's nice" - I was going to write a relatively contemporary spoiler here but instead let's say - it's like watching an Outer Wilds player staring at the starting night sky for their fifth or hundredth time, wondering if they're about to say "Oh! Why is that different each time?". No. Maybe next time?
Sorry it may not have been clear, I was comparing the experience of knowing he might love Rust (or not) but not knowing if he'll decide to learn it - against the experience of watching unspoiled people playing a discovery game such as Outer Wilds where you know what they don't know yet and so you're excited to watch them discover it. I dunno that's maybe not an experience most people have.
If you either enjoy learning new languages or have a purpose for which Rust might be applicable I encourage you to try it. As you're an F# user it won't be as revelatory as it is for someone with say only C as a background, but on the other hand if you have no bare metal experience it might also be a revelation how fast you can go without giving up many of the nice things you're used to.
If you're a polyglot you probably won't encounter much that's new to you because Rust never set out to be anything unprecedented, I've heard it described as an "industrialization of known best practice" and it's ten years since Rust 1.0 drew a line in the sand.
I put the blame solely on the management of Borland. They had the world-leading language, and went off to C++ and in search of "Enterprise" instead of just riding the wave.
When Anders gave the world C#, I knew it was game over for Pascal, and also Windows native code. We'd all have to get used to waiting for compiles again.
No kid or hobbyist or person just learning was spending $1400+ on a compiler. Especially as the number of open-source languages and tools was increasing rapidly by the day, and Community Editions of professional tools were being released.
Sure they were going for the Enterprise market money, but people there buy based on what they're familiar with and can easily hire lots of people who are familiar to work with it.
Last I looked they do have a community edition of Delphi now, but that was slamming the barn door long after the horses had all run far away and the barn had mostly collapsed.
Would I be wrong in saying that SQL has what feels to me to be a very COBOL-y syntax? By which I mean, I know it is not directly related to COBOL, but someone definitely looked at COBOL's clunky attempt at natural language and said "that, I want that for my query language".
Pascal, particularly the Delphi/Object Pascal flavor, is also still in widespread use today.
edit: for ancient Greek to become a dead language, will we be required to burn all of the books that were written in it, or can we just settle for not writing any new ones?
Same with a programming language - if no one is writing code in it, it's dead.
As an aside, the article you linked to is pretty obvious AI slop, even aside from the image ("blockchin infarsucture" and all). Some of the details, like claims that MIT is offering COBOL programming classes or that banks are using COBOL to automatically process blockchain loan agreements, appear to be entirely fabricated.
No.
You have to put this relative to projects started in other languages, at which point the number of new projects started in COBOL is even less than a rounding error; it probably wouldn't result in anything other than 0 with a float.
(Apart from the wild terminology. File modes are called "moods", and coincidentally, ALGOL 68's three standard "channels" (i.e. files) are "stand in", "stand out", and "stand back" -- I kid you not, close enough to 'stdin', 'stdout'.)
When I was in grad school in the late 70s, there was a major competition to design a DoD-mandated language, to be used in all DoD projects. Safety and efficiency were major concerns, and the sponsors wanted to avoid the proliferation of languages that existed at the time.
Four (I think) languages were defined by different teams, DoD evaluated them, and a winner was chosen. It was a big thing in the PL community for a while. And then it wasn't. My impression was that it lost to C. Ada provided much better safety (memory overruns were probably impossible or close to it). It would be interesting to read a history of why Ada never took off the way that C did.
(1) It was very expensive to license at a time when C was virtually free.
(2) It was a complicated language at a time when C was (superficially) simple. This made it harder to port to other platforms, harder to learn initially, etc.
(3) All major operating systems for the PC and Mac happened to be written in C.
Ada had virtually nothing going for it except being an amazingly well-designed language. But excellence is not sufficient for adoption, as we have seen repeatedly throughout history.
Today? Virtually nothing stops you from using Ada. For lower level code, it's hands-down my favourite. Picking up Ada taught me a lot about programming, despite my experience with many other languages. There's something about its design that just clarifies concepts.
MS-DOS was mostly x86 assembly, Classic MacOS was a mix of 68k assembly and Pascal, CP/M was written in PL/M, UCSD P-System was Pascal, and this leaves out all of the OS Options for the Apple II - none of which were written in C. I'm hard pressed to identify a PC OS from that time period that was written in C, other than something Unix derived (and even sometimes the unix derived things were not C, Domain/OS for example was in Pascal).
If we leave the PC space, it gets even less true - TOPS10/20 not C, RSX-11 not C, VMS also not C - and I can keep going from there - the only OS from that time period that I can point at that was written in C is UNIX.
I'd actually argue that C/C++ were not enshrined as the de facto systems programming languages until the early '90s - by that time Ada had lost for reasons (1) and (2) that you noted.
What would you recommend for getting started with it? Looks like there's GNAT and then also GNAT Pro and then the whole SPARK subset, which one would be best for learning and toying around?
SPARK is best considered a separate language. It gives up some of the things that make Ada great in exchange for other guarantees that I'm sure are useful in extreme cases, but not for playing around.
One thing to be aware of is that GNAT is part of GCC:
AdaCore is the primary developer of GNAT, SPARK, and related tools:
https://blog.adacore.com/a-new-era-for-ada-spark-open-source...
There are other Ada compilers as well:
https://forum.ada-lang.io/t/updated-list-of-ada-compilers/10...
> Use three spaces as the basic unit of indentation for nesting.
https://en.wikibooks.org/wiki/Ada_Style_Guide/Source_Code_Pr...
https://github.com/AdaCore/gnatformat
I had forgotten about the three spaces for indentation. Yuck.
They were written in “not Ada”; the original OS for the Mac was written in assembly and Pascal.
That said, Ada also has features that make C-style dynamic allocation less common. Ada does not have pointers but access types, and these are scoped like anything else. That means references cannot leak, and it is safer to allocate things statically, or in memory pools.
I've heard of enough Cobol and Fortran jobs existing, and Lisp continues to exist in some form or other, but Algol really does seem dead. I remember someone telling me about an Algol codebase that was decommissioned in 2005 and that seemed like a very late death for an Algol codebase.
Unisys still actively maintains their MCP mainframe operating system, which is written in an Algol superset (ESPOL/NEWP), and comes with a maintained Algol compiler - https://public.support.unisys.com/aseries/docs/ClearPath-MCP... - and they continue to add new features to Algol (even if minor)
So, no, Algol isn’t dead. Getting close but still not quite there. There are better candidates for dead languages than Algol, e.g. HAL/S (programming language used for the Space Shuttle flight software)
(In contrast, Lisp retains some unique ideas that have not been adopted by other languages, so it survives by a slim margin.)
It used to be that things like GC, REPL, flexible syntax, the cond form etc. made Lisp unique, but these days the list is down mainly to homoiconicity (and an amazing hacker culture).
Replacing that is a very hard problem, as thousands and thousands of (abused and overloaded) integrations are layered in several strata around it, relying each night on exact behaviour in the core, warts and all.
As for Smalltalk, I know at least one company around here still running some legacy, but still maintained afaik, code on it ( https://mediagenix.tv ).
The things it got wrong were mostly in its having a rigorous mathematical definition (syntax and semantics) that was almost unreadable by humans ... and its use of two character sets (this was in the days of cards) rather than reserved words
Fairly straight-forward once you've learnt the character set.
See here for details: https://aplwiki.com/wiki/Typing_glyphs
I feel that the article should have made this a lot more clear - as so many people code along the APL -> Matlab / R (via S) -> NumPy family tree.
I'm not sure what qualifies as dead. Prolog is still around although as a small and specific community, perhaps comparable in size to the APL community at least within an order of magnitude.
For some reason I remember an odd feature of PL/1: Areas and offsets. If I am remembering correctly, you could allocate structures in an area and reference them by offset within that area. That stuck in my mind for some reason, but I never found a reason to use it. It struck me as a neat way to persist pointer-based data structures. And I don't remember seeing the idea in other languages.
Maybe the reason it stayed with me is that I worked on Object Design's ObjectStore. We had a much more elegant and powerful way of persisting pointer-based structures, but an area/offset idea could have given users some of the capabilities we provided right in the language.
I believe it also starts to creep into things like C#.
>> An area is a region in which space for based variables can be allocated. Areas can be cleared of their allocations in a single operation, thus allowing for wholesale freeing. Moreover, areas can be moved from one place to another by means of assignment to area variables, or through input-output operations.

>> Based variables are useful in creating linked data structures, and also have applications in record input-output. A based variable does not have any storage of its own; instead, the declaration acts as a template and describes a generation of storage.

http://www.iron-spring.com/abrahams.pdf p. 19, 74
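To give a feel for the area/offset idea, here is a rough Python sketch (not PL/I, and the Arena class and record layout are made up for illustration): references are stored as offsets into one relocatable block, so the whole structure can be copied, or written out and read back, without any pointer fix-up.

    import struct

    class Arena:
        """A crude 'area': records live in one bytearray and refer to
        each other by offset, never by absolute address."""
        def __init__(self):
            self.buf = bytearray()

        def alloc_node(self, value, next_offset):
            # each node: 4-byte value, 4-byte offset of the next node (-1 = end)
            off = len(self.buf)
            self.buf += struct.pack("<ii", value, next_offset)
            return off

        def node(self, off):
            return struct.unpack_from("<ii", self.buf, off)

    a = Arena()
    head = a.alloc_node(30, -1)
    head = a.alloc_node(20, head)
    head = a.alloc_node(10, head)

    # The linked list survives a byte-for-byte copy (or a round trip to disk)
    # because nothing in it is an absolute pointer:
    copy = Arena()
    copy.buf = bytearray(a.buf)

    off = head
    while off != -1:
        value, off = copy.node(off)
        print(value)        # 10, 20, 30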
Freepascal [1] is up and running, targeting a lot of platforms: x86-16, x86-32, AMD64, RISC-V (32/64), ARM, AArch64, AVR, JVM, Javascript...
... and operating systems: Windows, Linux, MacOS, iOS, web and others.
I don't know an easier way to build multiplatform desktop applications other than Freepascal/Lazarus. I mean, real desktop apps, not that Electron bullsh*.
(There are a few other threads with a smaller number of comments.)
"COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP."
Lisp isn't as widely used as, say, Python, but it's still something a lot of people touch every single day.
What are people's thoughts about it?
I know a little about its history, and had tried it out a bit, some years ago, via the EiffelStudio trial edition, IIRC. I had also read a good chunk of Bertrand Meyer's book, Object Oriented Software Construction. Thought the book was good.
Unfortunately it has no mind-share.
Old justification: https://mail.python.org/pipermail/tutor/2003-October/025932....
>Nothing is really private in python. No class or class instance can keep you away from all what's inside (this makes introspection possible and powerful). Python trusts you. It says "hey, if you want to go poking around in dark places, I'm gonna trust that you've got a good reason and you're not making trouble."
>After all, we're all consenting adults here.
The difference is that in Smalltalk everything is a message. Operators are messages to an object. Even things we commonly assume to be control structures in other languages, like if or while (or rather ifTrue:, whileTrue:), are messages. Python is a lot less "pure", but so are all commonly used OO languages.
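You can see the "less pure" part directly in Python: operators do get dispatched as method calls on an object (pretty message-like), but control flow does not. A quick illustration:

    x = 3

    # An operator is effectively a message send to the left operand:
    print(x + 4)            # 7
    print(x.__add__(4))     # 7 -- the same thing, spelled as a method call

    # But `if` is plain syntax, not a message. In Smalltalk you send
    # ifTrue:/ifFalse: to a Boolean object; Python booleans have no such method.
    if x > 2:
        print("big")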
We sometimes think that Smalltalk is “true” OOP and things like Java and Python aren’t “real” OOP, but that’s not true.
+1 for basic, first used in gradeschool.
+1 for pascal (highschool)
+1 for lisp (compsci 101)
Java has virtually nothing in common with Smalltalk, other than in the most superficial way (things called objects exist, that work nothing alike). The closest thing to Smalltalk in serious use today is Ruby, which hews very close to Smalltalk's philosophy of message passing, though it does abandon the idea of a holistic programmable environment.
If one considers the actor model to be a natural evolution of Smalltalk's original idea of message-oriented, dynamic objects, then the BEAM might be Smalltalk's natural successor (Erlang, Elixir, Gleam, etc). Genservers are effectively isolated green threads that communicate solely by sending one another messages, which strikes me as very close to the spirit of Smalltalk's objects.
Most of the uniquely Perly things (topicalisation, autovivification, regexes in the language syntax, context sensitivity) haven't been picked up by other languages.
The only really Perly thing that was picked up elsewhere is CPAN, but that's not part of Perl-the-programming-language but Perl-the-community.
(Oh I guess PHP picked up sigils from Perl but misunderstood them and that was the end of that.)
In terms of direct language impact, Ruby code rarely shows it these days, but Ruby was essentially designed to make it easy to migrate from the Perl way of doing things.
This is when I started professionally, and we were asked to replace "slow, old Perl scripts". As a new entrant, I didn't ask many questions, but I also didn't see any of the replacements as improvements in any way. I think the # of devs left to take over messy Perl projects was shrinking.
As you might imagine, this job involved a lot of text processing. People still point to that as the arrow in Perl's quiver, but it seems especially quaint today since any language I'd reach for would blow it out of the water in terms of flexibility and ease of use.
But I thought maybe the end of the 00s was when RoR started showing up.
Mid-2000s I think I was learning PHP and the LAMP stack. Perl was already kind of old.
Awk is sold on pattern matching, and there are earlier technologies that do pattern-matching - ML, SNOBOL.
But awk's historic significance is something else: it was the embryonic scripting language. You could use it in an imperative manner, and in 1977 that showed a new path to interacting with a unix system. It allowed you to combine arithmetic, string manipulation, and limited forms of structured data processing in a single process without using a compiler.
Two language schools grew from imperative awk: (1) the scripting languages that expose convenient access to the filesystem and OS syscalls, like perl/pike/python/ruby; (2) the tool control languages like tcl/lua/io.
It may also have influenced shell programming. Note that awk was released before the Bourne shell.
That said, I don't know how many other languages explicitly have cited awk as an inspiration, which was the criterion for this list.
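To make the "embryonic scripting language" point concrete, here is roughly what awk's execution model looks like, sketched in Python: an implicit loop over input lines, pattern-action pairs, and arithmetic, strings and associative arrays all available in the actions. (The patterns and field positions here are invented for the example.)

    import re
    import sys
    import collections

    # Roughly the awk program:
    #   /ERROR/   { errors++ }
    #   NF >= 3   { total += $3 }
    #             { count[$1]++ }
    #   END       { print errors, total }

    errors = 0
    total = 0
    count = collections.defaultdict(int)

    for line in sys.stdin:                  # the implicit per-line loop
        fields = line.split()               # $1, $2, ... in awk terms
        if re.search(r"ERROR", line):       # /ERROR/ { errors++ }
            errors += 1
        if len(fields) >= 3 and fields[2].isdigit():
            total += int(fields[2])         # { total += $3 }
        if fields:
            count[fields[0]] += 1           # { count[$1]++ }

    print(errors, total, dict(count))       # END { ... }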
I often read answers to questions all over the internet where awk is part of the solution. Mainly serious programmers using BSD and Linux.
Unix gurus will recommend awk as a pattern matching and substitution tool.
But my comment was about awk the vanguard imperative scripting language. I don't know of anyone who recommends use of awk's imperative style over python in 2025.
As an exercise, I tried writing a simple roguelike in awk in an imperative style. Within twenty minutes, it felt obvious where perl came from.
Since F# was like the test bed for many features that got moved to C# once proven.