286 points by fuzztester 8 days ago | 114 comments
  • yellowapple6 days ago
    I've pretty much settled on Zig at this point, if only for how dead-simple it is to cross-compile for other hardware platforms. The process of compiling working code for oddball platforms (in my case the Nintendo 64) was way easier than I expected it to be.

    The only downside is the stdlib being as fast-moving of a target as it is. Right now I've had to put a pin in getting panic stack traces to work in my N64 code, because apparently the upcoming release changes a bunch of stuff around panic/stacktrace handling (and it had already changed quite a bit over the years even before these new changes).

    • archargelod5 days ago
      > how dead-simple it is to cross-compile for other hardware platforms

      The fact that zig can compile C code makes it useful for other languages too. I recently started using `zig cc` to cross-compile Nim for lots of different platforms within the same environment.

      It takes no time to set up and, honestly, works like magic.

      • Galanwe5 days ago
        > The fact that zig can compile C code makes it useful for other languages too

        Agree, C interop is IMHO the big feature of Zig. There are plenty of systems programming languages in 2025, but where Zig shines is its pragmatism: a single standalone binary containing compiler, libc, build system, code formatter and test runner for C and Zig.

        As of late though, I've been concerned by some "holy wars"/"ideological postures" the dev team has taken up, which IMHO depart from the original "let's be pragmatic" mantra.

        - There's a bunch of places where the stdlib just crashes on unreachable assertions, and that won't be fixed "because the kernel should have better error reporting".

        - There are a bunch of kernel syscalls which are just not possible to call "because C enums should not allow aliases"

        - etc

        I hope this trend fades away and the team gets back to a more pragmatic stance on these issues; nobody wants a systems programming language that plays the programming police.

        Otherwise, C3 looks promising as well (though not as nice as Zig IMHO), but currently it's a bit too barebones for my taste. There's no stable LSP, no nvim plugin, etc.

      • csdvrx5 days ago
        I like Perl mostly because it's poetic (the code is super nice to read, with variable types standing out thanks to sigils), but another core strength is how very fast and light it is.

        Instead of "cross-compiling" or just running a native perl interpreter (there's one for about every platform!), I prefer how Actually Portable Executables make Perl multiplatform with just 1 binary asset running everywhere!

        I wanted to write a webserver processing CGI to learn more about the "old school web", so I wrote https://github.com/csdvrx/PerlPleBean and the simplicity of just downloading and running the .com on anything is very nice

        I'm now trying to do the same in Python3, but it's not as fun - and I'm not yet at the part where I will try to safely run Python code within the Python webserver, either through restrictedpython or ast.parse(), ast.walk(), eval(compile()) ...
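        For anyone curious about the ast.parse()/ast.walk()/eval(compile()) route mentioned above, here is a minimal sketch of the idea. The allow/deny policy is purely illustrative, and AST filtering alone is not a real security boundary:

```python
import ast

# Illustrative policy: names and node types we refuse to evaluate.
# This is NOT a complete sandbox -- AST filtering alone is easy to escape.
BANNED_NAMES = {"eval", "exec", "open", "__import__"}
BANNED_NODES = (ast.Import, ast.ImportFrom, ast.Attribute)

def safe_eval(expr: str):
    tree = ast.parse(expr, mode="eval")
    for node in ast.walk(tree):
        if isinstance(node, BANNED_NODES):
            raise ValueError(f"disallowed node: {type(node).__name__}")
        if isinstance(node, ast.Name) and node.id in BANNED_NAMES:
            raise ValueError(f"disallowed name: {node.id}")
    # Empty __builtins__ so leftover names resolve to nothing at runtime.
    return eval(compile(tree, "<sandbox>", "eval"), {"__builtins__": {}})

print(safe_eval("2 + 3 * 4"))  # 14
```

        RestrictedPython goes much further than this (it rewrites the AST rather than just rejecting nodes), which is why it exists as a separate project.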

        • dwedge5 days ago
          I also choose Perl most of the time, but I think this is the first time I've ever heard anyone call it super nice to read
          • raffraffraff5 days ago
            I've heard it called a "write-only language"
            • pino9995 days ago
              They call it like that, but it depends on the programmer as always. The problem is that it is really flexible, more so than Python or JavaScript, so it gives you all the tools to shoot yourself in the foot and take away the leg with it.

              An example: you can rewrite the calling program in a module (https://metacpan.org/pod/Acme::Bleach or https://metacpan.org/release/DCONWAY/Lingua-Romana-Perligata...)

              While cool for jokes or serious DSLs, it may lead to difficult-to-understand code. (Nothing wrong with Damian Conway btw, I just remembered he used source filters in interesting ways.)

              • csdvrx5 days ago
                > They call it like that, but it depends on the programmer as always.

                There are different styles, but in general they are concise, and I like them.

                Perl uses various sigils to remain concise, while other languages take up a lot of room on the screen: too many letters in the usual function names, not enough sigils within the language.

                It's as if everything were in binary or hex instead of using the full range of ASCII: while technically possible, it may be harder to fit into your head.

                Python has one sub-style I dislike the most: using tabs for indentation, because of how much EXTRA room they use on the screen.

                It must not just be me, as there are solutions for coloring the spaces (I forked https://github.com/csdvrx/indent-rainbow to focus on black-and-white and using spaces instead of tabs)

                I use spaces to limit the issue, but I can't make Python less verbose.

                > it gives you all the tools to shoot yourself in the foot and take away the leg with it.

                Python isn't innocent either: I recently traced an issue where exit(0) wasn't working back to a threading problem and a bad use of atexit.
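                (The original bug isn't shown here, but the general failure mode is easy to demonstrate: atexit handlers only run during a normal interpreter shutdown, so anything that skips shutdown, like os._exit(), silently drops them. A small sketch:)

```python
import subprocess, sys, textwrap

def run(body: str) -> str:
    """Run a small script in a fresh interpreter and return its stdout."""
    return subprocess.run([sys.executable, "-c", textwrap.dedent(body)],
                          capture_output=True, text=True).stdout

# Normal interpreter shutdown: the atexit handler fires.
normal = run("""
    import atexit
    atexit.register(lambda: print("cleanup"))
    raise SystemExit(0)
""")

# os._exit() skips interpreter shutdown, so the handler never runs.
hard = run("""
    import atexit, os
    atexit.register(lambda: print("cleanup"))
    os._exit(0)
""")

print(repr(normal))  # 'cleanup\n'
print(repr(hard))    # ''
```

                A thread that never finishes can have a similar effect, since the interpreter waits for non-daemon threads before shutting down.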

                • d0mine5 days ago
                  > tabs for indentation

                  I don’t know a single Python project that does it. You can’t mix spaces and tabs for indentation.

                  4 spaces is the default for Python formatters like black, ruff (not sure whether it is configurable—never tried to change).

                  Big indent is a feature—deep nesting is a code smell.
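                  (The tokenizer enforces this: Python 3 rejects indentation whose meaning would depend on the tab width. A quick illustration:)

```python
# The first body line is indented with a tab, the second with eight spaces:
# equivalent if a tab is 8 columns, different otherwise -- so Python refuses it.
source = "def f():\n\tx = 1\n        y = 2\n"

try:
    compile(source, "<mixed>", "exec")
    result = "compiled"
except TabError:
    result = "TabError"

print(result)  # TabError
```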

                • worik5 days ago
                  > Python has one sub-style I dislike the most: using tabs for indentation, because how much EXTRA room they use on the screen.

                  Can you not adjust your tab stops?

                  I hate it too, because tabs look like spaces and they have a different syntactic meaning

                  • csdvrx5 days ago
                    I can, in vim it's simple. It just bothers me that it is the default and I have to take care of tabs with the rainbow, or a toggle shortcut like:

                    function TabCollapse_Toggle() abort
                        if &tabstop == 1
                            set tabstop=8
                        else
                            set tabstop=1
                        endif
                    endfunction
                    
                    BTW if you hate tabs looking like other characters, and other invisible characters too (like spaces at the end of a line, non-breaking spaces...), I have a solution in CuteVim (https://github.com/csdvrx/CuteVim : just run the portable executable) where I mapped it by default to a Fxx key

                    If you already use vim, here's the relevant part: assuming your Shift-F11 is free, add to your vimrc:

                    " Default is off, `se list` to turn on and `se nolist` to turn off

                    " Traditional with ISO-8859-1:

                    "set listchars=tab:»·space:_,trail:·,eol:¶

                    " Or cuter with unicodes:

                    set listchars=tab:↹⇥,space:_,nbsp:␣,trail:•,extends:⟩,precedes:⟨,eol:↲

                    set showbreak=↪

                    inoremap <silent> <S-F11> <Esc>:set list!<CR>

                    noremap <silent> <S-F11> :set list!<CR>

                    Shift-F11 will then become a toggle, to show you tabs: you will see ↹ where the tab starts, and ⇥ for how long it is

                  • fuzztester5 days ago
                    >Can you not adjust your tab stops?

                    I've used this in vim for years:

                    :se expandtab tabstop=4 shiftwidth=4

                    (those can be abbreviated, check docs)

                    Then:

                    I use only tabs for indentation.

                    All the tabs get expanded to four spaces each.

                    Indenting and unindenting lines or blocks of lines, moves them by four spaces.

                    Never had a problem.

                    Maybe there are more fancy ways, but this works fine for me.

                    • worik4 days ago
                      > I use only tabs for indentation.

                      > All the tabs get expanded to four spaces each.

                      Then Python will not work (?)

                      • fuzztester4 days ago
                        Sure it will work. Even before I discovered this method (which is simple), I was using four spaces typed manually, for indentation in Python, for years.

                        Maybe it is you who "will not work".

                        Did you try it before commenting? I already said it works fine.

                    • fuzztester4 days ago
                      e.g. for tabstop, use ts, and for shiftwidth, use sw.
            • creer3 days ago
              > I've heard it called a "write-only language"

              A frequent opinion, and an easy way to fit in for people who never bothered to learn the language. Which is all the sadder given that Perl is super easy to learn (one layer at a time).

            • rurban5 days ago
              It's more readable than C++, C or Rust though
              • tmountain5 days ago
                100% depends on how it’s written. It gives a ton of flexibility regarding incorporating “magic variables” which can lead to incredibly abstruse code. The language motto is “there’s more than one way to do it”, and that’s implemented to a fault.
                • lief795 days ago
                  Paraphrased quote from one of my professors years ago:

                  Writing readable perl is easy, just code it like it's LISP.

                  Granted, he was working with it in AI/bioinformatics.

                  One of my classmates who moved into the IT/management side of things historically got much quicker responses from the dev team whenever he volunteered to code something, as he was always going to do it in perl.

                • rurban5 days ago
                  Same with C or C++. Only recently have some examples of well-written C++ code come up. But most old corporate, Microsoft, or Stroustrup code is just horrible, worse than hard-core Perl nonsense.
                  • bluGill5 days ago
                    Real Programmers can write FORTRAN in any language.

                    Making a large code base easy to read is very hard. People often work on tiny code bases and talk about how easy it is to read, not understanding that they are comparing something with a couple thousand lines of code to something with tens of millions.

                  • dataflow5 days ago
                    > Only recently there came up some examples of well-written C++ code.

                    I'm not sure what you're referring to (link?) but note that whether C++ code is good or bad can depend strongly on the tooling. Certain coding patterns can be fantastic when your tooling can adequately detect their misuse, and awful when it doesn't. Which means sometimes you can't just look at code and tell whether it is good or bad.

                    > But most of old cooperate, Microsoft or Stroustrup code is just horrible, worse than hard-core perl nonsense.

                    I got the impression Microsoft's C code was always pretty good; their C++ not so much a decade ago. Not sure how their C++ is now.

              • jsrcout5 days ago
                Depends on who wrote it. My own Perl code, and plenty I've seen, is extremely clean and readable; sadly, a lot isn't. I'm sure clean and readable C++ exists, but the stuff I have to work with - big codebases with tons of history - is not. "Terrifying" would be more apt in most cases.
                • aaronbaugher4 days ago
                  The Perl code I write today is much cleaner and easier to follow than what I wrote 30 years ago. I hope that's true of my programs in other languages too.
            • fuzztester5 days ago
              Also called executable line noise, like Python is called executable pseudocode.

              But I like Perl (and other languages) too.

              Variety is the spice of life.

          • johnisgood5 days ago
            My Perl code is super nice to read. :D
            • chgs5 days ago
              My perl code isn’t. But neither is my Python, or Java, or C, or JavaScript or bash or anything.

              Clearly the problem is all these languages, and not me.

              • johnisgood5 days ago
                Could be... or readable by who, who do not know these languages at all? :P
          • davidwritesbugs5 days ago
            The joke used to be that Perl code looked like an explosion in an apostrophe factory.
            • dwedge5 days ago
              I once accidentally piped an SSL certificate into Perl and got 40+ warnings before it realised it wasn't Perl. I'm not joking.
            • csdvrx5 days ago
              It may look chaotic ("explosion") when you don't see the structure.

              When you do, you appreciate the density of information.

              When I read perl it's like I read a poem: to take a simple example, 'while/until' instead of 'while/while not' creates more beautiful code

            • perlcommunity4 days ago
              Larry Wall has said he likes parentheses in his LISP like he likes fingernails in his oatmeal. xD Hence Perl's ability to forgo parens in many cases.
            • vram222 days ago
              Or like an explosion in a sigil pottery :)
            • antod5 days ago
              Or executable line noise (I think I heard that one from slashdot)
          • IshKebab5 days ago
            Maybe he's used to K?
        • yellowapple4 days ago
          Perl was my first "real" language, i.e. the first one that actually "clicked" for me. Still holds a soft spot in my heart, even though I don't use it much these days. It's one of the few languages (along with Ruby, Erlang, and Zig) that I feel have made me a better programmer by learning them.
        • ajsnigrutin5 days ago
          I like Perl because the code I wrote 20 years ago still works without issues.
        • phatskat4 days ago
          Just wanted to say that I really dig your enthusiasm! I read most of PerlPleBean’s README and was smiling the whole time - APE was so exciting to read about when it came out, and your project may just inspire me to look more into what it can do!
        • kamma44345 days ago
          Importing stuff from CPAN used to be a nightmare, but I admit I have a soft spot in my heart for Perl.
    • Galanwe5 days ago
      > The only downside is the stdlib being as fast-moving of a target as it is.

      Ah that's an interesting take, my opinion is that the stdlib doesn't move fast enough.

      In its current state it's pretty broken, most of the "process", "os" and "posix" modules are either straight up raising unreachable in normal scenarios, or simply badly designed. I would like the stdlib to be much more fast moving and fix all these issues, but I had the impression most work on it is frozen until 0.15 or 0.16, after incremental compilation is done.

      • brodo5 days ago
        You are right, the stdlib is not the highest priority right now. There are major improvements coming in 0.14 though. The new default allocator for example. I think the problem you describe can be solved by having more contributors focussing on the standard library. With the compiler, there are bottlenecks which make onboarding new people hard. This is a smaller problem in stdlib.
        • Galanwe5 days ago
          > I think the problem you describe can be solved by having more contributors focussing on the standard library.

          I don't think so. My impression is that stdlib improvements are deliberately frozen for now, not because of a lack of contributors but because of a lack of a clear plan as to what the stdlib should look like. There are a number of issues and PRs from people willing to contribute to the stdlib that are stalled.

          That's not to say that it's bad per se; "we don't have a plan for now and don't want people to commit time to an unclear target" is a perfectly OK answer.

    • epolanski5 days ago
      I'm picking up zig as my first system programming language myself and I love it.

      Sadly the job market looks dead

      • azthecx5 days ago
        It's such a new language, not even at 1.0.0 yet. You won't really find companies willing to bet their livelihoods on it at such an early stage.

        You can make your own though :)

    • nyjah6 days ago
      What N64 code are you working on? I am intrigued.
    • dvdbloc5 days ago
      What’re you doing with Zig and N64? Sounds awesome.
  • creakingstairs6 days ago
    I've been using Odin [1] for my hobby game development and I've been liking it a lot. Feels like a more ergonomic C.

    Things I like:

    - Vendor libraries like Raylib and MicroUI make it easy to get started

    - I can pass around memory allocators and loggers implicitly using context, or explicitly if I need to.

    - natively supports vector math and swizzling

    - error handling with `or_else` and `or_return`

    Things I don't like:

    - Namespacing is a bit annoying. The convention is to prefix the procedures, but I don't like how they look. It really isn't a big issue.

    Have a quick read of the overview and if you are still interested, I highly recommend the book 'Understanding the Odin Programming Language' by Karl Zylinski [2]

    [1] https://odin-lang.org/docs/overview/

    [2] https://odinbook.com/

    • johnisgood5 days ago
      I like Odin, but the creator is not too motivational (or rather, actively un-motivational)[1]. I still use it nonetheless for some of my own stuff, for now.

      Regardless, I do recommend people to try it out. I use Linux and OpenBSD, too, despite Linus and Theo. :)

      [1] The reason for why I think this can be found in their pull requests, but it's been some time I think.

      • amjoshuamichael5 days ago
        What do you mean by "motivational?" Are you talking about how the creator is against adding new features to the language? I actually think that's perfectly fine. One of my favorite things about Odin is the simplicity; the entire language and all of its rules can be understood by reading the Odin overview document. I'm actually thrilled to have a creator that doesn't want to bloat the language.
        • johnisgood5 days ago
          No, suppose you make a contribution to a project and its creator belittles you instead of providing constructive criticism, for example. The contribution in question was not a new feature for the language; it was more like a third-party library.

          It has nothing to do with adding new features. I agree with you, I do not want the language to be bloated, nor do I want new features blindly added. I prefer simplicity, too.

          FWIW you can see him losing his "cool" on Discord, too, at times.

          • vhantz5 days ago
            You should back those claims with something more than handwaving.
            • johnisgood5 days ago
              I am not trying to make him look bad. I just simply stated my reasons for hesitancy when it comes to contributing to the language. At any rate, his messages are available on Discord (unless deleted) and the pull requests are out there too, on GitHub.

              I do not intend to have a collection of all the times he lost his cool.

              • vhantz4 days ago
                Well, with your unsubstantiated claims, you are making him look bad, regardless of your intentions.
                • johnisgood4 days ago
                  Yeah, but it would be even worse if I collected everything about him, that would even lean towards obsession.
      • xxami5 days ago
        I remember being turned off by it for the same reasons; in particular there were some fairly harsh comments directed at V's developer, which felt a bit dog-piley to me. The drama's long dead now. It is a great language though. Closest so far to a language that would kill C for me, but not quite =D
        • johnisgood5 days ago
          I missed out on this! I remember something, had no idea Odin was involved.
  • chris_armstrong5 days ago
    OCaml

    The compiler is very fast, even over large codebases.

    Mostly trying to bring AWS tooling to the platform[1], or experimenting with cross-compilation[2] using another less well known systems language, zig.

    [1] https://github.com/chris-armstrong/smaws/ [2] https://github.com/chris-armstrong/opam-cross-lambda

    • mbac327685 days ago
      I've used a lot of programming languages and the kind of groove you can get into with OCaml is hard to match. You can just dive into an enormous, unfamiliar codebase and make changes to it with so much more confidence. But while it's reasonably fast, it's also higher level than Rust so you don't have to struggle quite so much with forms like `Arc<Mutex<HashMap<String, Box<dyn Processor + Send + Sync>>>>` everywhere.

      Re: AWS tooling, have you seen https://github.com/solvuu/awsm ?

      It generates code for all 300+ AWS services and produces both Async and Lwt forms. Should be fairly extensible to Eio.

      I worked on this. Let me know if you want to tag team.

    • IshKebab5 days ago
      I want to like OCaml but OPAM is just so bad... and tooling is super important (it's one of the reasons Go is popular at all). Windows support is also an afterthought. There's no native debugger as far as I can tell. This is before you even get to the language, which definitely has its own big flaws (e.g. the lack of native 64-bit integers that MrMcCall mentioned).

      The syntax is also not very friendly IMO. It's a shame because it has a lot of great ideas and a nice type system without getting all monad in your face. I think with better tooling and friendlier syntax it could have been a lot more popular. Too late for that though; it's going to stay consigned to Jane Street and maybe some compilers. Everyone else will use Rust and deal with the much worse compile time.

      • Taikonerd5 days ago
        > The syntax is also not very friendly IMO.

        Very true. There's an alternate syntax for OCaml called "ReasonML" that looks much more, uh, reasonable: https://reasonml.github.io/

        • Bilirubino18 hours ago
          The OCaml syntax was discussed a long time ago between the developers and the whole community, and the agreement was that the community is happy with the current/original syntax. ReasonML was created for those developers more familiar with JavaScript, but it was not very successful in attracting new developers, as they usually look more at the semantics of the language along with the syntax (and that is where OCaml's type system shines). Strictly speaking, there is a long list of ML-family languages that share many properties of OCaml's syntax. However, what is a ‘reasonable’ syntax is open to debate. JavaScript and Python were not mainstream languages when OCaml was developed, and it made much more sense to create a syntax in line with the ML family of powerful languages available at the time. Once you program a bit in OCaml, syntax is not a problem; learning to program in a functional paradigm and getting the most out of it is the real challenge.
      • mbac327685 days ago
        > (e.g. the lack of native 64-bit integers that MrMacCall mentioned.

        They exist; I think you just mean `int` is 63-bit and you need to use the specialized `Int64.t` operators for full precision.

        • MrMcCall5 days ago
          How can you access the full 64 bits if "one bit is reserved for the OCaml runtime"? (the link is in my original post's thread)
          • ravi-delia5 days ago
            The usual int type is 63 bits. You can get a full 64 bit int, it just isn't the default.
            • MrMcCall5 days ago
              The docs say, "one bit is reserved for the OCaml runtime", so doesn't that mean that one of the bits (likely the high bit) is unavailable for the programmer's use?

              I mean, I understand "reserved" to mean either "you can't depend upon it if you use it", or "it will break the runtime if you use it".

              • ravi-delia5 days ago
                So the "one bit" you refer to is what makes the standard int 63 bits rather than 64. If you could do things with it, it would indeed break the runtime: that's what tells it that you're working with an int rather than a pointer. But full, real, 64-bit integers are available in the base language; the same goes for 32.
                • MrMcCall4 days ago
                  And that means that the OCaml runtime is not compatible with systems-level programming.

                  If something is "available", it should mean that it can be used to its full capacity. One of those bits is definitely not available.

                  • roetlich4 days ago
                    I think you need to re-read some of the comments you are replying to. There is a 64 bit int type: https://ocaml.org/manual/5.3/api/Int64.html You can use all 64 bits. There are also other int types, with different amounts of bits. For example, 32 bit: https://ocaml.org/manual/5.3/api/Int32.html No one will stop you. You can use all the bits you want. Just use the specific int type you want.
                    • MrMcCall3 days ago
                      ravi-delia explained that an OCaml int is in fact different from either Int32 or Int64, because an 'int' sacrifices one of its bits to the OCaml runtime. Int32 and Int64 are treated completely differently and are library definitions, bolted onto the OCaml runtime.

                      That is a runtime system not suitable for systems-level programming.

                      My C experience gave me a fundamental misunderstanding, because there an int is always derived from either a 32- or 64-bit int, depending on the architecture.

                      OCaml is architected differently. I imagine the purpose was to keep the programs mostly working the same across processor architecture sizes.

                      I imagine this fundamental difference between OCaml's native int and these more specific Ints is why there are open issues in the library that I'm sure the int does not have.

                      Regardless, no one should be using OCaml for systems-level programming.

                      Thanks for helping me get to the heart of the issue.

                      • Bilirubino2 days ago
                        The situation is that OCaml is giving you all the options:

                        (a) int has 31 bits on 32-bit architectures and 63 on 64-bit architectures (which speeds up some operations)

                        (b) the standard library also provides Int32 and Int64 modules, which support platform-independent operations on 32- and 64-bit signed integers.

                        In other words: int is different but you always have standard Int32 and Int64 in case you need them.

                        It seems, therefore, that its suitability for systems-level programming should not be decided by this (although the fact that it is a garbage-collected language can be important depending on the case; note that its garbage collector has proved to be one of the fastest in the comparisons and evaluations done by the Koka language team of developers).

                  • ravi-delia4 days ago
                    Ok, running this by you one more time. There is a type called "int" in the language. This is a 63-bit signed integer on 64-bit machines, and a 31-bit integer on 32-bit machines. It is stored in 64 bits (or 32), but it's a 63-bit signed integer, because one of the bits is used in the runtime. There is also a 64 bit integer, called "Int64". It has 64 bits, which is why I call it a 64-bit integer rather than a 63-bit integer. An "int" is a 63-bit integer, which is why I call it a 63-bit integer rather than a 64-bit integer.
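                    To make the ranges concrete (plain Python arithmetic here, since Python's integers are unbounded; the OCaml representation details are as described above):

```python
# OCaml's `int` on a 64-bit platform is a 63-bit signed integer: one bit
# of the machine word is the runtime's immediate-vs-pointer tag.
# Int64.t is a full 64-bit signed integer (boxed, hence the extra memory).
max_int63 = 2**62 - 1   # OCaml's max_int on a 64-bit machine
max_int64 = 2**63 - 1   # Int64.max_int

print(max_int63)  # 4611686018427387903
print(max_int64)  # 9223372036854775807
```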
                    • MrMcCall3 days ago
                      So an int has nothing to do with an Int32 or Int64.

                      Thanks for your patient elucidation.

                      This means the semantics for Int32 and Int64 are COMPLETELY different than that of an int. My problem is that I come from the C world, where an int is simply derived from either a 32- or 64-bit integer, depending on the target architecture.

                      OCaml's runtime is not a system designed for systems-level programming.

                      Thanks again.

                      Now I know why the F# guys rewrote OCaml's fundamental int types from the get-go.

                      • Bilirubino2 days ago
                        The reason the F# guys did things differently from OCaml is not systems-level programming, but that F# is a language designed for the .NET ecosystem, which imposes specific type constraints. F# was not specifically designed for systems-level programming.

                        Again, the semantics of int is different, but the semantics of Int32 and Int64 in OCaml is the same/standard. So you have 3 types: int, Int32 and Int64, and it is a statically typed language.

                      • ravi-delia3 days ago
                        I mean I guess you could say they have different semantics. They're just different types, int and Int64 aren't any more different from each other than Int64 and Int32. You can treat all of them exactly the same, just like how you have ints and longs and shorts in C and they all have the same interface.

                        Regardless, I don't think C's "probably 32 bit" non-guarantee is the make or break feature that makes it a systems language. If I care about the exact size of an integer in C I'm not going to use an int- I'm going to use explicit types from stdint. Rust makes that mandatory, and it's probably the right call. OCaml isn't really what I'd use for a systems language, but that's because it has no control over memory layout and is garbage collected. The fact that it offers a 63-bit integer doesn't really come into it.

                        • MrMcCall3 days ago
                          > int and Int64 aren't any more different from each other than Int64 and Int32

                          They are, though. Int64 and Int32 only differ in bit length and are in formats native to the host microprocessor. int has one of its bits "reserved" for the OCaml runtime, but Int32 has no such overhead.

                          > The fact that it offers a 63-bit integer doesn't really come into it.

                          It does if you're interoperating with an OS's ABI, though, or writing a kernel driver.

                          But you're right: there are a host of other reasons that OCaml shouldn't even have been brought up in this thread ;-)

                          Peace be with you, friend. Thanks for so generously sharing your expertise.

              • lapinot5 days ago
                • MrMcCall3 days ago
                  I see, now. From that doc:

                  > Performance notice: values of type int64 occupy more memory space than values of type int

                  I just couldn't even imagine that a 64-bit int would require MORE memory than an int that is one bit less (or 33 bits less if on a 32-bit architecture).

                  It really makes absolutely no sense discussing OCaml as a possible systems-level programming language.

                  • MrMcCall3 days ago
                    Sorry, I should have said that an Int64 shouldn't take more memory on a 64-bit system where the default int is 63 bits, because of the "reserved bit".

                    It was early this morning.

      • dgan5 days ago
        Why opam is bad? Compared to what? Could you elaborate
        • IshKebab5 days ago
          1. I've found it to be extremely buggy, often in confusing ways. E.g. there was a bug where it couldn't find `curl` if you were in more than 32 Linux groups.

          2. It has some kind of pinning system that is completely incomprehensible. For example you can do `opam install .`, which works fine, and then `git switch some_other_branch; opam install .` and it will actually still install the old branch?? Honestly I've never figured out what on earth it's trying to do but me and my colleagues have had constant issues with it.

          > Compared to what?

          Compared to good tooling like Cargo and Go and NPM and uv (if you give it some slack for having to deal with Python).

          It's better than Pip, but that doesn't take much.

          • Bilirubino18 hours ago
            In my case I have not found opam buggy at all, and I have never found it confusing, though that last point may be personal taste. The bug you describe is something I have never encountered with opam on Linux or macOS, and I am sure that if you report it the developers will look into it.

            As for point 2, I don't understand the issue. There is `opam switch`, which works perfectly for me, no issues at all. As with any other tool, it is better to read the manual to understand how it works.

            Cargo and opam are not really comparable; perhaps the next generation of dune will be, but at the moment it makes no sense to compare two utilities that are so different. Comparing with pip, Julia's package manager, etc. is fine. Personally, I like opam more than npm and pip.

          • dgan3 days ago
            Interesting, thanks. I have been using opam, but since I am all alone and by myself, I never hit the cases you mentioned.
      • fuzztester5 days ago
        >The syntax is also not very friendly IMO.

        Why do you think that the syntax is not very friendly?

        Not saying you are wrong, just interested to know.

      • satvikpendem4 days ago
        Have you tried esy?
    • fuzztester5 days ago
      I've read some part of the book Real World OCaml, by Yaron Minsky and Anil Madhavapeddy.

      https://dev.realworldocaml.org/

      I also saw this book OCaml from the Very Beginning by John Whitington.

      https://ocaml-book.com/

      I have not read that one yet. But I know about the author, from having come across his PDF tools written in OCaml, called CamlPDF, earlier.

      https://github.com/johnwhitington/camlpdf

      >CamlPDF is an OCaml library for reading, writing and modifying PDF files. It is the basis of the "CPDF" command line tool and C/C++/Java/Python/.NET/JavaScript API, which is available at http://www.coherentpdf.com/.

    • davidwritesbugs5 days ago
      My problem with OCaml is just that there is no stepping debugger for VScode. I'd use it except for that.
      • worik5 days ago
        Yes

        Symbolic debuggers seem to be going out of fashion

    • MrMcCall5 days ago
      It's my understanding that OCaml does not allow its programs to specify the size and signedness of its ints, so no 16-bit unsigned, 32-bit signed, etc...

      Being a huge fan of F# v2 who has ditched all MS products, I didn't think OCaml was able to be systems-level because its integer vars can't be precisely specified.

      I'd love to know if I'm wrong about this. Anyone?

      • Bilirubino18 hours ago
        The modules Int64 and Int32 are part of the OCaml standard library. You mentioned in your comments that Dune or Jane Street is needed for this functionality, but they are part of the standard library and of core OCaml development. You can even use the Bigarray library with these types (int8, int16, signed, unsigned), and you have platform-native signed integers (32 bits on 32-bit architectures, 64 bits on 64-bit architectures) via Bigarray.nativeint_elt, all in the standard library.

        You also mention that Int32 and Int64 are recent; however, these modules were already part of OCaml in the 4.x versions of the compiler and standard library (we are now at 5.3).

        Note that in OCaml you can use C libraries, and it is quite common to handle Int32, Int64, signed types, etc.

      • cmrx645 days ago
        You’re wrong, not sure where you got that conception but the int32/64 distinction is in the core language, with numerous libraries (eg stdint, integers) providing the full spectrum.
        • MrMcCall5 days ago
          Thanks. They're not among the basic data types, but you are correct: they are available in the stdint package, which has a publication date of Oct 19, 2022. It can be found here:

          > https://opam.ocaml.org/packages/stdint/

          It's been a while since I investigated OCaml, so I guess this is a recent addition. It is obviously not part of the standard integer data types (and, therefore, the standard language), which not only lack unsigned variants and offer only Int32 and Int64, but where "one bit is reserved for OCaml's runtime operation".

          The stdint package also depends on Jane Street's "Dune", which they call a "Fast, portable, and opinionated build system". I don't need or want any of its capabilities.

          As well, the issues page for stdint has a ton of open issues more than a year old, so, as I understand it, OCaml does not, like F#, have all sizes and signedness combinations of ints available in its fundamental language. Such a language is simply not a good fit for systems-level programming, where bit-banging is essential. Such low-level int handling is simply not a part of the language, however much it may be able to be bolted on.

          I just want to install a programming language, with its base compiler and libraries and preferably with man pages, open some files in vi, compile, correct, and run. That is my requirement for a "systems-level" language.

          I would never in my life consider OCaml with opam and Dune for building systems-level software. I wish it could, but it's not copacetic for the task, whose sole purpose is to produce clean, simple, understandable binaries.

          Thanks for helping me understand the situation.

          • Bilirubino17 hours ago
            As I commented above, Int32 and Int64 have been part of the standard library since at least the 4.x OCaml versions (we are now at 5.3), so they are normally available when you install any distribution of OCaml. Note that there is also a type named nativeint (which, I think, is the kind of int you were looking for throughout your comments and posts), and it is part of the standard library. In summary:

            Int type (the one you dislike for systems programming)

            Int32 type (part of the standard library, one of those you were looking for)

            Int64 type (part of the standard library, one of those you were looking for)

            Nativeint (part of the standard library, maybe the one you were looking for)

            The stdint library is another option, which can be convenient in some cases, but you don't need it for Int32, Int64, or Nativeint.

          • thedufer5 days ago
            > which has a pub date from Oct 19, 2022

            I think you're misinterpreting this. That's just the date the most recent version of the library was published. The library is something like 15 years old.

            > the standard integer data types (and, therefore, the standard language), that not only have no signedness

            I'm not sure what you mean by this - they're signed integers. Maybe you just mean that there aren't unsigned ints in the stdlib?

            > and only have Int32 and Int64, but have "one bit is reserved for OCaml's runtime operation".

            The "one bit is reserved" is only true for the `int` type (which varies in size depending on the runtime between 31 and 63 bits). Int32 and Int64 really are normal 32- and 64-bit ints. The trade-off is that they're boxed (although IIRC there is work being done to unbox them) so you pay some extra indirection to use them.

            > The stdint package also depends on Jane Street's "Dune", which they call a "Fast, portable, and opinionated build system". I don't need or want or need any of its capabilities.

            Most packages are moving this way. Building OCaml without a proper build system is a massive pain and completely inscrutable to most people; Dune is a clear step forward. You're free to write custom makefiles all the time for your own code, but most people avoid that.

            • MrMcCall5 days ago
              > The library is something like 15 years old.

              It's not clear from the docs, but, yeah, I suspected that might be the case. Thanks.

              > I'm not sure what you mean by this - they're signed integers. Maybe you just mean that there aren't unsigned ints in the stdlib?

              Yes, that's what I mean. And doesn't that mean that it's fully unsuitable for systems programming, as this entire topic is focused on?

              > The "one bit is reserved" is only true for the `int` type (which varies in size depending on the runtime between 31 and 63 bits).

              I don't get it. What is it reserved for then, if the int size is determined when the runtime is built? How can that possibly affect the runtime use of ints? Or is any build of an OCaml program able to target (at compile-time) either 32- or 64-bit targets, or does it mean that an OCaml program build result is always a single format that will adapt at runtime to being in either environment?

              Once again, I don't see how any of this is suitable for systems programming. Knowing one's runtime details is intrinsic at design-time for dealing with systems-level semantics, by my understanding.

              > Building OCaml without a proper build system

              But I don't want to build the programming language, I want to use it. Sure, I can recompile gcc if I need to, but that shouldn't be a part of my dev process for building software that uses gcc, IMO.

              It looks to me like JaneStreet has taken over OCaml and added a ton of apparatus to facilitate their various uses of it. Of course, I admit that I am very specific and focused on small, tightly-defined software, so multi-target, 3rd-party utilizing software systems are not of interest to me.

              It looks to me like OCaml's intrinsic install is designed to facilitate far more advanced features than I care to use, and that looks like those features make it a very ill-suited choice for a systems programming language, where concise, straightforward semantics will win the day for long-term success.

              Once again, it looks like we're all basically forced to fall back to C for systems code, even if our bright-eyed bushy tails can dream of nicer ways of getting the job done.

              Thanks for your patient and excellent help on this topic.

              • thedufer5 days ago
                > I don't get it. What is it reserved for then, if the int size is determined when the runtime is built? How can that possibly affect the runtime use of ints?

                Types are fully erased after compilation of an OCaml program. However, the GC still needs to know things about the data it is looking at - for example, whether a given value is a pointer (and thus needs to be followed when resolving liveness questions) or is plain data. Values of type `int` can be stored right alongside pointers because they're distinguishable - the lowest bit is always 0 for pointers (this is free by way of memory alignment) and 1 for ints (this is the 1 bit ints give up - much usage of ints involves some shifting to keep this property without getting the wrong values).

                Other types of data (such as Int64s, strings, etc) can only be handled (at least at function boundaries) by way of a pointer, regardless of whether they fit in, say, a register. Then the whole block that the pointer points to is tagged as being all data, so the GC knows there are no pointers to look for in it.

                > Or is any build of an OCaml program able to target (at compile-time) either 32- or 64-bit targets, or does it mean that an OCaml program build result is always a single format that will adapt at runtime to being in either environment?

                To be clear, you have to choose at build time what you're targeting, and the integer size is part of that target specification (most processor architectures these days are 64-bit, for example, but compilation to JavaScript treats JavaScript as a 32-bit platform, and of course there's still support for various 32-bit architectures).

                > Knowing one's runtime details is intrinsic at design-time for dealing with systems-level semantics, by my understanding.

                Doesn't this mean that C can't be used for systems programming? You don't know the size of `int` there, either.

                > But I don't want to build the programming language, I want to use it.

                I meant building OCaml code, not the compiler.

                • MrMcCall5 days ago
                  Thanks for the fantastic explanation for how ints are handled in OCaml, but I've got to say that having the low bit be the flag is a strange design decision, IMO, but I understand that aligning the pointers will make the low bit or two irrelevant for them. But, oh!, the poor ints.

                  All this said, thanks for putting to bed, once and for all, any notion anyone should have that OCaml can be used as a systems language. Yikes!

                  > Doesn't this mean that C can't be used for systems programming? You don't know the size of `int` there, either.

                  You know that at compile time, surely, when you set the build target, no? Even the pointer sizes. Besides, after years of C programming, I got to where I never used the nonspecific versions; if I wanted 64-bits unsigned, I would specifically typedef them at the top, and then there's no ambiguity because I specifically declared all vars. (You can see how I did the same thing in F# at the bottom of this reply.)

                  It makes working with printf much less problematic, where things can easily go awry. Anyway, I want my semantics to percolate down pyramid-style from a small set of definitions into larger and larger areas of dependence, but cleanly and clearly.

                  Sure, DEFINEs can let you do transparent multi-targetting, but it ends up being very brittle, and the bugs are insidious.

                  Thanks for your excellence. It's been a joy learning from you here.

                  ---

                  As an aside, here's a small part of my defs section from the final iteration of my F# base libs, where I created an alias for the various .NET types for standard use in my code:

                     type tI4s = System.Int32
                     type tI1s = System.SByte
                     type tI2s = System.Int16
                     type tI8s = System.Int64
                  
                     type tI1u = System.Byte
                     type tI2u = System.UInt16
                     type tI4u = System.UInt32
                     type tI8u = System.UInt64
                  
                  Why risk relying on implicit definitions (or inconsistent F# team alias naming conventions) when, instead, everything can be explicitly declared and thus unambiguous? (It's really helpful for syscall interop declarations, as I remember it from so many years ago.) Plus, it's far more terse, and .NET not being able to compile to a 64-bit executable (IIRC) made it simpler than C/C++'s two kinds of executable targets.
                  • thedufer3 days ago
                    > But, oh!, the poor ints.

                    Empirically this is a rather low cost. IIRC, the extra ops add less than a cycle per arithmetic operation, due to amortizing them over multiple operations and clean pipelining (and also things like shifts just being really cheap).

                    But yes, there are certainly applications where we almost exclusively use Int64 or Int32 rather than the primary int type, if you need exactly that many bits.

                    > You know that at compile time, surely, when you set the build target, no?

                    Well, that's true of OCaml as well.

                    This is ultimately a difference of opinion - I think that the cost of installing a single extra library to get ints of various widths/signedness would be worth the advantage of eliminating nearly all memory errors (and various other advantages of a higher-level language).

                    The main carveout I would agree with is any case where you absolutely need strict memory bounds - it's not clear to me how you'd satisfy this with any GC'd language, since the GC behavior is ultimately somewhat chaotic.

      • worik5 days ago
        > F# v2

        What does that mean?

        • MrMcCall5 days ago
          The second version of F#, where they implemented generics, before they got into the type provider stuff.
    • eimrine5 days ago
      What is the ML programming language? They say OCaml is the same thing with a different name; is that true?
    • rowls665 days ago
      Can a systems programming language use garbage collection? I don't think so.
      • flavio815 days ago
        You'd be surprised.

        In the 1980s, complete workstations were written in Lisp down to the lowest level code. With garbage collection of course. Operating system written in Lisp, application software written in Lisp, etc.

        Symbolics Lisp Machine

        https://www.chai.uni-hamburg.de/~moeller/symbolics-info/fami...

        LMI Lambda http://images.computerhistory.org/revonline/images/500004885...

        We're talking about commercial, production-quality, expensive machines. These machines had important software like 3D design software, CAD/CAM software, etc. And very, very advanced OS. You could inspect (step into) a function, then into the standard library, and then you could keep stepping into and into until you ended up looking at the operating system code.

        The OS code, being dynamically linked, could be changed at runtime.

  • seanw4445 days ago
    My two recommendations are easily Nim and Zig.

    If you want something that is essentially just a modernized C, go with Zig. The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion. My only major complaint at the moment is that duck typing is fairly prevalent. Sometimes function arguments are declared `anytype` and you occasionally have to dive down multiple function calls to figure out what's going on, though that's not too much of a hindrance in practice, in my experience.

    My personal favorite language is Nim. Efficient, but simple, memory management (drawing from C++/Rust). You rarely have to think too hard about it, yet making fast programs is not complicated. You can stick to the stack when you want to. The flexibility at compile-time gives you great power (but it requires great responsibility -- easy to abuse in a bad way). The type system is awesome. The only downside for me is the tooling. The LSP needs much optimization, for example.

    • sph5 days ago
      My issue with Nim is its import system. If you have a function "foo" it's hard to tell where is it imported from. I'm not sure why this bothers me when C is the same... probably because I'm familiar by now which header defines any C function.

      Also, I believe high-level compiled languages suffer from the fact that it is very hard to tell which construct is expensive and which is a zero-cost abstraction. Rust has the same issue, but "zero-cost" is a major feature of the language so you don't feel bad using an Iterator, for example, in kernel code. With Nim it is hard to tell.

      • seanw4445 days ago
        It makes logical sense to do imports that way when operator overloading exists. Otherwise your custom operators would look like:

            import other
        
            varA other.`+` varB
        
        Which is very ugly. At that point, we might as well just go with the function name approach that languages like Go take:

            customAdd(varA, varB)
        
        I suppose you could change it so operators are imported into the same namespace, and non-operators still require a separate namespace when referred to. But that makes it even more complicated in my opinion. I agree it's less obvious what's coming from where, but I think when your libraries have distinct responsibilities, it usually ends up being pretty straight-forward what function comes from where based on how it's named (if it's written well).
    • winrid5 days ago
      I find the type system in Nim to be pretty poor. It's difficult to reason about what is on the stack vs heap by looking at the business logic and not the types themselves, and also hard to reason about when you do copies vs pointers, since everything is defined on the type itself. I find it to be a bad design decision, I wouldn't build anything large with it.
    • optymizer5 days ago
      > The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion

      https://tour.dlang.org/tour/en/gems/compile-time-function-ev...

    • flavio815 days ago
      >The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion.

      You mean, something that Lisp does since the early 1980s?

      • seanw4444 days ago
        I didn't say it was novel. It's just not something you see in modern languages.
      • akho5 days ago
        Should that make it uncool?
      • bsder4 days ago
        > You mean, something that Lisp does since the early 1980s?

        Um, no. Debugging a macro in Lisp is a terrible experience while debugging a comptime function in Zig is brain dead simple.

        Zig is the first "macro" system I've used that doesn't want to make me blow my brains out when I need to debug it.

      • perching_aix5 days ago
        Yes, I think that's what they mean.
  • kevlar7007 days ago
    Loving Ada without using exceptions or inheritance on embedded and desktop. Some love Ada's full OOP tagged types; I love Ada's procedural style with privacy and abstract data types. I wish Flutter was written in Ada, but at least Dart is better than JavaScript, at least for procedural code without its OOP boilerplate. You don't actually need OOP for widgets.
    • linuxlizard6 days ago
      I'm a big fan of Ada. I first encountered exceptions in Ada. When I first saw Python, way back in version 1.5, I was happy to see exceptions.
    • numerosix2 days ago
      I second that. From 8-bit to 64-bit systems, embedded or systems programming, Ada is the best choice. I've saved myself tons of hours, headaches, etc. with this gem. Search GitHub for Sowebio Adel for a good setup manual and, in the same repo, v22 for a good general-purpose KISS framework...
    • dominicrose5 days ago
      But is Dart better than Typescript? I prefer Typescript for multiple reasons but one of them is that you don't have to use classes to use the advanced typing system. Without a typing system I like Ruby the most, but sometimes we just need a typing system.
      • IshKebab5 days ago
        Dart is better in some ways and worse in others.

        1. It has an actually sound type system.

        2. The language and standard library are waaaaaaaay ahead of Javascript.

        3. The tooling is top notch. Better than JS/TS.

        But on the other hand:

        4. Way smaller ecosystem.

        5. Debugging is worse if you're compiling to JS. The fact that the code you run is basically identical to the code you write in TS can be a big advantage. Only really applies for web pages though.

        6. Type unions are way nicer in TS.

        7. Non-nullable types interact badly with classes. It can make writing methods correctly really awkward - you have to explicitly copy member variables to locals, modify them and then write them back.

        8. Way smaller community.

        • ohmahjong5 days ago
          As someone curious about learning more about type systems, would you mind elaborating on 1.? I'm assuming you mean the formal definition of "sound", not just as a synonym for "sensible". Sound typing is often something handwaved away as not being particulary consequential in practice; what benefits have you seen there?
          • IshKebab5 days ago
            It's not particularly consequential when the types are only used for type checking and then thrown away. That's how Typescript and Python work.

            But when the types are sound you can use them to compile better code. That's what most languages with "proper" static types (not just type hints) do.

          • dominicrose5 days ago
            From the official website:

            > Dart enforces a sound type system. This means you can't write code where a variable's value differs from its static type.

            I know you didn't ask me, but I think that not ensuring soundness is a feature, because it allows the type system to wrap something that could work without it. Would you like unit tests if removing them would break your code? Maybe it's not a fair comparison, or maybe it is...

    • satvikpendem4 days ago
      > I wish Flutter was written in Ada but atleast Dart is better than JavaScript atleast for procedural code without it's oop boiler plate. You don't actually need OOP for widgets.

      You can use other libraries for this like Riverpod with flutter_hooks and functional_widget which essentially removes the OOP structure of widgets and turns them more into functions, in a way.

    • johnisgood5 days ago
      What are you using Ada for?
      • numerosix2 days ago
        Embedded 8- and 32-bit microcontrollers to web Linux-based ERP/CRM software. Ada can be used for anything, with the speed of C/C++ but in a far more readable and safer way... Ada is a secret weapon. Don't spread this info ;)
  • mkovach7 days ago
    Free Pascal, but I am interested in Ada and will be learning it more this year. I love the readability of the syntax, and on the outside looking in, the community seems good.

    I have also moved back hard to using TCL as my scripting language. I like it too much, and bouncing between Python, Go, and such for DevOps glue tires me out.

    For systems, I love using plan9 (9front) to solve problems, which grounds me to C, awk, sed, and the rc shell.

  • pjmlp5 days ago
    That would be mix of D, Object Pascal, Swift, Ada, C#, Java.

    A few decades ago plenty of Oberon dialects.

    As language geek, I randomly select languages when doing hobby coding.

    Regarding Go's remark, even if I dislike Go's authors decisions, back in my day writing compilers, linkers, firmware, networking stacks, and OS services was considered systems programming.

    Likewise the .NET team has been working wonders catching up to what C# 1.0 should have been for low-level code, given its Delphi lineage.

    Java, in the context of being the whole Android userspace, including drivers: there is very little of the system exposed in the NDK. Vulkan is one of the few things not exposed to Java land, and that is being fixed with a WebGPU-like API in an upcoming version.

    • gnz115 days ago
      What are your thoughts on D? My experience is limited but seems like a very underrated language.
      • sfpotter5 days ago
        I started using it recently for a prototype of something I'll eventually rewrite in C++ at work. I really like it.

        Discarding the preprocessor and replacing it with a proper module system is huge. I got burnt by templates and horrifying compile times in C++, but haven't had any problems with D templates. The module system makes templates feel much more natural to use. The syntax for templates is a huge improvement, and throwing `static if` into the mix results in concise and easy-to-read code.

        I also quickly realized (with the help of some people on the D discord) that the garbage collector is fine for my needs. So I don't have to spend any time thinking about memory management... put stuff on the stack when I can for speed, otherwise just GC and don't think about it. I think there may be some issue with multithreading and the GC, but this is supposed to get fixed with the new GC that's on the way.

        There are a few other nice QOL improvements. Getting rid of `->` is honestly worth its weight in gold. There's nothing difficult about forgetting to change a `.` to a `->` or vice versa in C++, but not having to trip over it periodically when you're compiling makes the language that much smoother. I was also initially confused by the `inout` keyword but have come to really like that, as well. Little niceties like `const(T[])` are small but, again, reducing just a little bit of friction like this across the language makes D much, much more pleasant to deal with than C++.

        I think the main challenge the language is facing right now is that it's huge and a lot of it is still getting worked out. I never thought I'd pine for C++'s "rule of 3/5/0", but it's a lot tighter and more logically consistent than the equivalent in D. But part of that is there being a huge community of C++ developers who have taken the time to promulgate rules of thumb in the community. I'd kill for an "Effective D" book to short-circuit some of this process... after all, I'm trying to write code, not play at the margins, tinkering with D's idiosyncrasies.

        • fuzztester5 days ago
          >I'd kill for an "Effective D" book

          https://en.m.wikipedia.org/wiki/Scott_Meyers

          The Last Thing D Needs - Scott Meyers - DConf 2014

          https://youtu.be/KAWA1DuvCnQ

        • e12e5 days ago
          > (...) for a prototype of something I'll eventually rewrite in C++ at work.

          > (...) realized (with the help of some people on the D discord) that the garbage collector is fine for my needs.

          Do you envision linking in a garbage collector in your eventual c++ rewrite?

          • sfpotter5 days ago
            I'm open to it but I don't know enough about the options, other than the Boehm GC. If people know of good GC-in-C++ options, I'd love to hear about them.

            In my area (numerical methods and computational geometry), I do not need anything to run in real or soft real time. The GC pauses aren't a concern. In which case, there is no real performance concern other than what I mentioned about the pauses being effectively single-threaded (my understanding... maybe this isn't exactly right). But this is supposed to be improved at some point, so whatever. Not having to explicitly think about memory management is a pure win.

            On the other hand, my understanding is that using a GC in C++ could confuse things like Valgrind and ASan. Converting the entire codebase to use a GC is infeasible; so, if it made things more difficult for others by making these tools harder to use, it would be a nonstarter. But maybe this is just an imagined difficulty.

            Another option is to just implement some scoped allocators. Everything I'm working on at the moment is "pure": some complicated operation applied to some fixed data. So, use an allocator to simulate GC within the scope of what I'm doing.

            If anyone has thoughts here I'm definitely interested to hear them. Not that I'm looking forward to a C++ rewrite. :`(

      • dfawcus5 days ago
        I've been playing with it hacking a compiler written in C++ to be sort of transliterated to D. Just to see if it then makes the compiler easier to read, while not worrying about the performance yet.

        So far in converting the lexer it does make it more comprehensible, it will probably do the same for the parser and AST. The real interesting bit will be once I tackle the later stages.

    • 5 days ago
      undefined
  • flohofwoe5 days ago
    C99 ;) ...compared to 'popular C' (which is essentially C89 plus some common extensions taken from early C++) C99's main improvements (designated initialization and compound literals) haven't really caught on yet even among many C programmers, but those features (IMHO) completely revolutionize the language, and especially library API design.

    Also on a more serious note: I started some projects in Zig and even though most of my future projects will be built on a bedrock of C code, more and more of the top-level layers will happen in Zig.

    • cassepipe20 hours ago
      I remember reading this some time ago : https://floooh.github.io/2019/09/27/modern-c-for-cpp-peeps.h...

      I do use those so thank you :)

    • codr75 days ago
      There it is again, the urge to port my Lisp back to C.

      https://github.com/codr7/eli

      What I love most about C is the fact that it doesn't talk down to me no matter what crazy ideas I come up with. It's therapeutic for me, reminds me why I started writing code in the first place.

      I realize that's also what many hate about it, the fact that it gives other people freedoms they would never trust themselves with.

    • dfawcus5 days ago
      Designated initialisers and compound literals, sure they have caught on, one just has to know where to look:

          https://github.com/danos/vyatta-dataplane/blob/master/src/npf/config/gpc_hw.c#L600-L623
      
          https://github.com/danos/vyatta-dataplane/blob/master/src/npf/config/npf_rule_group.c#L252-L280
      
      That is code which is around 4 years old.

      For the latter example, one could theoretically avoid declaring the variables 'event' and 'rg_match', instead directly including the compound literals in the respective function calls. However, it is a question of taste and of what is more readable.

      (The above have designated initialisers; I can't remember if there are any compound literal examples there.)

      There is however one here, when the BSTR_K macro is also expanded, also the earlier BSTR_INIT:

          https://github.com/danos/vyatta-dataplane/blob/master/src/npf/bstr.h#L199
  • gw27 days ago
    C#. While a popular language, it is criminally overlooked for high-performance programming. Obviously, you can't use it for embedded or kernel development. For other use cases though, it can almost reach the performance of C/C++/Rust when written with proper care.
    • Const-me6 days ago
      > Obviously, you can't use it for embedded

      Embedded is diverse. I would not use .NET for small embedded, i.e. stuff running on Arduino or ESP32.

      However, I have successfully used .NET runtime in production for embedded software running on top of more performant SoCs, like 4 ARMv7 cores, couple GB RAM, Linux kernel. The software still has large pieces written in C and C++ (e.g. NanoVG rendering library) but all higher-level stuff like networking, file handling, and GUI are in memory-safe C#.

    • Rohansi5 days ago
      You actually can use it for embedded and kernel development! See .NET Nano Framework [1] for embedded - works on microcontrollers like ESP32. For kernel development there's nothing really built in to support it but people have built tools [2] to do it.

      [1] https://nanoframework.net/ [2] https://gocosmos.org/

      • sterlind5 days ago
        Pour one out for Midori, which would have replaced Windows with a capability-based OS completely written from kernel to shell in a C# dialect. Async/await, spans, and immutable support came from it, along with an (opt-in) Rust-like borrow checker. Satya canceled it, and all the work was lost to history. Singularity was the early public prototype.
        • pjmlp5 days ago
          The only thing Singularity and Midori share is the idea.

          You should also pour one out for Longhorn, where internal politics tanked the idea, and eventually the Windows team redid all those .NET based ideas in COM/C++, and were even proud of doing so (see the Hilo sample documentation), hence why nowadays COM based libraries are the main way to expose modern Windows APIs (aka post Windows XP).

          Had they collaborated instead, probably Windows would be closer to something like Android userspace nowadays.

          Or for Ironclad, another one from Microsoft research, lesser known, also from the same research group, which even includes type safe Assembly,

          https://www.microsoft.com/en-us/research/publication/safe-to...

          Microsoft Research has plenty of work in such domains, they also had a LLVM like compiler framework, based on MSIL, called Phoenix, among other stuff, e.g. Dafny, FStar, Drawbridge, also come from OS projects.

          Unfortunately, classical Microsoft management has been more like: if it isn't Windows, it isn't shipping.

          • vram225 days ago
            Partly off-topic: which well-known companies have research groups? I knew about Microsoft and IBM. Google, probably. Others? Might be interesting to browse their sites for nuggets to explore or use.
            • indrora5 days ago
              If you hear the name “lab126”, that’s Amazon’s team.

              Nokia owns the shambling corpse that is Bell Labs. Looking beyond the English speaking world, I wouldn’t discount that the chaebols (LG, Samsung, Mitsubishi, etc) all have a few companies dedicated to research at the Bell Labs level.

    • graboid7 days ago
      I sometimes write C# in my day job. But I think I don't know much about how to write really fast C#. Do you have any recommendations for learning resources on that topic?
      • gw27 days ago
        Sure. Here are some resources:

        * Span<T>: https://learn.microsoft.com/en-us/archive/msdn-magazine/2018...

        * C# now has a limited borrow checker-like mechanism to safely handle local references: https://em-tg.github.io/csborrow/

        * Here is a series of articles on the topic: https://www.stevejgordon.co.uk/writing-high-performance-csha...

        * In general, avoid enterprise style C# (ie., lots of class and design patterns) and features like LINQ which allocate a lot of temporaries.
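        To make the last point concrete, here's a toy sketch (names and values made up for illustration): the LINQ pipeline allocates iterator objects on every call, while the plain loop allocates nothing.

```csharp
using System;
using System.Linq;

public static class LinqVsLoop
{
    // Where(...) allocates an iterator object on each call.
    public static int SumEvensLinq(int[] xs) =>
        xs.Where(x => x % 2 == 0).Sum();

    // Same result, zero heap allocations.
    public static int SumEvensLoop(int[] xs)
    {
        int sum = 0;
        foreach (int x in xs)
            if (x % 2 == 0)
                sum += x;
        return sum;
    }

    public static void Main()
    {
        int[] data = { 1, 2, 3, 4, 5, 6 };
        Console.WriteLine(SumEvensLinq(data)); // 12
        Console.WriteLine(SumEvensLoop(data)); // 12
    }
}
```

        In a one-off call the difference is irrelevant; it's in hot loops that the LINQ temporaries add up as GC pressure.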

        • neonsunset7 days ago
          LINQ is fine (but enterprise style never is, yes), it’s a matter of scale and what kind of domain the code is targeted at. C# needs to be approached a little like C++ and Rust in this regard. Having standard performance optimization knowledge helps greatly.

          Also can recommend reading all the performance improvements blog posts by Stephen Toub as well as learning to understand disassembly at a basic level which .NET offers a few convenient tools to get access to.

          • codr75 days ago
            And I can recommend listening to @neonsunset when it comes to C# performance :)

            Helped me a bunch to get Sharpl spinning, much appreciated.

            https://github.com/codr7/sharpl

        • graboid7 days ago
          Thank you. I once read a bit about Span<T>, but some of this reference stuff is very new to me. Interesting, definitely. C# really is a big language nowadays...
          • neonsunset7 days ago
            Spans are just a slice type, but one that (usually) any type based on contiguous memory can be coerced to. I’m sure you’re already using them somewhere without realizing it. Their main use case in regular code is zero-cost slicing, e.g. text.AsSpan(2..8).
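            A tiny sketch of what that looks like (toy values, nothing project-specific):

```csharp
using System;

public static class SpanDemo
{
    // Zero-allocation slice over the string's existing memory.
    // (ToString() at the end copies, but only for returning a plain string.)
    public static string Middle(string text) =>
        text.AsSpan(2..8).ToString();

    public static void Main()
    {
        Console.WriteLine(Middle("performance")); // rforma

        // Arrays coerce to Span<T>; a slice writes through to the array.
        int[] nums = { 10, 20, 30, 40 };
        Span<int> tail = nums.AsSpan(1..);
        tail[0] = 99;
        Console.WriteLine(nums[1]); // 99
    }
}
```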
        • xigoi7 days ago
          C# is specifically designed for enterprise-style OOP, so if you want to avoid that, why use C# at all?
          • jiggawatts6 days ago
            You're thinking of Java, which is Enterprize Buzzword Compliant to the maximum extent possible.

            C# is Java-but-with-lessons-learnt, and is significantly less verbose and "enterprisey" in typical usage.

            Modern .NET 9 especially embraces compile-time code generation, a "minimal" style, and relatively high performance code compared to Java.

            Even if the JVM is faster in benchmarks for hot loops, typical Java code has far more ceremony and overhead compared to typical C# code.

            • akkad335 days ago
              > Even if the JVM is faster in benchmarks for hot loops, typical Java code has far more ceremony and overhead compared to typical C# code.

              Can you give an example? I don't think this is true anymore for modern Java (Java 21+)

              • jiggawatts5 days ago
                It's a heavily gamed benchmark, but TechEmpower Fortunes is pretty good at revealing the max throughput of a language runtime for "specially tuned" code (instead of idiomatic code).

                Java currently beats .NET by about 40%: https://www.techempower.com/benchmarks/#hw=ph&test=fortune&s...

                I judge more idiomatic / typical code complexity by the length of stack traces in production web app crashes. Enterprise Java apps can produce monstrous traces that are tens of pages long.

                ASP.NET Core 9 is a bit worse than ASP.NET Web Forms used to be because of the increased flexibility and async capability, but it's still nowhere near as bad as a typical Java app.

                In terms of code length / abstraction nonsense overhead, have a look at the new Minimal APIs for how lightweight code can get in modern C# web apps: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/m...

                • neonsunset5 days ago
                  For those interested in performance ceiling, https://benchmarksgame-team.pages.debian.net/benchmarksgame/... provides additional data points.

                  What matters in practical scenarios is that ASP.NET Core is significantly faster than Spring Boot. If you have a team willing to use ActiveJ or Vert.x, you are just as likely to have a team willing to customize their C# implementation to produce numbers just as good at web application tasks and much better at something lower level.

                  There are also issues with TechEmpower that make it highly sensitive to the specific HW/kernel/libraries combination in ways which alter the rankings significantly. The .NET team hosts a farm to do their own TechEmpower runs, and it just keeps regressing with each new version of the Linux kernel (for all entries), despite CPU% going down and throughput improving in separate, more isolated ASP.NET Core evaluations. Mind you, the architecture of ASP.NET Core + Kestrel, in my opinion, leaves some performance on the table, and I think TechEmpower is a decent demonstration of where you can expect average framework performance to sit once you start looking at the specific popular options most teams use.

              • jayd165 days ago
                How's modern Java in a game/sim scenario? C# has value types to reduce the GC load, for example. Do Java records close the gap there?
                • homebrewer5 days ago
                  No, records are a reduction in boilerplate for regular classes (the result also happens to be read-only — not deeply immutable, mind you). Value types are in the works:

                  https://openjdk.org/jeps/401

                  • jayd164 days ago
                    Hmm looking at that it seems like being a struct type is a non-goals they seem to explicitly call out C# value types as a different thing...

                    Smaller objects from dropping identity is nice but it really doesn't seem like it gives you more explicit memory layout, lifecycle, c interop etc that C# has with their structs. Maybe I'm missing something.

          • gw27 days ago
            > C# is specifically designed for enterprise-style OOP

            Then why would they add Span<T>, SIMD types and overhaul ref types in the first place?

            • xigoi7 days ago
              Because some people wanted to use C# for low-level programming, so they added these things as an afterthought.
              • neonsunset7 days ago
                You’ve clearly never used it and have no idea what you are talking about.
                • xigoi7 days ago
                  I have used it a few years ago and the enforced OOP boilerplate was too much for me.
                  • moi23885 days ago
                    You can write procedural or functional style as well, and with top-level statement you can write without any OOP or boilerplate whatsoever.
                  • neonsunset7 days ago
                    Trying to write it as if it was a different language instead or, for whatever reason, copying the worst style a team could come up with does happen and must be avoided, but that’s user error and not a language issue. Also the tooling, especially CLI, is excellent and on par with what you find in Rust, far ahead of Java and C++.

                    If you link an example snippet of the type of code that gave you pause, I’m sure there is a better and more idiomatic way to write it.

                    • SoftTalker5 days ago
                      C# was originally a clone of Java. It was almost literally copy/paste compatible.
                      • 5 days ago
                        undefined
                  • johnisgood5 days ago
                    I have the same issues with JVM-like languages, like Java. I only write Java if I am getting financially compensated for it!
      • CrimsonCape6 days ago
        Span<T>, ReadOnlySpan<T>, Memory<T>, CollectionsMarshal, CollectionsExtensions, ref struct, ref return, ArrayPool, ArraySegment, ValueTuple, and using interfaces/structs/generics carefully.

        That is if you don't want to get into unsafe code.
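        For example, a minimal hedged sketch of ArrayPool, one of the items above (buffer size is arbitrary):

```csharp
using System;
using System.Buffers;

public static class PoolDemo
{
    public static void Main()
    {
        // Rent a reusable buffer instead of allocating a fresh array.
        // Note: Rent may hand back a larger array than requested.
        int[] buffer = ArrayPool<int>.Shared.Rent(1024);
        try
        {
            buffer[0] = 42;
            Console.WriteLine(buffer[0]); // 42
        }
        finally
        {
            // Return it so later Rent calls can reuse the memory.
            ArrayPool<int>.Shared.Return(buffer);
        }
    }
}
```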

      • HackerThemAll5 days ago
        A few important ones:

        - Avoid memory allocations as much as you can; that's the primary thing. For example, case-insensitive string comparisons using "a.ToUpper() == b.ToUpper()" in a tight loop are a performance disaster, when "string.Equals(a, b, StringComparison.CurrentCultureIgnoreCase)" is readily available.

        - Do not use string concatenation (which allocates); instead prefer StringBuilder.

        - Generally, remember that any string operation (such as extracting a substring) means allocation of a new string. Instead use methods that return a Span over the original string: in the case of mystr.Substring(4,6) it can be mystr.AsSpan(4,6).

        - Beware of some combinations of Linq methods: "collection.Where(condition).First()" is faster than "collection.First(condition)", etc.

        Apart from that (which simply concerns strings, as they're a great source of performance issues), all generic best practices, applicable to any language, should be followed.

        There are plenty resources on the net, just search for it.
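        Putting the string points together in one toy snippet (illustrative values only):

```csharp
using System;

public static class StringPerf
{
    // Allocation-free case-insensitive comparison.
    public static bool SameWord(string a, string b) =>
        string.Equals(a, b, StringComparison.CurrentCultureIgnoreCase);

    public static void Main()
    {
        // Bad in a tight loop: two temporary strings per comparison.
        bool slow = "Hello".ToUpper() == "HELLO".ToUpper();
        Console.WriteLine(slow == SameWord("Hello", "HELLO")); // True

        // Substring allocates a new string; AsSpan slices in place.
        string mystr = "abcdefghijkl";
        ReadOnlySpan<char> span = mystr.AsSpan(4, 6);
        Console.WriteLine(span.SequenceEqual(mystr.Substring(4, 6))); // True
    }
}
```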

    • bunderbunder5 days ago
      And arguably it beats the performance of C/C++/Rust when written without proper care: https://blog.codinghorror.com/on-managed-code-performance-ag...

      The big take-away I got from this (admittedly quite old now) experiment is that getting advertised performance out of unmanaged languages for typical real-world (i.e., non-benchmark) tasks often requires a lot more care than people really account for. Nowadays memory dominates performance more so than CPU, and the combination of a JIT compiler and a good generational, compacting garbage collector - like C# and Java developers typically enjoy - often does a better job of turning idiomatic, non-hand-optimized code into something that minimizes walks of shame to the RAM chips.

    • codr75 days ago
      Well in that case, Java :)

      I've been having a lot of fun with Java lately, the maturity of the language/implementation and libraries allows me to focus on the actual problem I'm solving in ways no other language can currently match.

      https://github.com/codr7/tyred-java https://github.com/codr7/eli-java

  • Froedlich8 days ago
    The only true "system programming" I've done was in Microsoft Macro Assembler, a product I grew to hate with a passion.

    A non-answer, but tangentially relevant:

    I once fiddled with Forth, but never actually accomplished anything with it.

    Several OSs are written in Lisp; in some of them the difference between OS and application is a bit vague. At the time none of them were available to me to play with.

    I discovered Oberon and fell in love. My first real programming language was Pascal, and Oberon is part of the same family. Oberon consisted of a compiler, operating system, user interface, application software, and tools, all self-hosted on Oberon. There was even an Oberon CPU at one time. But Oberon turned out to be just an academic curiosity, and wasn't available for any hardware I had access to anyway.

  • gibsonf15 days ago
    We have had a great experience using Common Lisp [1] for our causal space-time systems digital twin [2]

    [1] http://sbcl.org/

    [2] https://graphmetrix.com/trinpod-server

    • codr75 days ago
      I so envy people who manage to find interesting Common Lisp work, it's like we live in different dimensions.
      • wglb4 days ago
        There are many independent consultants working in Lisp.

        Yes, it is rare.

      • zelphirkalt5 days ago
        Requires open minded middle management and that is rare.
        • felideon5 days ago
          or the CEO of Franz, Inc. as an advisor, it seems.
          • gibsonf15 days ago
            Also helps that the CEO of the company does Common Lisp Dev.
  • lopatin8 days ago
    I started using Idris a few years ago because the idea is fascinating. Such as state machines in your type system, the size of a list being defined in the static type system, even if the list size changes over time (pretty mind blowing), etc..

    But ultimately I realized that I’m not writing the type of software which requires such strict verification. If I was writing an internet protocol or something like that, I may reach for it again.

    • TOGoS5 days ago
      Similar boat. I've read about Idris (and been 'shown the door' enough times) and I love the idea of it, but sadly I haven't yet had any reason to use it.
  • deevus5 days ago
    I am currently contracted 3 days a week writing Zig. I can't say much because NDA, but I just love working with Zig almost every day. I think for the right projects, it is such a great choice for mission critical software.

    You get the added benefit of being able to easily consume C libraries without much fuss. The fuss is in navigating the C APIs of decades old libraries that we all still depend on every day.

    • bsder5 days ago
      Do tell us sometime when you can in the future. It's always interesting to hear what Zig people are doing because they do some very weird stuff.

      They wouldn't be using Zig otherwise. :)

    • johnisgood5 days ago
      In LuaJIT and Odin it is also easy to do FFI.
    • goeiedaggoeie5 days ago
      I write a fair bit of rust/c for my day job. Do you find zig easier than the ffi interface in Rust?
      • flohofwoe5 days ago
        I maintain auto-generated Rust and Zig bindings for my C libraries (along with Odin-, Nim-, C3-, D- and Jai-bindings), and it's a difference like night and day (with Zig being near-perfect and Rust being near-worst-case - at least among the listed languages).
      • bsder4 days ago
        > Do you find zig easier than the ffi interface in Rust?

        Yes, but it's mostly cultural.

        Rust folks have a nasty habit of trying to "Rust-ify" bindings. And then proceed to only do the easy 80% of the job. So now you wind up debugging an incomplete set of bindings with strange abstractions and the wrapped library.

        Zig folks suck in the header file and deal with the library as-is. That's less pretty, but it's also less complicated.

      • deevus5 days ago
        I've somehow avoided Rust, so I can only comment on what I see in the documentation.

        In Zig, you can just import a C header. And as long as you have configured the source location in your `build.zig` file, off you go. Zig automatically generates bindings for you. Import the header and start coding.

        This is all thanks to Zig's `translate-c` utility that is used under the hood.

        Rust by contrast has a lot more steps required, including hand writing the function bindings.

        • dlivingston5 days ago
          You only hand-write function bindings in simple or well-constrained cases.

          In general, the expectation is that you will use bindgen [0].

          It's a very easy process:

          1. Create a `build.rs` file in your Rust project, which defines pre-build actions. Use it to call bindgen on whatever headers you want to import, and optionally to define library linkage. This file is very simple and mainly boilerplate. [1]

          2. Import your bindgen-generated Rust module... just use it. [2]

          You can also skip step 1: bindgen is also a CLI tool, so if your C target is stable, you can just run bindgen once to generate the Rust interface module and move that right into your crate.

          [0]: https://rust-lang.github.io/rust-bindgen/

          [1]: https://rust-lang.github.io/rust-bindgen/tutorial-3.html

          [2]: https://github.com/Charles-Schleich/Rust-Bindgen-Example/blo...

        • steveklabnik5 days ago
          Zig is easier than Rust here, but you can auto generate bindings, you don’t have to write them by hand.
  • sheepscreek8 days ago
    F#! I’m in love with the language. It is my defacto pick for most things these days. Very expressive AND strongly typed. Being a part of the .Net ecosystem is also a plus.
    • robinsonrc5 days ago
      I wouldn’t call F# a systems programming language, but it’s definitely on my list of things to properly try out at some point
    • vram227 days ago
      Can you create desktop GUI apps with it?
  • giancarlostoro8 days ago
    Every now and then FreePascal with Lazarus, but the same bug being in the IDE for ten-plus years kind of annoys me: if I save a new project and then move any files around, or rename a module, it does weird stuff.

    There's also D, but finding libraries for whatever I want to work on proves problematic at times as well.

    • Froedlich8 days ago
      On the other hand, the Ultibo OS for the Raspberry Pi is written in FreePascal.
  • Terr_5 days ago
    Using Elixir and Elm at my day job.

    Coming from a more Python/Java/PHP/JS background, Elixir was a lot easier to pick up and doesn't frustrate me as much. Most of the remaining scary bits involve concurrency and process supervision trees.

    Macros are powerful, but also easy to use in a way that makes everything hard to debug. For those unfamiliar with them, it's a bit like a function except any expressions you call it with are not evaluated first, but arrive as metadata that can be used to assemble and run new code.

    • never_inline5 days ago
      The question is about systems programming.
    • mikercampbell5 days ago
      Why elm over LiveView?

      I know “why” elm, I liked everything I saw about it, but how do you combine the two, if you do?

      • Terr_5 days ago
        There's a bit of a struggle between sections that use just one or the other, but Elm has the managerial blessing right now.

        While I think Elm is neat, it suffers from ecosystem issues. It drives a large amount of Not Invented Here, because JS invented somewhere else is hard to incorporate. Also, good luck rendering arbitrary HTML that comes in as data from somewhere else.

        • ghayes5 days ago
          Yeah, I loved Elm, but the restriction that you can't build your own "effect" modules really made it impossible to embrace. Say you want to use a new web API similar to using Elm's core `Http`, well... you can try and fork Elm...
          • boxed5 days ago
            You can use webcomponents to work around a few of those limitations.
    • aqueueaqueue3 days ago
      Does the lack of movement on Elm even in terms of bugfixes cause any issues? Maybe you use elm-janitor?
  • Rochus5 days ago
    My major system programming languages are C and C++, but I did some projects in Oberon (which turned out to be not particularly suited for systems programming), and then created a modified, better suited version of it called Oberon+ (https://github.com/rochus-keller/Oberon), which I e.g. used to create platform-independend versions of different Oberon System generations.

    But Oberon+ is still too high-level for many system programming tasks. So I'm designing a new system programming language called Micron (for Micro Oberon, see https://github.com/micron-language/specification) which has the full power of C without its disadvantages. You can even use it for the OS boot sequence when there is no stack and no heap, but also for higher-level application development, due to its selectable language levels.

  • pyjarrett8 days ago
    Ada

    The open source tooling has significantly improved since I started using it in the last five years.

  • atiedebee5 days ago
    I recently dabbled in "hare", which was quite a nice experience.

    I liked how the language stayed pretty simple compared to other C-replacements. The standard library is also pretty nice. It is however an extremely niche language, but still quite capable

    • siev5 days ago
      I really like the design choices they've made. Namely:

      - Once you cut out the legacy nonsense out of C, you can then add a few nice modern features to your language and still end up with something that's smaller and simpler than C.

      - Performance optimizations are possible. But by default, simplicity is always picked over performance. (i.e. most UB is eliminated, even if it hurts performance)

      - A few basic pointer features go a long way in eliminating most memory safety bugs. There are non-nullable pointers, ranges with automatic bounds checks, and no C strings.

      - They get a lot of mileage out of their tagged union type. It allows for elegant implementations of algebraic types, polymorphism, and error handling.

      - The error handling!

    • sakras5 days ago
      I was pretty excited about Hare until Devault said that Hare wouldn't be doing multithreading as he preferred multiprocessing. That was a pretty big dealbreaker for me. The rest of the language looks quite clean though!
      • shakna5 days ago
        hare-ev [0] is using epoll under the covers, which means multithreading is there, already. Especially as ev may be merged into the stdlib at some point.

        [0] https://git.sr.ht/~sircmpwn/hare-ev

        • sakras5 days ago
          Maybe I'm misunderstanding something, but it seems like ev is still multiprocessing? Reading the code, it looks like you can read/write to files, and if you want to kick off some other work it spawns a process. I don't see any instance of threads there.
          • shakna5 days ago
            epoll is threaded, not multiprocess. [0]

            hare-ev is using rt to make the epoll syscalls. [1]

            > On Linux, ev is implemented with epoll. Note that, on Linux, I/O on regular files is always blocking.

            [0] https://www.man7.org/linux/man-pages/man7/epoll.7.html

            [1] https://docs.harelang.org/rt#SYS_epoll_create

            • sakras5 days ago
              > epoll is threaded, not multiprocess

              epoll is orthogonal to threads. It _can_ be used in a multithreaded program, but it doesn't have to be. It may very well be implemented in terms of kernel threads, but that's not what I'm talking about. I'm talking about user-space threads.

      • sebtron5 days ago
        You could always link to pthread and use that in your Hare code, no?
        • sakras5 days ago
          Conceptually yes, but I suspect it's going to be a lot hairier in practice. For instance, I think there's some stuff that needs language support, such as thread-local storage. I'd guess it would be simpler to just re-implement threading from scratch using syscalls. But I also don't think the language provides any support for atomics, so you'd have to roll your own there.
    • vram225 days ago
      hare will not support Windows.

      https://harelang.org/documentation/install/#supported-platfo...

      Interesting reasons.

    • wduquette5 days ago
      But why 8-character indents as the standard formatting for Hare programs? I notice that Odin seems to prefer 8-character indents as well. It seems like a real blow to readability for deeply nested code. Maybe you aren't supposed to write deeply nested code?
    • 5 days ago
      undefined
  • Jtsummers8 days ago
    Not presently, but not long ago, Fortran and Ada. I still like Ada better than the alternatives, especially as it's changed this past couple decades. I find it hard to miss Fortran, though. I'd consider it for scientific computing and that's about it, which isn't my present domain.
    • fuzztester8 days ago
      Interesting, thanks.

      Did you ever check out Eiffel for systems programming work?

      I had been checking it out some years ago, and apart from the general points about it, one use of it that I found interesting was in an article about using it for creating HP printer drivers. The author had mentioned some concrete benefits that they found from using it for that purpose.

      Edit: I searched for that article, and found it:

      Eiffel for embedded systems at Hewlett-Packard:

      https://archive.eiffel.com/eiffel/projects/hp/creel.html

      • Jtsummers8 days ago
        I learned it once long ago, but never used it for anything other than that learning experience. I did like its concepts, though the language itself didn't quite stick with me.
    • quanto8 days ago
      How would Fortran be used other than numerics/scientific computing?
      • AlexeyBrin7 days ago
        Modern Fortran has ISO C bindings in its standard library. You can call any C library from Fortran and wrap it in a Fortran module if you want to make it easier to use.

        Despite its history it is a pretty modern language if you enable all warnings, set implicit none, and ignore the old style of coding (a la FORTRAN 77 or older).

      • Jtsummers8 days ago
        This was in an embedded systems context, I came on later but it was what most of the core system was written in. It's been used in a lot of avionics systems over the years.
      • tjalfi6 days ago
        These days, we have many better options, but back in the day, Fortran was also used for compilers (e.g., IBM's Fortran H), operating systems (such as PRIMOS[0] and LTSS[1]), symbolic computation (e.g., early Prolog implementations), and real-time control systems[2].

        [0] https://en.wikipedia.org/wiki/PRIMOS

        [1] https://en.wikipedia.org/wiki/Livermore_Time_Sharing_System

        [2] https://webhome.weizmann.ac.il/home/fhlevins/RTF/RTF-TOC.htm...

      • fuzztester8 days ago
        not a direct answer to your question, but the use in the domain you mentioned itself, is huge.

        from the Wikipedia article about Fortran, under the Science and Engineering section:

        https://en.m.wikipedia.org/wiki/Fortran

        Although a 1968 journal article by the authors of BASIC already described FORTRAN as "old-fashioned",[58] programs have been written in Fortran for many decades and there is a vast body of Fortran software in daily use throughout the scientific and engineering communities.[59] Jay Pasachoff wrote in 1984 that "physics and astronomy students simply have to learn FORTRAN. So much exists in FORTRAN that it seems unlikely that scientists will change to Pascal, Modula-2, or whatever."[60] In 1993, Cecil E. Leith called FORTRAN the "mother tongue of scientific computing", adding that its replacement by any other possible language "may remain a forlorn hope".[61]

        It is the primary language for some of the most intensive super-computing tasks, such as in astronomy, climate modeling, computational chemistry, computational economics, computational fluid dynamics, computational physics, data analysis,[62] hydrological modeling, numerical linear algebra and numerical libraries (LAPACK, IMSL and NAG), optimization, satellite simulation, structural engineering, and weather prediction.[63] Many of the floating-point benchmarks to gauge the performance of new computer processors, such as the floating-point components of the SPEC benchmarks (e.g., CFP2006, CFP2017) are written in Fortran. Math algorithms are well documented in Numerical Recipes.

        • wglb4 days ago
          > described FORTRAN as "old-fashioned"

          That didn't age well.

          My professor, who worked on control system analysis for electrical power grids, later thought that if he were to write it today, it would likely be done in Matlab.

      • Tor35 days ago
        My first Fortran program was a tool I wrote to read 8" SS/SD CP/M floppies on a minicomputer. That was very easy to do, as the dialect had a couple of useful string extensions and the operating system had efficient ways of reading a floppy.
  • morphle5 days ago
    Squeak[1], Cuis. Metacircular Smalltalk VM[2] written in itself. We sometimes call it SqueakNOS for 'Squeak no operating system needed'.

    [1] https://ftp.squeak.org/docs/OOPSLA.Squeak.html

    [2] https://tinlizzie.org/VPRIPapers/

  • titzer5 days ago
    These days I write nearly all my code in Virgil (https://github.com/titzer/virgil).

    It has features like classes, first-class functions, tuples, ADTs, unboxing, and a little data-layout language; some unsafe features, like support for generating and integrating new machine code; and it can talk directly to kernels.

  • anta407 days ago
    Pascal.

    Sure these days not many folks write OS kernel in Pascal, but there are some, e.g: https://github.com/torokernel/torokernel

    I once wanted to try Forth (perhaps there's a Unix clone in Forth?), but it seems like most folks using it are embedded/hardware devs.

    • fuzztester7 days ago
      I had read somewhere that some of the early Apple (not Mac) software, i.e., systems, application or both, was written in some Pascal variant.
      • Someone6 days ago
        https://folklore.org/Hungarian.html:

        “The Macintosh used the same Motorola 68000 microprocessor as its predecessor, the Lisa, and we wanted to leverage as much code written for Lisa as we could. But most of the Lisa code was written in the Pascal programming language. Since the Macintosh had much tighter memory constraints, we needed to write most of our system-oriented code in the most efficient way possible, using the native language of the processor, 68000 assembly language. Even so, we could still use Lisa code by hand translating the Pascal into assembly language.”

        MacOS was clearly Pascal-oriented, with its ‘Str255’, ‘Str63’, etc. data types.

        • mistrial96 days ago
          Pascal interfaces and direct 68k ASM for the first years of Macintosh. C language bindings were third party and discouraged by Apple. There were good reasons for that in those days IMHO, since C came with a lot of Unix software libraries and people would demand that the science libs run. Apple said "no" but third parties built the compilers anyway. Many developers were attracted to Michael Kahl's brilliant ThinkC system, later to MetroWerks. MPW built a more *nix-like environment eventually, also.

          source: C language developers for the Macintosh OS

        • fuzztester6 days ago
          ha ha, nice.

          even early Windows versions were somewhat Pascal-oriented, with things like "long far pascal" used in C function declarations to indicate the calling convention being used (iirc, arguments pushed left to right rather than C's right to left).
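
          Such a declaration looked roughly like this minimal C sketch. Note the FAR and PASCAL macros are defined to nothing here so it compiles on any modern system; on 16-bit Windows they expanded to real compiler keywords selecting the Pascal convention (arguments pushed left to right, callee cleans the stack), and the function name here is hypothetical:

          ```c
          #include <stdio.h>

          /* On 16-bit Windows these were compiler keywords (e.g. __far, __pascal);
             defined away here so the sketch builds anywhere. */
          #define FAR
          #define PASCAL

          /* A Win16-style export declaration: with a real __pascal keyword, the
             arguments would be pushed left to right and the callee would pop them. */
          long FAR PASCAL AddThree(unsigned a, unsigned b, long c)
          {
              return (long)a + (long)b + c;
          }

          int main(void)
          {
              printf("%ld\n", AddThree(1, 2, 3)); /* prints 6 */
              return 0;
          }
          ```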

  • brigandish5 days ago
    I’ve replaced Ruby as the “glue” language on my machine with Crystal. Being able to plop out a binary and not worry about the myriad things that can go wrong when the entire environment needs to be perfect (including reinstalling gems for every version, etc.) is such a relief. Bundler is just a frustrating sticky plaster over that.

    I’d like to give Zig and Nim a go, but Go and Elixir are probably next on the list, simply because I have unread books for them staring at me.

    • raffraffraff3 days ago
      I stopped buying tech books, started a safari online subscription. Much better. Now I have just one thing staring at me (virtually) from the internet, instead of a dozen things staring at me from the book shelf. It's less intrusive.
  • xigoi8 days ago
    Nim, I love its “make simple things simple and complex things possible” philosophy.
    • blashyrk7 days ago
      I absolutely adore Nim.

      That said, the edges are still (very) rough when it comes to tooling (generics and macros absolutely murder Nimsuggest/lsp) and also "invisible" things impacting performance such as defect handling (--panics:on) and the way the different memory management schemes introduce different types of overhead even when working with purely stack allocated data.

      But even with all that it's still an extremely pleasant and performant language to work with (when writing single threaded programs at least)

      • seanw4445 days ago
        Definitely agree that there are rough edges, but Nim is in a better state than ever. The LSP isn't great yet, I'll agree with that. There are great optional type libraries for working around exceptions if you don't want them, and the new memory management system (ARC/ORC) is very efficient compared to the old refc implementation (now much more like the C++/Rust approach).

        For parallel programming, there are also handy libraries, the best of which is Weave[1], but Malebolgia[2] is authored by the creator of Nim and works well in its own way too.

        There is also active work on a new implementation of Nim, called Nimony[3], which intends to clean up some of the long-term spaghetti that the current implementation has turned into (as most long-term projects do); it is also led by the original creator of Nim. It is years away from production according to him, but is at least in the works.

        I'd have to say Nim is by far my favorite programming language. The terseness, flexibility, and high performance, make it feel almost sci-fi to me. My only major complaint currently is the tooling, but even the tooling is still adequate. I'm glad it exists. Highly recommend.

        [1] https://github.com/mratsim/weave

        [2] https://github.com/Araq/malebolgia

        [3] https://github.com/nim-lang/nimony

    • xwowsersx6 days ago
      I really should continue my Nim series :( https://youtube.com/@nimward
  • rganesan6 days ago
    Zig is coming along quite nicely. If you've not heard about Zig, take a look at https://ghostty.org/ (a terminal for Linux/Mac, with Windows support coming), https://tigerbeetle.com (a database for financial accounting), and http://bun.sh (a modern, faster alternative to Node.js).
  • ajdude8 days ago
    I almost exclusively work in Ada for my hobby projects these days; It's great for doing both high level and low level programming.
    • alok-g7 days ago
      Where does tooling and platform support stand for Ada? Could one develop desktop, mobile, web apps, too using Ada? Thanks.
    • sharedptr5 days ago
      Are there Ada jobs?
      • shakna5 days ago
        Your favourite job board will have a tag for Ada. There are jobs out there. Some in lower level things like finance, reliability testing, embedded software. Some in higher level things like gaming, AI, web services.

        There are fewer, and they do tend to be more demanding, but they certainly exist.

      • grandempire5 days ago
        Huntsville Alabama has some
  • docandrew8 days ago
    Ada for bigger projects, D for quick one-offs and more “scripty” work.
    • fuzztester7 days ago
      I had played around with D some time ago, and wrote some small programs in it for fun and learning. I both liked and disliked things about the language.

      there was some Russian dev running a systems tech company, I forget his name, living in Thailand, like in koh samui or similar place. he used D for his work, which was software products. came across him on the net. I saw a couple of his posts about D.

      one was titled, why D, and the other was, D as a scripting language.

      I thought both were good.

      • docandrew6 days ago
        It’s a little like Go in that it compiles quickly enough to replace scripts while still yielding good enough performance for a lot of systems tasks. It predates Go, and I wish Google had just supported D; it’s a much nicer language IMO.
    • johnisgood5 days ago
      What are you using Ada for?
  • kazinator5 days ago
    I don't work in Seed7 by Thomas Mertes but it deserves to be better known.

    https://en.wikipedia.org/wiki/Seed7

    It has a SourceForge page that actually doesn't suck and that you won't hate landing on, unlike almost anything else on SourceForge:

    https://seed7.sourceforge.net/

    Though there is an old school SourceForge file area with tarballs, the project page also links to a GitHub repo.

  • baddate8 days ago
    • akkad335 days ago
      I don't know if Julia is a system programming language
      • brabel5 days ago
        It's quite funny to classify it as such, given you need to run your programs like a script as it's nearly impossible to compile a binary you can distribute (though I am aware they're working on this as a priority task, currently).
        • leephillips5 days ago
          Compilation to small binaries has made great progress:

          https://lwn.net/Articles/1006117/

          It’s not always clear what is meant by “system programming”. I’ve begun writing utility scripts in Julia; it’s practical now because the startup time is vastly improved. These can be run like bash scripts, with a shebang line that invokes Julia with the desired environment (using the --project flag).

          • brabel2 days ago
            > It’s not always clear what is meant by “system programming”. I’ve begun writing utility scripts in Julia; it’s practical now because the startup time is vastly improved. These can be run like bash scripts, with a shebang line that invokes Julia with the desired environment (using the --project flag).

            I think it is clear enough. The language must have a small or nonexistent runtime, so that it is practical to write systems that do not ship the same fat runtime in every binary. The language must support compiling to binaries; otherwise it really cannot be used by itself for systems. It must provide access to the available operating system API directly, without the need for bindings (to the extent possible, as some OSs only expose a C API/ABI).

            What is a system, you may ask. I think you can define that as anything that can run by itself (no runtime) and perform any "low level" operation permitted by the OS.

      • adgjlsfhk15 days ago
        It's certainly not a traditional one, but it is increasingly used as one https://arxiv.org/abs/2502.01128.
  • kagevf5 days ago
    OK, here's a pretty niche blast from the past: the Boo programming language. It ran on the CLR (.NET) and had syntax similar to Python's. I recall using it back around 2006-2008 because it offered scripting features for .NET on Windows.

    https://boo-language.github.io/ "A scarily powerful language for .Net". I didn't use it for too long before switching to Iron Python.

    • vram225 days ago
      I remember reading about the Boo language and IronPython some years ago. Do you still use IronPython?
      • kagevf5 days ago
        I do not.

        These days I would reach for a shell script for general scripting, filling in the gaps with maybe a C# console app or something in Common Lisp if I want/need some interactivity.

        Something that happens pretty frequently is I'll take information I've written into an emacs org doc and run it through a CL function, whose output could be an org mode table which I can from there export to a different document format if necessary.

  • netbioserror8 days ago
    Nim. Fantastic choice for modern headless software. Simple obvious type system, preference for immutability and referential transparency. Dynamic collections are by default managed by hidden unique pointers on the stack. So the default RC isn't necessary unless explicitly invoked for a ref type.

    Currently solo managing a 30k line data analysis application I built for my company. Easily fits in my head given the obvious pyramidal functional-like structure. Maybe two lines of memory semantics anywhere in the entire thing, and only one module that's OO with a constrained scope. Lots of static data files (style sheets, fonts) slurped up as const strings at compile time. Incredible performance. Invoked by our PHP server backend, so instead of doing parallel or async in the analysis, the server gets that through batch invocation.

    Working stupid well for our product, plus I can easily compile binaries that run on ARM and RISC-V chips for our embedded team just by invoking the proper gcc backend.

    Replaced an ailing and deliberately obfuscated 20 year old jumble of C and PHP designed to extort an IP settlement from my company. Did it in a year.

    • digdugdirk6 days ago
      Do you have any recommendations for well-designed open-source Nim projects that someone could study to get a feel for the language?
      • archargelod5 days ago
        Anything written by Treeform[1] is a good place to start; their libraries make up a big chunk of the Nim ecosystem.

        1 - https://github.com/treeform/hobby

      • netbioserror5 days ago
        Honestly hard to say. There are a number of styles of architecting Nim libraries and programs, and almost none match my own. My most particular criticism of the Nim ecosystem is the abuse of macros: There are a number of libraries implementing huge chunks of functionality behind macros such that code paths only appear at compile time and are not reflected in sources. Some libraries constrain macro use, but many are built entirely out of macros. I'd say to avoid looking to those examples.
        • seanw4445 days ago
          I think it's safe to say that the proper way to go about using Nim is to use macros only as a last resort. They're powerful and awesome when necessary, but obfuscate the code a lot, and the LSP can't really follow them either. They also make the code feel "fragile" in my opinion. I find that I rarely need to go further than templates, personally.
  • ChrisMarshallNY5 days ago
    I think that it depends on the system.

    Firmware is probably still best done in C (sometimes, C++), mostly because so many SDKs, libraries, and toolkits are done in those languages. Also, there's decades of "prior art" in C. Lots of places to look for solutions.

    I worked on a project, where we tried using a very "less-popular" language for firmware.

    It didn't end well.

    I'd say that being a "Systems Programmer" means that you are operating at a fairly "advanced" level, where the benefits of "safer" languages may be less of a factor, and the power of more "dangerous" languages is more attractive.

    Of course, on HN, suggesting C or C++ is suggesting "less popular" languages...

    • zelphirkalt4 days ago
      Though we see the "fairly advanced level" of C and C++ leading to 70 percent of the vulnerabilities in the Chromium project. So I wouldn't bet on anyone's advanced level.
      • ChrisMarshallNY4 days ago
        Good, safe code tends to come from discipline, humility, and thoroughness.

        I've known many highly experienced and intelligent software devs that are terrible at that stuff, and are like coding time bombs.

    • spauldo3 days ago
      I highly doubt they're less popular; it's just that most people who use them aren't as vocal as their detractors.
  • ptspts6 days ago
    I do systems programming in i386 (32-bit) assembly language with NASM.

    For me it doesn't scale beyond a few dozen kilobytes (executable program file size) per program. For others (such as Chris Sawyer) assembly scales much better.

    • az09mugen5 days ago
      Did you take a look at fasm [0]? It has nice capabilities.

      [0] : https://flatassembler.net/

      • ptspts5 days ago
        fasm is indeed great. It has many features, it can do all the code size optimizations, it even knows the sizes of variables (e.g. `mov myvar, 5` depends on `myvar db 0` vs `myvar dw 0`). NASM and fasm syntax are quite similar.

        NASM supports more output file formats (i.e. .o files for many systems), and it can receive macro definitions from the command line (e.g. `nasm -DDEBUG`).

  • qingcharles6 days ago
    If the support was still there I'd still be using VB.NET.

    I've coded professionally in a dozen languages, including a lot of time in x86 assembler, C++ etc.

    Still like VB.NET better than any other. To me, it was the most readable code.

    • aqueueaqueue3 days ago
      You may like Python! At least the syntax.
    • wglb6 days ago
      Is it not still supported?
      • qingcharles5 days ago
        It is, but it's really on life support. It's supported for legacy development, for the most part. There are so few people coding in it now that you'll never see any example or tutorial .NET code in VB.NET.
      • drewnoakes5 days ago
        It is still supported and developed.
  • artemonster8 days ago
    Tried Zig, but was baffled by all the allocator dance you need to do and having to ask nicely to access a list (catching potential exceptions?). Tried Odin, but the tooling is very raw. Tried Rust, didn't want to try to please a borrow checker that distracts me from my thoughts.

    Idk, if someone just reinvents clean C without the nonsense garbage, with some modules and a package manager, that would be a huge win. Let me access my null pointers, let me leak memory, just get the hell out of my way and let me program, and hold my hand only where I want it to be held: sane types that give me refactoring, code completion, and code understanding, plus modules with imports. Let the compiler give sane error messages instead of this cryptic C++ garbage. Is this too much to ask?

    • gw27 days ago
      D's "Better C"[1] mode looks like what you describe. It has syntax similar to C, with a real module system, metaprogramming, slice types, etc.

      1 - https://dlang.org/spec/betterc.html

    • flowerthoughts8 days ago
      I also had a brief look at Zig for writing a WASM module, but settled for Rust. I had no real gripes with the language, but the spartan documentation made making progress into a slog.

      I wouldn't mind a "better C" that could use an LLM for static code analysis while I was coding. I.e. be more strict about typing, perhaps. Get out of my way, but please inform me if I need more coffee.

    • acheong087 days ago
      Allocation in Zig takes some getting used to but it's actually really nice. It took me a few weeks but I honestly believe you should give it another chance and more time
      • LiamPowell6 days ago
        I personally find it much more ergonomic to have the allocator attached to the type (as in Ada). Aside from the obvious benefit of not needing to explicitly pass around your allocator everywhere, it also comes with a few other benefits:

        - It becomes impossible to call the wrong deallocation procedure.

        - Deallocation can happen when the type (or allocator) goes out of scope, preventing dangling pointers as you can't have a pointer type in scope when the original type is out of scope.

        This probably goes against Zig's design goal of making everything explicit, but I think that they take that too far in many ways.

        • deevus5 days ago
          There is no reason you can't attach an Allocator to the type (or struct, in Zig).

          A fairly common pattern in the Zig stdlib and my own code is to pass the allocator to the `init` function of a struct.

          If what you mean is that allocation should be internal to the type, I don't agree with that. I much prefer having explicit control over allocation and deallocation.

          The stdlib GPA for example is pretty slow, so I often prefer to use an alternative allocator such as an arena backed by a page allocator. For a CLI program that runs and then exits, this is perfect.
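
          The init-takes-an-allocator pattern described above can be sketched in C terms (a hypothetical analogue with made-up names, not Zig's actual API): the struct remembers the allocator its init was given, so deinit can never call a mismatched free.

          ```c
          #include <stdio.h>
          #include <stdlib.h>
          #include <string.h>

          /* A minimal allocator "interface": a pair of function pointers. */
          typedef struct {
              void *(*alloc)(size_t);
              void  (*free)(void *);
          } Allocator;

          typedef struct {
              Allocator a;   /* remembered so deinit uses the matching free */
              char *data;
              size_t len;
          } Buffer;

          /* Zig-style: the caller passes the allocator to init explicitly. */
          int buffer_init(Buffer *b, Allocator a, const char *s)
          {
              b->a = a;
              b->len = strlen(s);
              b->data = a.alloc(b->len + 1);
              if (!b->data) return -1;
              memcpy(b->data, s, b->len + 1);
              return 0;
          }

          void buffer_deinit(Buffer *b)
          {
              b->a.free(b->data);  /* always the allocator init was given */
              b->data = NULL;
          }

          int main(void)
          {
              Allocator libc_alloc = { malloc, free };
              Buffer b;
              if (buffer_init(&b, libc_alloc, "hello") == 0) {
                  printf("%s %zu\n", b.data, b.len);
                  buffer_deinit(&b);
              }
              return 0;
          }
          ```

          Swapping in an arena-style allocator then only means passing a different Allocator value; the call sites don't change.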

      • timeon5 days ago
        Same can be said about Borrow Checker.
    • feelamee8 days ago
      looks like Zig is exactly what you want. The difference is only in the std: C prefers a global allocator, while Zig asks for it explicitly.

      So, if only there were a std with implicit allocators?

    • milesrout5 days ago
      C's compilation unit model, lack of a formal module system and lack of language-level package management are the best things about it.

      Separating interface and implementation is a good thing, but often you just want to split things into separate files without separate compilation. C supports #include and so it is maximally flexible.
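
      The interface/implementation split described above can be shown in a single file (the names and the three sections are hypothetical; in a real project they would be counter.h, counter.c, and main.c, stitched together with #include and separate compilation):

      ```c
      #include <stdio.h>

      /* --- counter.h: the interface other translation units would #include --- */
      typedef struct { int value; } Counter;
      void counter_add(Counter *c, int n);

      /* --- counter.c: the implementation, compiled as its own unit --- */
      void counter_add(Counter *c, int n) { c->value += n; }

      /* --- main.c: a user that only needs the interface --- */
      int main(void)
      {
          Counter c = { 0 };
          counter_add(&c, 41);
          counter_add(&c, 1);
          printf("%d\n", c.value); /* prints 42 */
          return 0;
      }
      ```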

    • tahirmurata8 days ago
      [dead]
  • eadmund5 days ago
    Common Lisp. It offers powerful abstractions and high speed. I’m happy with it.
    • wglb4 days ago
      I do as well. I've replaced a lot of my production bash scripts with Lisp.

      Pretty much all SBCL.

    • tmtvl5 days ago
      Seconding CL. For my personality, purposes, and preferences it's the closest thing to a perfect language.
      • flavio815 days ago
        +1, it is my go-to language whenever I have no idea how complex the task will get
  • fuzztester7 days ago
    commenting after seeing multiple comments here, after about a day.

    first of all, thanks, guys, to all who replied. that's a wealth of info to follow up on.

    referring to the comments seen so far:

    I considered mentioning (Free) Pascal, but decided against it, because I thought it is nowadays too niche, even though it is one of my early programming language loves (forgetting that the title of my post says "less popular languages" :)

    and I also didn't think of Ada at all, somehow, although have been interested in it, too, lately, and have been checking out websites and blogs about it, and also have been searching hn.algolia.com for posts about it.

    so it was cool to see multiple mentions of Ada here, by people who like and use it.

    • UncleOxidant6 days ago
      > so it was cool to see multiple mentions of Ada here, by people who like and use it.

      I'm surprised how popular Ada is in these comments. I like some of its ideas (ranged types, for example), and I'm inspired to give it a try after seeing all the comments here.

  • atemerev8 days ago
    D and Crystal always fascinate me. And if Go is a system language, Erlang and Common Lisp are even more so.
    • renewedrebecca5 days ago
      I wish Crystal had better IDE support, otherwise it’s just about perfect.
      • etra05 days ago
        I wish we lived in a world where Crystal dominated over Go, but we're far from being there.

        Admittedly, the slowness of the compiler (due to the nature of the language) and the lack of better tooling are not helping, but 9 times out of 10 I enjoy writing Crystal way more than Go.

      • knowitnone5 days ago
        I think Crystal is going to need much more community support (articles, tutorials, blogs, community) and corporate sponsorship for it to even thrive in today's environment where we have an abundance of choices.
      • vram225 days ago
        IIRC, I read somewhere, several months ago, that its type inference made it slow to compile anything but small programs?
        • felipeccastro5 days ago
          Yes, I believe so, because it uses global type inference. I would gladly add explicit types everywhere instead, if it had decent tooling, because everything else about the language is really perfect.
          • vram222 days ago
            Intriguing. You really didn't find any issues with it? Anything that you thought should (or shouldn't) be there, or some non-trivial bug?
          • renewedrebecca5 days ago
            > I would gladly add explicit types everywhere instead of this to use Crystal

            Agreed. Good IDE support can easily add explicit types too.

    • SoftTalker5 days ago
      I like Erlang a lot.
      • worthless-trash5 days ago
        Me too, buddy. Super powerful; the syntax is a little weird, but once you get used to it...

        gen servers, everywhere.

  • markldevine2 days ago
    Raku, for scripting jobs. It is a huge language, done right imo. It induces more of a "flow" state than any other language I've encountered (like playing a great video game).

    Not write-only like its ancestor. So many language criticisms solved. A true pleasure.

    Still in its function-first development phase, but apparently near the end. The AST rewrite is still underway; then the team will address performance.

  • namshe5 days ago
    I will put in a plug for Mercury: https://mercurylang.org/
    • vram225 days ago
      I read a while ago, when checking out Prince XML (a high-end HTML-to-PDF conversion tool), that it is written using Mercury.

      https://www.princexml.com/

    • adastra225 days ago
      Wow, I haven’t heard about that language in a long time. What do you use it for?
      • johnisgood5 days ago
        Looks like Prolog.

        I wonder what the major differences are.

        • adastra225 days ago
          I guess it's like Prolog in the same sense that Rust is like C? It's a modern functional programming language that also supports logic programming. I never got a chance to actually code with it though.
  • purpleidea5 days ago
    Using the `mcl` DSL language in https://github.com/purpleidea/mgmt/

    It's awesome. But I'm biased because I designed it.

    You can't build everything with it, but you can build many things much more easily, particularly distributed systems.

  • creshal5 days ago
    I like nim so far, but I have to admit I haven't done all that much with it yet.
    • jasfi5 days ago
      Nim is great, I wrote a crypto trading engine with it. The performance is excellent, memory safety works well, and it was much easier to write compared to Rust.
      • maxresdefault5 days ago
        What kind of profits are you seeing with it?
        • jasfi5 days ago
          No actual profits yet, I've just been back-testing as well as forward-testing various strategies. It looks like writing the trading engine was the easy part.
  • unquietwiki5 days ago
    As the founder of r/altprog on Reddit (been following random languages for 12 years now), my favorite "alt" language is Nim. It feels like Python & Javascript had a baby that was C++. Wish it had lambda operators like C# and JS, but it does have the cool feature of defining your own language constructs.

    Also, shoutouts to Zig, Crystal, and Ballerina: those are other interesting ones off the top of my head, that folks should look into.

  • harry_ord5 days ago
    Perl is kinda less popular now. I use that at work. I used to write Perl 6/Raku in my previous job; I loved how the grammars made a nice way to write an nginx configuration manager.
    • warpspin5 days ago
      Perl here, too.

      We still use it for all kinds of web services development work, mainly because there are years of in-house modules for everything, and because of the malleability Perl has.

  • bragur2 days ago
    I’ve been using ReScript and ReasonML (derived from OCaml, originated by Jordan Walke) professionally for the better part of 7 years now, instead of TypeScript/JavaScript, and couldn’t be happier. I’m blissfully enjoying a fully sound type system with stronger type inference than TypeScript’s, without all the complex type juggling and null hell, and an insanely fast compiler with good errors that outputs optimized JS code. It does have a learning curve (these days far less steep than it used to be), but the benefits are just so many in my eyes. TypeScript has come a long way, but it still struggles in comparison, and I never feel as safe when writing TypeScript.

    One downside is, of course, far less adoption: libs usually have to have ReScript bindings written for them, but that's fairly straightforward and not something I have to do very often.

  • anonymoushn5 days ago
    I've been using Zig for nearly 4 years now. A lot of changes in that period were not great, but I haven't really wanted to use anything else.
    • anacrolix5 days ago
      I have been watching with interest. I can't help but think Rust will easily win. Zig isn't different enough, and it's somewhat opinionated (in good ways but not always clearly better)
    • lukan5 days ago
      I just looked into Zig and it looks great on first glance. What recent changes were not great in your opinion?
  • hansvm5 days ago
    I've been using Zig for years, and for the last year I've been using it at work. I've coded professionally in all the usual languages, but Zig does what I want much more easily.
  • em-bee6 days ago
    i don't know if pike counts as a systems language, but i consider it an alternative to C, if only because it has good C integration so that you can easily include a module written in C. pike's syntax is also very close to C, which may be appealing to some (ironically that's an aspect i don't really care about myself)

    if the question of go being a systems language is controversial, then pike is even more so. i would situate pike somewhere between python and go. pike's major drawback is that it doesn't produce standalone executables.

    the real question i'd like to ask is, what actually is a systems language?

  • mrweasel5 days ago
    I'm not very good at using it, but every now and then try to do a small project in Chicken Scheme. Mostly I'm unsuccessful, but I enjoy the language a lot and have a great time using it.
  • whitehexagon5 days ago
    I'm looking forward to getting back to Zig soon, especially now that there is support for Asahi linux.

    I like that for low-level SoC stuff there is now the packed struct, which makes register representation very nice to deal with, especially with the definable int types, although I'm often torn between a u1, a bool, and sometimes even an enum(u1) for certain flags. I tend to let the SoC documentation (naming convention) drive that decision.
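
    For comparison, here is the same idea as a C bitfield sketch over a hypothetical control register (one u1-style flag plus a two-bit mode field in a 32-bit word). In C the bitfield layout is implementation-defined, which is exactly what Zig's packed struct pins down:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical register layout; in Zig this would be roughly
       packed struct(u32) { enable: u1, mode: u2, ... } with a guaranteed
       bit order, whereas C leaves bitfield ordering to the compiler. */
    typedef union {
        uint32_t raw;                 /* whole-register access */
        struct {
            unsigned int enable : 1;  /* the u1-vs-bool question from above */
            unsigned int mode   : 2;
            unsigned int rsvd   : 29;
        } bits;
    } CtrlReg;

    int main(void)
    {
        CtrlReg r = { .raw = 0 };
        r.bits.enable = 1;
        r.bits.mode = 2;
        printf("size=%zu enable=%u mode=%u\n",
               sizeof(CtrlReg), r.bits.enable, r.bits.mode);
        return 0;
    }
    ```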

    Otherwise there is a lot of nice and simply designed language stuff in Zig that also takes me back to my C / asm days. My least fav. part is maybe multi-line string literals that look like comments. I prefer the kotlin approach there.

    I'd like to find a non-walled-garden Zig community if there are other Zig fans here, i.e. just a forum. Or tips on which editor to use, since I am tired of electrons being burned needlessly and almost feel like I need to VM these modern dev tools.

    • bsder5 days ago
      Does https://ziggit.dev/ not cut it for you in terms of non-walled garden?

      It seems to be good enough that I basically don't interact with the Zig Discord anymore.

    • deevus5 days ago
      I’m doing all my Zig editing in Zed and it works great.

      For version management I use mise (or scoop on Windows).

  • worthless-trash5 days ago
    Fuzztester here is asking about systems languages. I see a lot of people suggesting things I'd consider non-systems languages.
    • tmtvl5 days ago
      Yeah, but unfortunately 'systems programming language' is a bit vaguely defined. I'd call any language which can deliver a binary executable and which offers some degree of lower-level control (like getting the disassembly of a procedure or deliberately being able to stack-allocate things) systems languages, but others may have different ideas.
  • keyle6 days ago
    Have a look at hare. It's got some interesting bits [1]

    Also C3

    [1] https://harelang.org

    [2] https://c3-lang.org

    • xxami5 days ago
      I'm a fan of C3. I like that it's not trying to be _too_ far removed, but adds enough to rid you of some of the tedious chores of C. The dev and their community are also really nice.
      • keyle4 days ago
        Nice to hear! I will definitely give it a go now. It was on my list.
  • mirekrusin5 days ago
    MoonBit [0]

    It's still being developed, but oh man, the language is good.

    You read its documentation and pretty much every single thing is the right decision (from my PoV).

    Beautiful language if you like OCaml or Rust. The primary target is wasm, but it compiles to native as well.

    [0] https://www.moonbitlang.com/

  • dlivingston5 days ago
    Does anyone remember BlitzBasic / BlitzPlus / Blitz3D? They were my first programming languages. I loved how dead simple it was to spin up a DirectX-based 3D game or a GUI application. There was something very nice about a simple, performant, batteries-included programming environment.
    • Sodaware4 days ago
      I started with DarkBasic but moved to BlitzBasic after a few months. Wrote a couple of small games and really enjoyed just messing around with it - it felt just like when I started programming on my first computer.

      I still use BlitzMax for game development (when I get time) - there's an updated version with some nice language additions and support for more architectures, including the Raspberry Pi: https://blitzmax.org/

  • nuudlman8 days ago
    Take a look at Pony https://www.ponylang.io/
    • dismalaf5 days ago
      Pony is fun and I love the actor paradigm but it definitely feels like the community lost a lot of energy when Sylvan Clebsch stopped working on it (to work on a similar project for MS).
    • fuzztester7 days ago
      I will, thanks.
  • Nales5 days ago
    I am using Haxe, which compiles to other languages (C++, JavaScript, PHP...). It's a nice language when you want to create a web application or even a CLI.

    If you have played video games by Shiro Games (Evoland, Dune: Spice Wars) or Motion Twin (Dead Cells), or even Papers, Please, then you have been exposed to this language.

  • dismalaf8 days ago
    Odin. It's just too easy and fun.
    • fuzztester7 days ago
      Why is Odin easy for you? Because it is non-OOP (I think, have not confirmed that) or some other reason?
      • dismalaf7 days ago
        Build system, module system, simplicity of C but much nicer, clearer syntax, lots of batteries included, it just does a lot of stuff to make life easier versus Zig or C/C++.

        I personally don't think programming paradigms like OOP, procedural or functional make anything easier/harder necessarily, just talking QoL stuff.

        And obviously "easy" is relative: Odin is still a low level language where you need to manage your own memory.

        • 7 days ago
          undefined
  • mxvanzant2 days ago
    Lately I've been using: https://janet-lang.org/ It's not a systems programming language, but it can be embedded in C.

    https://jank-lang.org/ looks interesting to me; I have not tried it yet. I'm not sure if this language could qualify as a systems programming language. What do you think?

  • auntienomen5 days ago
    Cython. Writes like Python, runs like C. Strangely underappreciated.
    • DmitryOlshansky5 days ago
      It certainly doesn't run like C. I once thought to port my JSM machine learning engine to Python, and it felt like Cython might be just what I needed. Simply put, it's tight loops doing bitwise ops on bit-vectors. In reality, no amount of adding type annotations helped; the thing was slower than C++ by an order of magnitude.
      • auntienomen5 days ago
        I've generally found it to be within a factor of 2 of hand-tuned C. (It's literally autogenerated C.) But implementation matters, and I doubt we're going to check your work here in the comments.
  • tonis25 days ago
    Trying to make a game with https://c3-lang.org/, quite happy so far.
  • 5 days ago
    undefined
  • whateveracct6 days ago
    Haskell + copilot (from Nasa) and/or MicroHs when targeting embedded (like RPi Pico)
  • ghfhghg7 days ago
    F# and Haxe. Love both of those languages
  • henning5 days ago
    I've written a non-trivial (5K SLOC) app in Zig and it's very nice.
  • DASD6 days ago
    Chicken Scheme for personal projects.
  • rubymamis8 days ago
    I’m considering Mojo.
    • kaycebasques5 days ago
      Oh, yeah! Haven't heard anything about it in the last 6 months. Any interesting developments?
      • fnands5 days ago
        Lots! [1]

        The biggest thing to be added recently is GPU programming, which given Mojo's focus on ML/AI makes a lot of sense.

        It's probably not the best language to look into for general purpose systems programming, but if you are going to be interacting with GPUs or other hardware then maybe it's good to give it a look.

        It is still changing a lot, so no real stability yet, but to be expected for such a young language.

        [1] https://docs.modular.com/mojo/changelog/

    • zoom66285 days ago
      Me too for projects and tools. Would like to use Ada for IOT projects and tools.
  • Surac5 days ago
    Forth. Old but very versatile. Wrote the runtime myself years ago in portable C.
  • myko5 days ago
    I'm converting an old C codebase to Swift, though given Swift's non-support for mixed-language targets I'm considering switching to Zig.

    Unfortunately the Zig compiler crashes when building my project and I haven't looked into debugging the compiler to find out why. There's a lot of gnarly code in this project (based on dikumud from ~1989?) with many inexperienced hands touching it over the decades.

  • jdougan6 days ago
    D language (Dlang). It is especially good if you are porting from C, as the semantics are close enough to run a lot of code via copy and paste - and if not, it will fail to compile.
  • bitwize5 days ago
    Ada and Scheme.
  • nodramallama5 days ago
    I’ve been using Odin and really enjoying it lately. In my free time I’ve been using it for gamedev and for some Python interop at work
  • julianeon5 days ago
    An interesting takeaway from this is that it looks like Rust has really fallen off, in terms of popularity. There was a time when it would've topped these lists (and yes I know you mentioned it - I mean people would've mentioned it anyway). It seems like Nim has claimed 100% of its mindshare.
    • ohazi5 days ago
      Rust hasn't fallen off, it's just largely considered mainstream now.
    • marcosdumay5 days ago
      Some time ago, Rust had no viable replacement at all. If somebody came asking "hey, how can I replace Rust on this system level software" the only possible answer was "you don't".

      Nowadays, alternatives exist, and so people can answer with one.

      None of that has any meaning for the popularity of Rust or lack thereof.

    • VertanaNinjai5 days ago
      The prompt explicitly says “not Rust”. So the answers don’t say Rust.
  • HHalvi5 days ago
    Lua: Picked it up when I was dabbling in building games relying on Love2D, which uses Lua as the underlying language.

    CoffeeScript: Fell in love with CS as I wanted to rapidly prototype with the (now defunct) Framer Classic.

    Smalltalk/Squeak/Vala are something I have wanted to dabble with for a while but haven't gotten around to.

  • noelwelsh5 days ago
    I've been writing some Scala Native recently. See https://github.com/creativescala/terminus/. It's a high-level language but you can still reach down and grub about in memory if necessary. I'm having fun.
  • jlengrand5 days ago
    I still absolutely love my Elm. Never a programming language has made me as confident and joyful when writing code <3.
    • pclowes5 days ago
      Are people using Elm for systems level programming? I have only used it on the front end.
      • vram225 days ago
        Roc was inspired by Elm, and has CLI as one of its "platforms", which is systems in a loose sense. Early days for Roc, though there may be orgs using it in production.
  • slevis5 days ago
    I am using Ada atm. Not a "modern" language but I believe it might have a great future :)
  • standeven6 days ago
    The IEC 61131-3 languages, though 95% of my work is Structured Text. Anyone need a PLC programmed?
  • deepsquirrelnet5 days ago
    Does Cython count? I’ve been trying to learn more advanced usage. It’s pretty small and reasonably familiar.

    I also have messed around with nim a little bit. I like it, I’m just not sure it’s worth putting a lot of effort into.

    • sfpotter5 days ago
      I don't think I would recommend using Cython outside of writing Python bindings. In my experience, the community is too small and the documentation is too lacking. Even writing bindings, I spent an inordinate amount of time debugging inscrutable compilation errors.
  • 5 days ago
    undefined
  • dfawcus5 days ago
    Playing with D, while reading up on Odin and the various Cyclone papers.
  • srik5 days ago
    Elm. Gonna hold on to it for as long as possible because it’s fantastic for personal projects - drama free, functional, simple, typed and comes with batteries and great errors and tooling.
  • 7 days ago
    undefined
    • brudgers7 days ago
      Learning a new language for the sake of learning a new language might be why they asked.
  • Peteragain5 days ago
    I've used c and java, and have recently been thinking about go. It's interesting that the comments here only mention go in the negative. Can someone give me the back story about go?
    • anta405 days ago
      I think Go is fine for application development (anything that runs on top of an OS).

      But for systems programming, which is generally understood as developing an OS kernel (filesystem, memory management, device drivers, etc.) or embedded (where you build a mini OS), Go is not the proper choice (features like goroutines, AFAIK, need an OS underneath). You'd want C/Pascal/Rust/Zig/<what else?> ...

    • hereonout25 days ago
      I don't know if go counts as "systems programming" like the other commenter mentions.

      But I have been recently using it for some tooling and small servers on personal projects where I'd have used python before.

      Frankly it's been a joy and I wish I'd started earlier. The concurrency primitives are great, and the static binaries make deployment easy (raspberry pi in this case).

      I struggle to use anything other than Python professionally; the need to settle on a common denominator trumps pretty much everything else.

  • rixed5 days ago
    I personally mix languages, using higher level languages for the structural work and calling C for the code or data structures that require it.

    So a good FFI to C has always been an important requirement for me.

  • cjj_swe6 days ago
    It hasn't been released yet, but I'm very excited for Carbon :)
  • creer5 days ago
    Perl 6 / Raku. The swiss army chainsaw (Perl) raised to the power of the swiss army chainsaw. For its expressiveness in spite of some learning curve.
    • raffraffraff3 days ago
      I had to take a look at some examples, because my last Perl work was over 15 years ago. This is neat:

          grammar Parser {
              rule  TOP  { I <love> <lang> }
              token love { '♥' | love }
              token lang { < Raku Perl Rust Go Python Ruby > }
          }
      
          say Parser.parse: 'I ♥ Raku';
          # OUTPUT: 「I ♥ Raku」 love => 「♥」 lang => 「Raku」
      
      The same thing in other languages would require a lot more code without a parser module. An LLM tells me that functional languages can handle this stuff well too, but that Raku code is just extremely simple to grasp.
      • creer3 days ago
        So to be fair, it's "extremely simple to grasp" because you're looking at it superficially. Once you dig into it, there is a lot happening in there - both in what can go into an actual production grammar, and in what is present in the parsing output. Which is why I was mentioning the learning curve.

        But yeah, most stuff is easy, there is more than one way to do it, and the impossible isn't. Or something.

  • delduca5 days ago
  • johnisgood5 days ago
    Ada and Odin that I would consider less popular, rarely Forth.
    • johnisgood5 days ago
      OCaml and Factor, too, though I am not sure OCaml counts as unpopular. Factor rarely, but I love it too, just do not use it as much. I actually write more Factor than Forth.
  • 8 days ago
    undefined
  • Rounin5 days ago
    D. It's quite C-like, but more concise, has a richer standard library, garbage collection, threading, etc. etc.
  • 5 days ago
    undefined
  • zerr6 days ago
    Haxe and Dart (without Flutter) are quite nice.
  • matej-almasi5 days ago
    Ada (safety critical stuff) in work. Not a great fan, but it has its passionate defenders.
  • egberts15 days ago
    Cobol, Vimscript/VimL and Ada
  • pyuser5834 days ago
    Fortran. Had to use it for some obscure projects a while back.

    It’s still kicking.

  • glonq5 days ago
    I had a college prof who in his real job wrote -all the things- in Modula-2.
  • moechofe5 days ago
    Rebol
  • meta-level5 days ago
    Micropython of course
  • yamapikarya7 days ago
    visual basic. i learned a lot from this language because i'm able to create a system from scratch without importing a library.
  • therealfiona6 days ago
    My team hates when I write POSIX shell.
  • gallier25 days ago
    D. Such a fantastic language.
  • Pompidou5 days ago
    J
    • MortyWaves5 days ago
      By what metric is that a systems language?
  • frizlab5 days ago
    Swift
  • ztetranz6 days ago
    I've been learning Elixir just for fun. I wish I was using it in my day job.
    • nesarkvechnep5 days ago
      Elixir is not a systems programming language.
  • sim7c005 days ago
    what do you understand as 'systems programming'?

    there are people making operating systems for AMD64 in Pascal etc.... so there's plenty of choices, odd and even ones.

    some examples of different interpretations of 'systems programming'.

    low level systems code - like interacting with devices directly on bare metal. (mmio/io etc.)

    kernel code - like writing a new subsystem in linux, which uses other drivers.

    high-level systems - like game engines, automation frameworks, other high performance oriented systems-with-lot-of-subsystems?

    These different domains, on different targets, might have more or less plausible options for you to try.

  • billwear8 days ago
    (kebab-use-elisp)
  • uwagar5 days ago
    tcl
    • ripe5 days ago
      I agree, tcl is pretty sweet (if totally weird). I use it because it's the mother tongue of sqlite.
  • cisrockandroll5 days ago
    RPG
    • degrees575 days ago
      Dude, you're taking the easy way out. Please go purist and pull the wiring boards out of the closet.
  • 5 days ago
    undefined
  • nodramallama5 days ago
    [dead]
  • greenheadedduck6 days ago
    Python.
    • neilv5 days ago
      It's not a systems programming language, but I actually wrote a userland "device driver" in Python, for startup MVP pragmatic firefighting reasons.

      It was somehow rock-solid over a year of factory production overseas. Which might not have been the case if I'd written it in C and put it in the kernel, like might normally be good practice.

      (It hooked into some kernel interfaces, did a little multiple USB device management, and low-level keyboard-like decoding with efficient I/O, and buffered and parsed and did something with the output of that.)

      I have mixed feelings about Python: it often hurts more than it helps (if you know better ways to do things), but the ecosystem has some nice off-the-shelf components, and it's popular/employable. However, due to the popularity, the average quality of any article you might find through the Web is unfortunately low.

      (For an unpopular language, you'll get a few people writing articles from a junior/neophyte knowledge level, as part of their learning, or because someone said it was good for resume-boosting. That can be good. But no one is going to waste time pounding SEO low-quality filler for a language that doesn't make money. Well, at least they wouldn't before LLMs, but who knows how the economics have changed, now. :)

  • neonsunset7 days ago
    C#, to match the performance of reference implementations in C and Rust, and completely crush the performance of those in Go :)
    • Thaxll6 days ago
      Why do you need to downvote / compare Go on every of your post? Do you have insecurities with Go?

      I was reading another post about someone showing some AI stuff and then: https://news.ycombinator.com/item?id=43246127

    • fuzztester7 days ago
      what do you mean by reference implementations, in this context?
      • neonsunset7 days ago
        All sorts of algorithms for processing data useful in high-load scenarios: checksum calculation, text searching/analysis/transformation, data compression, networking (request analysis, routing, filtering), interop with other C ABI dependencies, etc.
  • inetknght5 days ago
    > Less popular

    Bash.

    > I used C for both application programming and systems programming

    Gross. Learn C++, it's better than C in every way! shotsfired.jpg

    > I've been wanting to get back to doing some systems programming, but preferably in a more modern language (than C) which is meant for that.

    Use C++ then. Or if you're a hater and/or don't know how to keep footguns pointed away from your legs, use Rust.

    > less commonly used ones

    but tbqh why not Xojo?