54 points by pncnmnp 6 days ago | 6 comments
  • svat 5 days ago
    Knuth's tribute to his first love; a delightful article which I've read multiple times.

    One thing that he wrote in this 1986 article about the 1950s has remained true for several decades since:

    > During my student days I had never heard of the 701, and this, I think, leads to an important point: The IBM 650 was the first computer to be manufactured in really large quantities. Therefore the number of people in the world who knew about programming increased by an order of magnitude. Most of the world’s programmers at that particular time knew only about the 650, and were unaware of the already extensive history of computer developments in other countries and on other machines. We can still see this phenomenon occurring today, as the number of programmers continues to grow rapidly.

    (BTW I wish someone would write an IBM 650 emulator on which we could try out programs like his "Number Perverter Demonstration Card" in the appendix. Some of Knuth's early programs are also preserved, and it would be wonderful to see them running.)

  • MarkusWandel 5 days ago
    Basically the joy of programming on the bare metal - just a lot more metal in those days. Those of us decades younger got our first experience with, say, a Commodore PET, which, despite running a programming language, was still pretty bare metal. You poked at this memory location, and that happened. If it didn't happen fast enough, learn some machine language.

    I really don't know what the modern-day equivalent is - a machine so limited that you can understand it completely, and develop programming skills by doing wizardly things with its limited resources.

    • theamk 5 days ago
      A small micro like the Atmel (now Microchip) AVR series.

      One of my favorites is the attiny84/attiny85 - 8 KB of program memory, 512 bytes of RAM, an 8- or 14-pin DIP or SO package, no supporting circuitry needed (even the power cap is optional), and it runs from 1.8-5 V so you can connect it directly to a couple of batteries.

      The device can be programmed in C or assembly. If programming in C, don't expect to be able to use regular C libraries: with just 512 bytes of RAM, a "char last_error[512]" will make your device run out of memory, so you normally write things from scratch. And 8 KB of code space is really not that big either. If programming in assembly, it's a RISC with 130 instructions, and there is no cache and no unexpected interrupts - so things like counting execution cycles by hand work reliably. Even in C, I find myself thinking: "OK, that is going to be X instructions for the prologue and Y instructions for the write, which means I have Z µs of latency total."
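
      For a concrete feel, here is a minimal sketch of the kind of code these constraints push you toward. This is illustrative only, assuming an ATtiny85, avr-gcc, and a build with -Os; the pin choice and F_CPU value are arbitrary assumptions, not taken from the comment's projects:

        /* Minimal ATtiny85 sketch: no big buffers, just a status byte,
           and a busy-wait whose cost you can reason about in cycles. */
        #define F_CPU 8000000UL              /* assumed clock: internal RC at 8 MHz */
        #include <avr/io.h>
        #include <util/delay.h>

        static volatile uint8_t last_error;  /* 1 byte stands in for a 512-byte error string */

        int main(void)
        {
            last_error = 0;                  /* placeholder for real status handling */
            DDRB |= _BV(PB0);                /* PB0 as output */
            for (;;) {
                PINB = _BV(PB0);             /* writing 1 to PINB toggles PB0 on this core */
                _delay_us(10);               /* 10 us = 80 cycles at 8 MHz, easy to budget */
            }
        }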

      The device has some peripherals on board (like a serial port and a PWM generator), so capability-wise it's closer to an entire old computer (like a ZX Spectrum) than to a single CPU (like a Z80). At the same time, there is no pre-programmed ROM, and all you need to understand the chip entirely is the datasheet (177 pages) and maybe the instruction set manual (147 pages) - and that's it, there are no multi-thousand-page supplemental docs.

      The best part is that those things are actually useful. For example, one of them is next to me right now, sniffing the signals that drive the LCD on a commercial CO2 meter and formatting them for IoT transmission. Capturing a ~2 MHz signal with an 8 MHz chip was non-trivial and required some neat tricks to get right.

      Another example is various toys - I have a box with a dozen lights and a bunch of buttons and knobs. Pressing the buttons and turning the knobs changes the light pattern. Even adults often play with this.

      • MarkusWandel 5 days ago
        The love letter to the IBM 650 specifically includes the interactive side of computing with it: having the machine to yourself, being able to single-step through programs, patch them, and so on.

        It also includes the social aspect (competing in what you can do with a program that fits on a single punched card). That was present in a room full of Commodore PET computers, with no other computing machines available to anyone there. Come up with a wizardly hack and immediately have an appreciative audience.

        The interactive aspect is missing with attiny85-type microcontrollers. You already have to be pretty good at bare-metal optimization to use them; you can't easily learn it on those devices.

        I see mind-bogglingly wizardly things being done with RP2040-type boards, but that's orders of magnitude more wizardly than what's described here. At least the "show it off" community exists.

        • kevin_thibedeau 5 days ago
          I have a shell interface (line editing, command history) that runs on the modern ATtiny 2-series. It is easy to have nice things on limited platforms.
        • theamk 4 days ago
          The attiny85 is pretty interactive: it has full debugging capabilities (via the RST pin no less, so it doesn't take away a precious GPIO). And with compile-and-flash cycle times of <30 seconds, printf-based debugging (or, more often, LED+button-based debugging) becomes very feasible too.
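
          LED-based debugging can be as simple as blinking an error code and counting the flashes - a minimal sketch, assuming an ATtiny85 with an LED on PB1 (the pin and clock choices are arbitrary assumptions):

            /* Blink an error code on PB1: blink_code(3) flashes three times. */
            #define F_CPU 1000000UL        /* assumed clock: internal RC, default prescaler */
            #include <avr/io.h>
            #include <util/delay.h>

            static void blink_code(uint8_t code)
            {
                DDRB |= _BV(PB1);          /* LED pin as output */
                while (code--) {
                    PORTB |= _BV(PB1);     /* LED on */
                    _delay_ms(150);
                    PORTB &= ~_BV(PB1);    /* LED off */
                    _delay_ms(350);
                }
            }

            int main(void)
            {
                for (;;) {
                    blink_code(3);         /* "error 3", repeated so it's easy to count */
                    _delay_ms(1500);
                }
            }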

          When comparing with old machines, you need to look not just at the chip itself, but at the whole package - attiny85 + ICE + the host PC used to program it. You know how the original VAX-11 included a whole separate PDP-11 computer just for booting and diagnostics? It's the same with the attiny85: it needs a whole separate computer for program upload and debugging. The fact that this "program upload" computer is many orders of magnitude faster does not really matter.

          As for the audience/social aspect, I don't think that is coming back - the choice is simply too wide now. When the Commodore was the only option, everyone had to use it. But now some people like Python, some like small MCUs, some do websites, some do JS demos, some do robotics, etc. - so there is less chance that you'll find a room full of homogeneous computers.

          (And the RP2040's manual is 642 pages, substantially bigger, so one is less likely to know it all by heart. At least it's better than the STM32 family, where some devices have multiple thousands of pages of documentation.)

    • zabzonk 5 days ago
      It was a joy. I remember in the late 1970s, after reading two textbooks, I figured out how to write a little program in Z80 assembler (and how to use the assembler), and shortly after how to read and debug the machine code the assembler produced (and, a bit later, how to use the very spooky linker).

      Nothing has quite got to me as much since. I thought I was Master Of The Universe!

      • sedatk 5 days ago
        I had the same euphoric experience but with an opcode table published in a magazine instead of an assembler. :)
        • zabzonk 5 days ago
          Indeed. After I had learned Z80 assembler from books, I managed to teach myself M6809 programming from the (very good) Motorola spec sheet. I was Master Of The Multiverse!

          Then I calmed down and learned 6502 and 8086 from books. And nowadays I can hardly remember anything about them, or about C or C++, which I've actually used to make money for most of my life. I'm nearly 72 now, so perhaps this is forgivable. I'd be interested to know how many young programmers learn an assembly language these days.

    • kjellsbells 5 days ago
      > I really don't know what the modern day equivalent is

      This comes up on HN quite a bit, without a universal answer. I suspect that each respondent answers through the lens of what they remembered, or liked, about the platforms that were formative for them.

      Example: 1980s kids seem to have had a formative computer experience where the computer was a self-contained thing: no understanding of electronics needed, instant on, a built-in programming environment. There was also a rich ecosystem of accompanying offline media, e.g. books on BASIC, magazines with program listings, etc. They gave their full attention to the machine.

      I also see people who grew up on the Raspberry Pi, and they have a different view, much more about hardware and pins and voltages. Equally valid, but much more electronics-y. It also has a rich offline ecosystem.

      Of course, modern hardware and OSes don't get close to either, so the closest experience is probably Python in a browser, which really feels more like Learning for a Job than Fun with implicit Learning. The browser itself is an attention suck too - all too easy to switch away from as soon as a little boredom sets in, and, as we know, boredom spurs creativity.

      I don't know if this exists, but a Switch-like platform with HDMI out to the TV plus a built-in screen, a BT keyboard, and a minimal, barely-there OS might be one path. But it might still fail without the offline media. Maybe you sell it with an accompanying book of games you have to type in.

      Yeah, I know. Get off my lawn.

    • ferguess_k 5 days ago
      I bought an STM32 dev board and I'm pretty certain I can do bare-metal programming on it too. It does have its ROM pre-loaded with the bootloader and maybe something else, but I don't have to use any library if I don't want to.
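
      As a rough sketch of what "no library" looks like - assuming an STM32F4-class part (the register addresses below come from that family's memory map and would differ on other STM32s), and leaving out the startup code and linker script you'd still need:

        #include <stdint.h>

        /* Raw register definitions - assumed addresses for an STM32F4-class chip. */
        #define RCC_AHB1ENR  (*(volatile uint32_t *)0x40023830u)  /* peripheral clock enable */
        #define GPIOA_MODER  (*(volatile uint32_t *)0x40020000u)  /* GPIOA mode register */
        #define GPIOA_ODR    (*(volatile uint32_t *)0x40020014u)  /* GPIOA output data register */

        int main(void)
        {
            RCC_AHB1ENR |= 1u;                                      /* clock GPIOA */
            GPIOA_MODER = (GPIOA_MODER & ~(3u << 10)) | (1u << 10); /* PA5 = output */
            for (;;) {
                GPIOA_ODR ^= (1u << 5);                             /* toggle PA5 (LED on many Nucleo boards) */
                for (volatile uint32_t i = 0; i < 100000; i++) ;    /* crude delay */
            }
        }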

      It is also very possible, and profitable, to go down to 8-bit and 16-bit processors, like the 8051.

    • beng-nl 5 days ago
      Maybe the modern equivalent is an FPGA? It's a flawed analogy, but it is an environment where you program on the bare metal, resources are limited, and you may have to make every one count.
  • yodon 5 days ago
    > Written by Donald Knuth

    A discussion of the joy he found programming a machine with 2000 words of memory and 34 instructions.

    • fipar 5 days ago
      I always loved his dedication of TAOCP to this computer, and this article is a great way to learn more about that particular relationship (between him and the computer).
  • bableck 5 days ago
    I remember having to write an IBM 650 emulator in IBM 360 assembler language in 1978/1979 as an assignment at the University of Iowa. We had to account for the drum rotation speed and where instructions were placed on the drum for efficiency. At that time, they did not yet offer a degree in computer science, so my major was mathematics with a minor in computer science. I took every available computer science class and had to take a “filler class” in COBOL from the College of Business. I went on to my first job after graduating in 1970, writing Assembler language code for one of the first online transaction processing systems in CICS, for a hospital patient registration system. It was an amazing time and experience.
  • mmpollard 5 days ago
    Tangentially related: one of the coolest books I've ever read is called "The Dream Machine" -- it goes through a ton of detail around opinions on projects like the IBM 650 and others at the time... Neat stuff.

    https://press.stripe.com/the-dream-machine

  • oliviergg 5 days ago
    Very interesting read. I think every generation can have the same nostalgia for their first computer/language! I love the part where he describes the trick to reverse the ten digits 0123456789 with a program that fits on a single card.