One thing that he wrote in this 1986 article about the 1950s has remained true for several decades since:
> During my student days I had never heard of the 701, and this, I think, leads to an important point: The IBM 650 was the first computer to be manufactured in really large quantities. Therefore the number of people in the world who knew about programming increased by an order of magnitude. Most of the world’s programmers at that particular time knew only about the 650, and were unaware of the already extensive history of computer developments in other countries and on other machines. We can still see this phenomenon occurring today, as the number of programmers continues to grow rapidly.
(BTW, I wish someone would write an IBM 650 emulator on which we could try out programs like his "Number Perverter Demonstration Card" in the appendix. Some of Knuth's early programs are also preserved, and it would be wonderful to see them running.)
I don't know how good they are, and there may have been multiple IBM 650 models, but Google gives me several hits, including:
- https://opensimh.org/simdocs/i650_doc.html
- https://github.com/rsanchovilla/SimH_cpanel (adds front panel GUI to the above)
- https://github.com/snick-a-doo/IBM650
- https://code.google.com/archive/p/sim650 (incomplete)
I can't figure out how to use it, though :-) If anyone succeeds in running the (very short) “number perverter” program and turning “0123456789” into “9876543210”, I'd love to know.
(I'm not sure how far along the last two are.)
I really don't know what the modern-day equivalent is - a machine so limited that you can understand it completely, and develop programming skills by doing wizardly things with the limited resources.
One of my favorites is the ATtiny84/ATtiny85 - 8 KB of program memory, 512 bytes of RAM, an 8- or 14-pin DIP or SO package, no supporting circuitry needed (even the power cap is optional), and it runs from 1.8-5 V so you can connect it to a couple of batteries directly.
The device can be programmed in C or assembly. If programming in C, don't expect to be able to use regular C libraries: with just 512 bytes of RAM, a "char last_error[512]" will make your device run out of memory, so you normally write things from scratch. And 8 KB of code space is really not that big either. If programming in assembly, it's a RISC with 130 instructions, and there is no cache and no unexpected interrupts - so things like counting execution cycles by hand work reliably. Even in C, I find myself thinking: "OK, that is going to be X instructions for the prologue and Y instructions for the write, which means I have Z µs of latency total."
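To make that concrete, here is a minimal sketch (assuming avr-gcc and an 8 MHz ATtiny85; the names and numbers are illustrative, not from any particular project) of the kind of code this pushes you toward: byte-sized state instead of big buffers, and delays whose cycle counts you can still reason about by hand.

```c
#include <avr/io.h>
#include <util/delay_basic.h>

/* With 512 bytes of SRAM shared with the stack, something like
 *   static char last_error[512];   // would eat all of SRAM
 * is off the table, so state gets squeezed into a byte or two. */
static uint8_t last_error;           /* 1-byte error code (illustrative) */

int main(void)
{
    DDRB |= _BV(PB0);                /* PB0 as output */

    for (;;) {
        PORTB ^= _BV(PB0);           /* toggle the pin */

        /* _delay_loop_2() burns exactly 4 CPU cycles per iteration,
         * so 2000 iterations at 8 MHz is 1 ms - the sort of timing
         * you can count by hand on this chip. */
        _delay_loop_2(2000);

        (void)last_error;            /* keep the example self-contained */
    }
}
```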
The device has some peripherals on board (like a serial port and a PWM generator), so capability-wise it's closer to an entire old computer (like a ZX Spectrum) than to a single CPU (like a Z80). At the same time, there is no pre-programmed ROM, and all you need to understand the chip entirely is the datasheet (177 pages) and maybe the instruction-set manual (147 pages) - that's it; there are no multi-thousand-page supplemental docs.
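As a sketch of what "no pre-programmed ROM, just the datasheet" looks like in practice (register and bit names are from the ATtiny85 datasheet; the 50% duty value is just an example), hardware PWM on pin PB0 is a handful of register writes:

```c
#include <avr/io.h>

/* Timer/Counter0 in fast PWM mode, non-inverting output on OC0A (PB0). */
static void pwm_init(void)
{
    DDRB  |= _BV(PB0);                               /* OC0A pin as output       */
    TCCR0A = _BV(COM0A1) | _BV(WGM01) | _BV(WGM00);  /* fast PWM, clear on match */
    TCCR0B = _BV(CS00);                              /* no prescaler: ~31 kHz at 8 MHz */
    OCR0A  = 128;                                    /* roughly 50% duty cycle   */
}
```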
The best part is that these things are actually useful. For example, one of them is next to me right now, sniffing the signals that drive the LCD on a commercial CO2 meter and formatting them for IoT transmission. Capturing a ~2 MHz signal with an 8 MHz chip was non-trivial and required some neat tricks to get right.
Another example is various toys - I have a box with a dozen lights and a bunch of buttons and knobs. Pressing the buttons and turning the knobs changes the light pattern. Even adults often play with it.
There's also the social aspect (competing over what you can do with a program that fits on a single punched card). That was present in a room full of Commodore PET computers, with no other computing machines available to anyone there: come up with a wizardly hack and you immediately had an appreciative audience.
The interactive aspect is missing with ATtiny85-class microcontrollers. You already have to be pretty good at bare-metal optimization to use them; you can't easily learn it on those devices.
I see mind-bogglingly wizardly things being done with RP2040-class boards, but that's orders of magnitude more wizardry than what's described here. At least the "show it off" community exists.
When comparing with old machines, you need to look not just at the chip itself but at the whole package - the ATtiny85, plus the ICE, plus the host PC used to program it. You know how the original VAX-11 included a whole separate PDP-11 just for booting and diagnostics? It's the same with the ATtiny85: it needs a whole separate computer for program upload and debugging. The fact that this "program upload" computer is many orders of magnitude faster doesn't really matter.
As for the audience/social aspect, I don't think this is coming back - there is simply too much choice now. When the Commodore was the only option, everyone had to use it. But now some people like Python, some like small MCUs, some do websites, some do JS demos, some do robotics, etc. - so there is less chance you'll find a room full of homogeneous computers.
(And the RP2040's manual is 642 pages - substantially bigger - so one is less likely to know it all by heart. At least it's better than the STM32 family, where some devices have multiple thousands of pages of documentation.)
Nothing has quite got to me as much since. I thought I was Master Of The Universe!
Then I calmed down and learned 6502 and 8086 from books. Nowadays I can hardly remember anything about them, or about C or C++, which I've actually used to make money for most of my life. I'm nearly 72 now, so perhaps this is forgivable. I'd be interested to know how many young programmers learn an assembly language these days.
This comes up on HN quite a bit, without a universal answer. I suspect that each respondent answers through the lens of what they remembered, or liked, about the platforms that were formative for them.
Example: 1980s kids seem to have had a formative computer experience where the computer was a self-contained thing: no understanding of electronics needed, instant-on, built-in programming environment. There was also a rich ecosystem of accompanying offline media, e.g. books on BASIC, magazines with program listings, etc. They gave their full attention to the machine.
I also see people who grew up on the Raspberry Pi, and they have a different view, much more about hardware and pins and voltage. Equally valid, but much more electronics-y. It also has a rich offline ecosystem.
Of course, modern hardware and OSes don't get close to either, so the closest experience is probably Python in a browser, which really feels more like Learning for a Job than Fun with implicit Learning. The browser itself is an attention suck too - all too easy to switch away as soon as a little boredom sets in, and, as we know, boredom spurs creativity.
I don't know if this exists, but a Switch-like platform with HDMI out to the TV plus a built-in screen, a BT keyboard, and a minimal, barely-there OS might be one path. But it might still fail without the offline media. Maybe you sell it with an accompanying book of games you have to type in.
Yeah, I know. Get off my lawn.
It is also very possible (and profitable) to go down to 8-bit and 16-bit processors too, like the 8051.
A discussion of the joy he found programming a machine with 2000 words of memory and 34 instructions.