To Do:
- make it easier to quit Emacs
- change the temporary directory names we've been using - bin sounds like it's for unwanted files, dev sounds like it's for development, etc needs a better name. It's silly
Etc is strange, yeah.
But the computer field just shrugs and keeps doing whatever it was doing. Given what the hackers of the 60s and 70s did on crap machines with no resources, you'd think people would want to review what they can teach modern developers.
Even essential films are lost; some burn up in fires; only a few private groups have tried to save and restore the most important ones. For example, I read about one legendary American silent film that was thought lost forever and then turned up somewhere in Spain, in a library IIRC (they had to translate the Spanish intertitles back to English).
It happens in music and other fields. Perhaps the artists and businesses are focused on the present, don't see their work as historic, and move on to the next thing. What happens to the old projects you work on - do you preserve them carefully, or are they just left in whatever state they were in at the end?
She mentioned that even if you could find one of the machines in working order, keeping it running required routine maintenance, and that they were down to essentially one guy, nearing retirement age, who had the skills and parts to keep one running. So they were in a race against time to figure out which masters to convert.
The problem gets even thornier for sessions that were recorded using software like ProTools, which has been around in one form or another for almost 40 years, has gone through countless revisions of its project file format, and has a complicated relationship with specialty audio hardware and software plugins.
It seems like there's a general awareness of the problem now, and good studios are taking measures to archive sessions in ways that allow them to be imported in the future, but in the meantime there are two decades' worth of recordings at risk, even if their media haven't been lost or corrupted. I guess if nothing else it's a cool opportunity for people who like to hack on systems of this type.
Hell, perhaps it's good it's "forgotten", since it's what's powering the latest versions of Windows and other proprietary OSes.
On BASIC, there's the games example, Basic Computer Games, which has been made into a repo on GitHub, and some people are recreating those games in modern languages since it's a trivial task (I'm doing ports myself to JimTCL).
https://github.com/GReaperEx/bcg
You can actually use any language, even sh, but for these cases JimTCL is ridiculously easy to use.
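For anyone curious what such a port can look like, here's a minimal sketch of the classic number-guessing game from Basic Computer Games in Jim Tcl. It assumes Jim Tcl's built-in rand command, which as I understand it returns an integer in [0, max); adjust if your build differs.

    # "Guess" from Basic Computer Games, sketched in Jim Tcl.
    # Assumes Jim Tcl's built-in [rand] command; the exact range
    # semantics ([0, max) assumed here) may differ between versions.
    set target [expr {[rand 100] + 1}]   ;# secret number, 1..100
    set tries 0
    puts "I'm thinking of a number between 1 and 100."
    while {1} {
        puts -nonewline "Your guess? "
        flush stdout
        set guess [gets stdin]
        if {![string is integer -strict $guess]} {
            puts "Numbers only, please."
            continue
        }
        incr tries
        if {$guess < $target} {
            puts "Too low."
        } elseif {$guess > $target} {
            puts "Too high."
        } else {
            puts "You got it in $tries tries!"
            break
        }
    }

The BASIC originals lean heavily on GOTO; a structured port like this comes out about the same length, which is part of what makes them such easy exercises.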
There is a massive interest in older computing, both from a programming and from an art perspective. The demo scene thrives on it.
I have seen a lot of interest here on HN, many times and recently, in ancient 90s arcade machines with "unbeatable" encryption, etc.
There is massive interest in reverse engineering old games to a bit-perfect level.
Retrocomputing is huge, but it rarely goes much older than CP/M.
Hackers (1995) is fictional but a cult classic. Freedom Downtime (2001) is about Mitnick and hacking culture. There are smaller documentaries too: that one about WikiLeaks, that one about Cambridge Analytica. There are books like The Dream Machine (M. Mitchell Waldrop), Unix: A History and a Memoir (Brian W. Kernighan), and The UNIX-HATERS Handbook (Simson Garfinkel et al.). There's http://folklore.org about the early days at Apple. Asianometry has https://www.youtube.com/watch?v=Ffh3DRFzRL0, a 20-minute bit about the Unix wars.
The source code to the original Microsoft MS-DOS is at https://github.com/microsoft/MS-DOS. The Anarchist Cookbook is on Kindle, and 2600's digital back issues (https://www.2600.com/Magazine/digital-back-issues) go back to 2010.
DEF CON got too big for the Riv and the Sahara, and is now at the LVCC. Yeah, it's not the same. It's never going to be the same, but some still gather for their yearly mecca, watch Hackers, and get drunk in hotel suites paid for by corporate sponsors. Others stay home for various reasons.
Do we still keep what we're doing? I mean, I don't program in Z80 assembly anymore. There are still classes in my code, but the focus on OOP isn't what it once was. I'm not sure I want to call it progress, but I don't program Win32 applications anymore. I can spin up a web app with an LLM in an afternoon and have it web-scale to the whole world in less time than it used to take to get a computer racked in a colo.
It's not 1979; the cable I use to go from USB-C to HDMI is more powerful than the computer that took us to space. By like, a million times.
Look, I'm not saying we shouldn't respect our elders. By this point, though my beard's not yet grey, relatively speaking I am an elder. I learned to program from paper books. Before ChatGPT, before Stack Overflow, before Google. There are some here that predate me by decades. If you're competing with a $10 million Oracle DB system, and going from 6 assembly instructions to 5 in the inner loop will eke out that extra percent of performance and win you the contract, then by all means sit down, roll up your sleeves, and hand-optimize the assembly to figure out how to get rid of that one instruction.
The joke is oft made that other fields stand on the shoulders of giants, while in computer science, we stand on their toes. And it's not wrong. I can't wait for the next new language to pop up, reimplement a DAG solver for its package management woes, and invent a better sandbox for running untrusted code. That's still an unsolved problem. If this stuff interests you, the Computer History Museum in Mountain View, California is worth the visit. The only problem is that at the end of the tour are the computers I grew up with, which has a certain way of making a fella feel old.
The travesty happening right now, in the wake of Paul Allen's death, is the debacle with his surviving sister and the Living Computers Museum in Seattle.
There's TUHS, too.
On AI and such... errors accumulate over time, exponentially. Beware.
Don't expect it to do much, but it's fascinating if you're interested in OS history.
The original VMS system manager who moved from 7000 series hardware to emulation was somewhat inquisitive, and we did install VMS 7 on simh. He retired and passed away some years ago, and none of his replacements have wanted to touch simh. I find that apathy appalling.
All downhill from here.
Maybe this explains why we have to call "creat" to "create" a file.
1964, with the IBM System/360's 8-bit bytes.
On one hand, I think we need to preserve this relic as we did Homer's poetry, simply because it deserves it.
On the other hand, I think we won't (and shouldn't) try to preserve everything humanity has ever written in an infinite present. For what purpose?
It's also critical for understanding how and why engineering choices were made when documenting the evolution of computing. Instruction sets, processor design, programming languages, computer culture, corporate trends: all of those things have roots in design decisions, and the software preserved on tapes like this is a sort of DNA.
The effort needed to incorporate the information is dropping; with AI you can run analyses and extract the important principles, and whatever principles govern optimization and performance under constraints will be useful on a permanent basis.
Also, what risk is there to preserving?
I just listened to a great new episode (podcast) of The Truth (audio drama anthology series, they’re fantastic). It was called “The Joke.” Basically this archivist finds an old hard drive with a dumb pun joke - turns out she didn’t even understand it because jokes were no longer allowed in society. Kind of has an Equilibrium vibe but more bureaucratic and less “killing people for feeling.” Anyway the joke itself takes on great importance as a result. Bit of a dramatic comparison, but you see what I’m driving at.