That got me thinking, “I wonder if anyone has done this on an oscilloscope?” and, oddly, I can't find anyone who quite has. DOOM objects being sprites rather than actual 3D models would limit the fidelity, but the scenery could be rendered at least. There are several examples of people managing to use a high-speed scope as a low-res monochrome raster device (scanning like a CRT monitor does and turning the beam on & off as needed).
I did find an example of Quake being done on a scope the way I was imagining: https://www.youtube.com/watch?v=aMli33ornEU - since all objects in Quake are actual 3D models, it even manages to give them some presence & shape.
EDIT: then I read the second half of this post and saw ScopeDoom! I'm surprised there are no earlier examples that are easy to find.
I pulled inspiration from this port to a Vectrex: https://www.youtube.com/watch?v=VuVnoqFF3II; there is a Wayback Machine link to the author's write-up credited in mine.
I have seen a lot of ports of DOOM to overpowered Windows-based test equipment, particularly the Keysight MXAs, but those are just using the instrument as a computer. Spectrum DOOM, though. Could it be done by taking snapshots of a waterfall plot?
Was never quite sure if I should raw XY it or soft-modem it so I could decode on a web page on a handy device.
How about analog raster scan? a.k.a. slow-scan TV? [0] Like how they returned the live television images from the Apollo missions. (They only had 1 MHz of bandwidth for everything - voice, computer up and downlink, telemetry, and TV. Standard analog broadcast TV was 6 MHz. So they reduced the scan rate to 10 frames per second instead of 60, and halved the horizontal line resolution -- that could fit in 500 kHz.)
Most modern SSTV standards are super-narrowband, designed to fit into just a few hundred Hertz for amateur radio. But what if you had the full 20 kHz of bandwidth of a nice audio channel? With 100 horizontal lines per frame, and 1 frame per second -- that is about 200 cycles per horizontal line, or enough to resolve, in theory, 100 vertical lines on each horizontal line. I.e., 100 x 100 pixels (ish) at 1 fps.
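The arithmetic above checks out; here's a quick back-of-envelope sketch of it, assuming (conservatively) about two cycles of bandwidth per resolvable horizontal pixel:

```python
# Back-of-envelope check of the audio-SSTV numbers above.
# Assumption: ~2 cycles per resolvable horizontal pixel
# (conservative; Nyquist would allow somewhat more).
bandwidth_hz = 20_000      # a nice audio channel
lines_per_frame = 100
frames_per_sec = 1

lines_per_sec = lines_per_frame * frames_per_sec   # 100 lines/s
cycles_per_line = bandwidth_hz / lines_per_sec     # 200 cycles per line
pixels_per_line = cycles_per_line / 2              # ~100 resolvable pixels

print(cycles_per_line, pixels_per_line)  # 200.0 100.0
```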
It looks like a web page with audio-input permission can be expected to sample at 48 kHz. I wonder what the quality would be like from a cable bodged off a spare pin.
A little webapp running on your phone could actually do some nifty on-the-fly display.
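The browser version would obviously be JavaScript, but the decode step is simple enough to sketch in Python: assume the left channel carries X and the right carries Y, both normalized to [-1, 1], and each captured sample pair becomes one plotted point (names here are illustrative, not from any real codebase):

```python
def samples_to_points(left, right, width=256, height=256):
    """Map normalized [-1, 1] stereo XY samples to integer pixel coords.
    Left channel drives X, right drives Y; Y is flipped so +1 lands at
    the top of the canvas, matching typical scope XY orientation."""
    points = []
    for lx, ry in zip(left, right):
        px = round((lx + 1) / 2 * (width - 1))
        py = round((1 - (ry + 1) / 2) * (height - 1))
        points.append((px, py))
    return points

# Two samples: bottom-left and top-right corners of a 256x256 canvas.
print(samples_to_points([-1.0, 1.0], [-1.0, 1.0]))  # [(0, 255), (255, 0)]
```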
I've written my own s-expr library to inject footprints and symbols, and it's a huge pain and flaky. I'd love to move to something a bit more fleshed out and official.
I'd be keen to have a look at your s-expr library; it likely has some overlap and utility for one of my other projects, https://www.circuitsnips.com , which is like Thingiverse for electrical circuits. I had to figure out how to feed the embedded KiCanvas renderer a full sheet, as it can't handle subcircuits, while still letting the user export subcircuits to the clipboard.
When I first shared CircuitSnips with the KiCad discord, the KiCad 9+ design block feature was brought up, which might be of interest to you as well?
KiDoom I don't fully get. The website says "All components connected to a shared net; the PCB could be sent to a fab house (it just wouldn't do anything useful)" but I don't see any of the component pins hooked up in the demo video.
Something that actually connects the components and routes the traces in a way that keeps the 3D environment somehow recognizable would've been cool; otherwise this is kind of just piping draw commands into a <canvas> from a hook in the DOOM renderer. KiCad just happens to be a complicated line-drawing app.
Don't get me wrong, still a fun little hack. But some more PCB-ness would make it even cooler.
It might be that the website undersells it and there's more PCB-ness than I can detect in the visuals. Is it using layers, and vias between them, for the z-sorting or something? Both the website and the commits have a distinct AI-slop feel and are somehow not very detailed on this part.
As for the drawing: we pulled the vectors as a list out of the C side, then used a painter's algorithm in the Python code, drawing back to front by distance from the player.
We then treated them as polygons so we could work out occlusion and hide things behind walls, but the data piped to KiCad/the headphone jack is just the vectors/wireframes/outlines, filtered down to what's left after the occlusion test.
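A minimal sketch of that back-to-front pass (names are assumed for illustration; the real code pulls its segment list from DOOM's internals):

```python
import math

def painters_order(segments, player_pos):
    """Sort wall segments far-to-near so closer geometry is drawn last
    and paints over anything behind it (the classic painter's algorithm).
    Each segment is ((x1, y1), (x2, y2)) in map space; distance is taken
    from the segment midpoint to the player position."""
    px, py = player_pos

    def dist(seg):
        (x1, y1), (x2, y2) = seg
        return math.hypot((x1 + x2) / 2 - px, (y1 + y2) / 2 - py)

    return sorted(segments, key=dist, reverse=True)

near = ((0, 0), (2, 0))
far = ((0, 10), (2, 10))
print(painters_order([near, far], player_pos=(1, 0)))  # far drawn first, near last
```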
So yep, using footprints as sprites was my (clunky) nod to electronics, as I didn't like the idea of drawing polygons. KiCad can definitely handle them, but they're less fun.
Now, if I'm really bored over Christmas, I may port it to Fusion 360, which would give it a 3D engine.
I 100% abused Claude Code to get here, and I tend to get it to write the bones of a write-up, which I then populate with my own thoughts, else I can't get started. We are worryingly becoming more aligned.
The part I love most is how many unrelated systems had to cooperate:
- extracting geometry directly from DOOM’s drawsegs/vissprite internals
- mapping sprite classes to physical component footprints
- running real-time updates through KiCad’s object model without triggering a full recompute
- and then running the same vector stream to an oscilloscope via an audio DAC
That’s a really clever chain of “use the tool for something it was never designed to do.”
ScopeDoom might end up being the more interesting long-term direction; vector displays force you to think about rendering differently, and there’s something poetic about DOOM being rendered as literal analog voltage traces.
If you ever take it further, the combination of:
- a faster DAC (or a multi-kHz arbitrary waveform generator)
- a true analog persistence-phosphor scope
- and dynamic sprite simplification
…could get you surprisingly close to a smooth vector-shooter aesthetic.
Either way: great hack. The world needs more playful abuse of serious tools.
In lieu of an intensity channel for the scope implementation, slowing down the vector drawing for selected segments would make them 'brighter' on a proper scope, but I don't know if my relatively cheap Siglent would be able to distinguish between them.
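One way to fake that without a Z input, as a rough sketch: repeat the samples for segments you want emphasized, so the beam dwells on them longer (whether a given DSO actually shows that as visibly brighter is another question):

```python
def emphasize(points, dwell=4):
    """Fake an intensity channel by sample repetition: the beam lingers
    `dwell` times longer on each point, which on a phosphor display (or a
    DSO with intensity grading) reads as a brighter trace. The cost is
    frame rate, since emphasized segments eat more of the sample budget."""
    out = []
    for p in points:
        out.extend([p] * dwell)
    return out

print(emphasize([(0, 0), (1, 1)], dwell=2))  # [(0, 0), (0, 0), (1, 1), (1, 1)]
```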
I've got some NI DACs here, or could use an MCU; the practical limit of the DAC on a Teensy 4.1 is reportedly around 1 MHz, though for me I don't think there's a practical reason to go that fast.
Although, a native implementation on a Teensy, with the WAD on an SD card and direct input, no computer at all, is very tempting...
Back on track: I've got to spend a bit more time focusing on work and other projects, but I have my next ludicrous port planned already.