> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
(I also think it's misleading to use the term "computer" for things like differential analyzers, just as it's misleading to call a person who adds up numbers a "computer", even though both usages were well established before the invention of the inherently digital devices we call "computers" today. But that's a different discussion.)
But that's a very cool story. Do you remember which model of analog computer it was?
Though even if you told me the actual name, I really would not recognise it; it was so long ago.
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
- As you note, signal conditioning to stuff things into an ADC
- Anywhere firmware is viewed as a liability (often medical or other hi-rel stuff)
- Existing proven designs (do not underestimate this sector!)
- Anywhere the cost of the signal conditioning circuitry might be comparable to the cost of just doing it outright in analog. This is mostly the low-cost realm, more rarely ultra-low-power, but sometimes you see it in other places too.
- Line power supplies happen to be all of the above, so you see plenty of analog there
You used to see analog in high-performance stuff (ultra-high-speed/RF or ultra-high-bit-depth), but this has mostly gone into either digital or whackadoodle exotica. Like frontends for those 100GHz oscilloscopes, which are both!
The reverse path (DAC) is less common; maybe 10% of the time you need a good DAC for signal generation. It's more hardcore analog, and a good DAC is harder to design.
Once you get the hang of the basics at audio and low RF frequencies, you can then set up GNU Radio, which works with your computer's audio I/O. Maybe add a $30 RTL-SDR dongle, and the next thing you know, you've got a bit of RF under your belt.
You can play around with analog programming of a sort with modular synthesizers. It's a pretty neat way to dip your toe into analog signal processing.
Another couple of ways to get started with analog signal processing:
- Build an AM radio from transistors. There are lots of tutorials out there.
- Simulate circuits with Falstad's circuit.js. There are some interesting analog circuits already in the set of examples, like https://tinyurl.com/24gccg7p.
- Build an Atari Punk Console.
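If you'd rather see what that transistor AM radio is doing before you solder it, here's a toy sketch of AM modulation and a diode-plus-RC envelope detector in plain Python. All the numbers (sample rate, carrier, tone, filter constant) are made up for illustration:

```python
import math

FS = 48_000          # sample rate (Hz)
F_CARRIER = 6_000    # carrier frequency (Hz)
F_AUDIO = 300        # audio tone (Hz)

def am_signal(n):
    """AM: carrier scaled by (1 + m*audio), modulation index m = 0.5."""
    t = n / FS
    audio = math.sin(2 * math.pi * F_AUDIO * t)
    carrier = math.sin(2 * math.pi * F_CARRIER * t)
    return (1 + 0.5 * audio) * carrier

def envelope(samples, alpha=0.05):
    """Envelope detector: rectify, then a one-pole low-pass,
    like the diode + capacitor in a crystal radio."""
    out, y = [], 0.0
    for s in samples:
        rectified = abs(s)            # the "diode"
        y += alpha * (rectified - y)  # the "RC filter"
        out.append(y)
    return out

sig = [am_signal(n) for n in range(FS // 10)]  # 100 ms of signal
env = envelope(sig)
# After the filter settles, env wobbles at 300 Hz: the recovered audio.
```

Plot `env` and you'll see the original 300 Hz tone riding on a DC level; that's the whole trick behind the simplest receiver on the list.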
You can get very, very good op-amps very cheaply these days. Some of them even still come in through-hole packages. This makes it possible to build interesting audio synthesizer circuits for pennies that would have required significant money outlay in the 70s.
Most coders in my vicinity are interested in woodworking, is that analog? I think not.
To expand a bit, since my day job involves this stuff, physical stimuli are always analog. Even the discrete energy levels of an atom make their transitions in continuous time. Yet there are good reasons to do virtually all computation in the digital domain, where "noise immunity" allows processing to occur without the introduction of additional noise, and you enjoy all of the other benefits of computer programming.
These days, the job of the analog person is often to understand the physics of the quantity being measured, and the sensor, and to get a signal safely to the front end of an analog-to-digital converter.
Now, the irony is that I actually spend most of my time working in the digital domain. The reason is that analysis of the digital data stream is how I know that my analog stuff is working, and how I optimize the overall system. So if you watched me work for a week, you'd notice that I actually spend a fair portion of my time coding. I just don't write software for other people to use. That's another department, and their work usually starts after mine is done.
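As a toy illustration of that "get a signal safely to an ADC front end" job, here's the kind of back-of-the-envelope arithmetic involved in picking a single-pole anti-aliasing RC filter. The bandwidth, sample rate, and capacitor value below are invented round numbers, not from any particular design:

```python
import math

f_signal = 1_000      # Hz, highest frequency we care about (assumed)
f_sample = 50_000     # Hz, ADC sample rate (assumed)

# Put the -3 dB corner a decade above the signal to keep droop small...
f_corner = 10 * f_signal

def gain_db(f, fc):
    """Magnitude response of a single-pole RC low-pass, in dB."""
    return 20 * math.log10(1 / math.sqrt(1 + (f / fc) ** 2))

# ...then check what that buys you at the Nyquist frequency.
nyquist = f_sample / 2

# With a common C = 10 nF, R = 1 / (2*pi*fc*C).
C = 10e-9
R = 1 / (2 * math.pi * f_corner * C)

print(f"R = {R:.0f} ohm, corner {f_corner} Hz")
print(f"droop at {f_signal} Hz: {gain_db(f_signal, f_corner):.2f} dB")
print(f"attenuation at Nyquist: {gain_db(nyquist, f_corner):.1f} dB")
```

Note the punchline: one pole only buys about 8.6 dB at Nyquist here, which is why real front ends use higher-order filters, oversampling, or both.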
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
Woodworking can be analog if the wood shapes and positions (and maybe velocities, etc.) are used to quantitatively represent something other than the wood itself, as an analogue to those quantities. For example, you can carve some wooden cams to drive a little automaton, or you can make a clock out of wood gears, where the angles of rotation of the gears represent the amount of time that has passed. But this article is specifically about electronics.
I could probably be described as living in the "analog" domain, as a physicist working for a company that makes measurement equipment. Naturally, this could be an ingrained bias, but I've formed the impression that something about getting your hands dirty confers the intuition needed to work productively in this domain. You need to experience being proven wrong by mother nature, over and over again.
Also, if you're sitting at your screen all day, nobody's going to pull you into the loop. It's quicker to just do that stuff ourselves than to explain it to someone.
So I agree with everything else in the article, because I love analog and love coding. But come on, join us in the lab.
Why? Because the dipshits in leadership decided to project the revenue growth during the chip shortage as a straight line for the next 10 years.
Looks like those same dipshits decided the best course of action is to get their soft skulled alumni to write some blog posts to try to herd more cattle into the grinder.
It is somewhat ironic that the single profession AI is best at replacing seems to be software engineering.
So telling people to move over to analog will depress that job market even more than it already is.
Smart things drive me completely insane and I find peace with things that just work without a wifi connection or firmware of any kind.
The biggest difference is that software is just closer to "the product". When something goes wrong on a typical embedded device, the first thing they say isn't "well, better bring in exmadscientist to redesign the board", it's "let's fix this in firmware". And so on and so forth. Most electrical projects just aren't, directly, the project. There's a big layer of software in between, and that software becomes the face of the product or even business and captures so much of the mindshare.
The other big reason is simply that the complexity of software is unbounded. For me, there are only so many parts I can stuff onto a board. But software has no such limits. I was just looking at 2.5GbE Ethernet switch chips for a hobby project -- a hobby project -- and concluded that they weren't bad at all and would take me somewhere around 40 hours to deal with, start to finish. That's for one of the nastier things around in your typical consumer environment (short of a 5GHz CPU) and represents a tremendous amount of investment on so many levels to get things to be not only possible, but down to the level that a seniorish engineer can just do it that quickly.
In contrast, a dinky web interface to manage the stupid thing would also probably be around 40 hours (less if crappy, way more if done "to modern standards"). Which is kind of insane, when you think about how many Gbps SERDES links are in each project! But software can do whatever it feels like, while the hardware side necessarily has constraints on it, so this is what we get.
However, I also think things are pretty imbalanced right now: SWE is somewhat overpaid, and will correct down over time. So it's not a great career to leverage the farm against, though I'd say it's never going to pay worse than EE or ME.
The workers owning the means of production doesn't seem ironic at all to me in this case; it's textbook Marxist economic theory. The means of production for software are, from Marx's point of view, your laptop and maybe a colo box. When you own (or rent!) them, you aren't alienated from them (in the purely objective sense of https://en.wikipedia.org/wiki/Marx%27s_theory_of_alienation#... rather than any emotional sense) and consequently you own the product. By contrast, when you depend on your employer for access to the means of production, they own the product of your labor.
Open-source tooling, based on the idea that "software should not have owners" (https://www.gnu.org/philosophy/why-free.html), enables every worker to own not only their own computer but their own compiler, linker, etc., rather than depending on angel investors to invest the capital necessary to license them from Computer Associates. You are precisely correct there.
Where my analysis departs from orthodox Marxism (perhaps because I am in fact a liberal) is that I see this as a question of bargaining power rather than a Hegelian mystical thing.
Civil engineers have very little bargaining power because the state power necessary to build a highway is very scarce indeed, while the expertise required to design it is relatively abundant.
Electrical engineers have more bargaining power because they can choose between many potential employers, who have to compete with one another for their relatively scarce skills, but bringing a new electronic product to market still requires a significant investment and several months, if not a year or more. For more advanced electronics like submillimeter ASICs, we're probably talking about several years and tens of millions of dollars.
Programmers have enormous bargaining power because they can bring a salable product to market over the weekend with an investment of a few hundred dollars. Or a few thousand if they're targeting the iPhone.
So I don't think collective awareness and action are what's going on here. I think individual programmers are in a better bargaining position than individual electrical engineers, and consequently they get better bargains individually, without functioning in a fashion akin to a guild. (And collectively owning the means of production, as orthodox Marxism prescribes, would not provide the same benefits. Observably it did not, whether in the Soviet Union and Mao's China or when US retirees owned the majority of the stock market through their pension funds.)
Incidentally, there have been a lot of innovations over the past 25 years or so that have greatly dropped the investment required to bring a new web service online. Sourceforge and later GitHub and then GitLab eliminated the need to spend a week configuring a server to support a software team. Rackspace and then Amazon Web Services eliminated the need to buy the server, haul it over to the colo, and maybe commit to a service contract. MySQL (now MariaDB), Postgres, and SQLite eliminated the need to license Oracle. (Linux had already eliminated the need to buy a Solaris license 25 years ago, but a lot of people hadn't noticed yet.)
Companies like JLCPCB, PCBWay, and OSHPark seem like they're sort of trying to do the same thing with PCB products, FPGAs and especially Yosys are doing the same thing with digital designs, and companies like Skywater, (the nonprofit) IHP, and Matt Venn (Tiny Tapeout) seem to be trying to do the same thing with ASICs, including mixed-signal ASICs. (I'd list Efabless here, but they seem to have gone out of business last week.)
But being able to pay US$10 for one-week turnaround on a stack of prototype PCBs isn't going to replace a 1GHz LeCroy oscilloscope or a Kuka industrial robot, so I'm not confident that they'll have the same effect.
Also, orthodox Marxism sees one's material relation to the capitalist economy as something of a continuum: the three main classes are variations in the relative importance of the two means of interacting with the economy, labor and capital. The capital-dominant group is the haute bourgeoisie, the labor-dominant group is the proletariat, and the middle class, the petite bourgeoisie, depends significantly on both (the textbook case being someone who applies their own labor, rather than rented labor, to their own capital to produce goods or services, though other petit-bourgeois mixes are possible). It is purely material, not mystical.
Oddly enough I have a side-business that makes an electronic gadget -- think something like a guitar pedal. But I have to choose my battles very carefully to avoid needing any kind of capital investment to speak of, and the real barrier to entry is the knowledge that I've gained from being immersed in the industry. My entire physical capital is less than what an engineer's employer pays per year for a seat of a CAD package.
A very big barrier that software developers don't face is regulatory approval.
Basically with a digital circuit you mostly only care about two things: whether it computes the function you want to compute, and how fast it is. Circuits that don't compute what you want to compute can simply be ruled out, and among the circuits that work, the faster the better.† Digital circuits don't have input bias currents, or rather their input bias currents don't introduce error. They don't have dropout voltages or non-rail-to-rail inputs or offset voltages or power supply rejection ratios or noise figures. Either they compute the right answer or they don't.
But in analog design, nothing computes the exactly right answer. Every component introduces errors of different kinds in varying amounts. So, there are a lot of different desirable parameters, everything trades off against everything else, and which parameters matter most depends on the situation. If you're designing a circuit that gets used in a lot of different situations, like a new op-amp IC, you have to kind of guess which of those situations are the most important ones.
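To make "everything trades off against everything else" concrete, here's a rough DC error budget for a non-inverting op-amp stage. The offset voltage, bias current, source impedance, and gain below are invented round figures, not the specs of any real part:

```python
# Input-referred DC error terms for a non-inverting amplifier.
# All numbers are illustrative placeholders, not datasheet values.
V_OS = 100e-6        # input offset voltage, 100 uV
I_BIAS = 10e-9       # input bias current, 10 nA
R_SOURCE = 100e3     # source impedance at the + input, 100 kohm
GAIN = 10            # closed-loop gain

# Each input-referred error is multiplied by the closed-loop gain.
err_offset = V_OS * GAIN                 # offset-voltage term
err_bias = I_BIAS * R_SOURCE * GAIN      # bias-current-times-impedance term

total_mv = (err_offset + err_bias) * 1e3
print(f"offset term: {err_offset * 1e3:.2f} mV at the output")
print(f"bias term:   {err_bias * 1e3:.2f} mV at the output")
print(f"total:       {total_mv:.2f} mV")
```

With this high a source impedance, the bias-current term (10 mV) swamps the offset term (1 mV); with a low-impedance source the ranking flips. Which datasheet spec matters depends entirely on the circuit, which is exactly the point.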
I don't think it's true that analog circuit design is harder than programming. Like cooking, how hard it is depends on what you're doing. In all three cases you have problems of a whole range of difficulty from "trivial even for a beginner" to "beyond human capability", and, for more difficult problems, deep knowledge can diminish the amount of trial and error required but never eliminate it.
______
† This is kind of a lie. Slew rates that are faster than you need can cause ground bounce and impact your EMC, but those are analog phenomena and usually of only peripheral interest. Power consumption, another analog phenomenon present in digital circuits, is always a concern if you're on battery, though much more so for analog designs. If you're doing asynchronous logic design, you have to worry about glitches, so faster isn't always better, but almost nobody does asynchronous these days because synchronous logic is so much easier and almost always adequate. Finally, cost trades off against other desirable attributes in any kind of engineering, even digital circuit design. Still, it's a lie that's more true than false.
I'm skeptical of this comment, and of possible bias given your username.
I'd think roughly the same number of people who are able to learn to code beyond the hello-world/VB-macro level could learn analog circuit design. It's just that the interest isn't there.
When the presenter explained, it turns out to be programming and managing the systems that do warehouse / product movement in facilities owned by scrappy little companies like Wal-Mart and Amazon…you know, because humans need bathroom breaks and pesky things like safety considerations. Apparently graduates walk into the field regularly getting $70-80,000 a year jobs, which to me sounds really low. Then again, the program is like 18 weeks and a surrogate for higher education in a field where demand exists.
So in a way the grandpa who wrote this article is right, but little does he know it’s eliminating low skill jobs that his meth addled nephew might be actually qualified to do!