* Chip design pays better than software in many cases and many places (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)
* Software engineers make great digital logic verification engineers. They can also gradually be trained to do design. There are significant and valuable skill and knowledge crossovers.
* Software engineers generally lack the background needed to pick up analogue design/verification, and there's little to no knowledge crossover.
* We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also in architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is long-standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry, but nothing like the tidal wave of new software engineers.
* The lack of open source hardware tools, workflows, and high-quality examples, relative to the gross abundance of open source software, doesn't help the situation, but I think it is more a symptom than a cause.
Where are these companies? All you ever hear from the hardware side of things is that the tools suck, that everyone makes you sign NDAs for everything, and that the pay is around 30% less. You can come up with counterexamples like Nvidia, I suppose, but that's a bit like saying just work for a startup that becomes a billion-dollar unicorn.
If these well-paying jobs truly exist (which, I'll be honest, I doubt quite a bit), the companies offering them seem to be doing a horrendous job of advertising that fact.
The same seems to apply to software jobs in the embedded world as well, which seem to be consistently paid less than web developers despite arguably having a more difficult job.
As for a list of companies, in the UK or with a UK presence, the following come to mind: Graphcore, Fractile, Olix, Axelera, Codasip, Secqai, PQShield, Vaire, SCI Semiconductor, and probably also look at Imagination Tech, AMD and Arm. There are many other companies of different sizes in the UK; these are just the ones that popped into my head in the moment tonight.
[Please note: I am not commenting on actual salaries paid by any of these companies, but if you went looking, I think you'd find roles that offer competitive compensation. My other comments mentioning salaries are based on salary guides I read at the end of last year, as well as my own experience paying people in my previous hardware startup up to May 2025 (VyperCore).]
As for startups/scaleups, I can testify from experience that you'll get the following kind of base salaries in the UK outside of hardware-for-finance companies (not including options/benefits/etc.). Note that my experience is around CPU, GPU, AI accelerators, etc. - novel stuff, not just incrementing the version number of a microcontroller design:
* Graduate modelling engineer (software): £50k - £55k
* Graduate hardware design engineer: £45k - £55k
* Junior software engineer: £60k - £70k
* Junior hardware engineer: £60k - £70k
* Senior/lead software engineer (generalist; 3+ yoe): £75k - £90k
* Senior compiler engineer (3+ yoe): £100k - £120k
* Senior/lead hardware design engineer: £90k - £110k
* Senior/lead hardware verification engineer: £100k - £115k
* Staff engineering salaries (software, hardware, computer architecture): £100k - £130k and beyond
* Principal, director, VP, etc. engineering salaries: £130k+ (and £200k to £250k is not an unreasonable expectation for people with 10+ years' experience)
If you happen to be in physical design with experience on a cutting edge node: £250k - £350k (except at very early stage ventures)
Can you find software roles that pay more? Sure, of course you can. AI and Data Science roles can sometimes pay incredible salaries. But are there that many of those kinds of roles? I don't know - I think demand in hardware design outstrips availability in top-end AI roles, but maybe I'm wrong.
From personal experience, I've been paid double-digits percentage more being a computer architect in hardware startups than I have in senior software engineering roles in (complex) SaaS startups (across virtual conferencing, carbon accounting, and autonomous vehicle simulations). That's very much a personal journey and experience, so I appreciate it's not a reflection of the general market (unlike the figures I quoted above) so of course others will have found the opposite.
To get a sense of the UK markets for a wide range of roles across sectors and company sizes, I recommend looking at salary guides from the likes of:
* IC Resources
* SoCode
* Microtech
* Client-Server
No, there is no misunderstanding. Even the US companies mentioned _in the very article_ that have both software and "chip design" roles (whatever you want to call them) will pay more to their software engineers. I have almost never heard of anyone moving from software to the design side; rather, most people move from the design side to software, which seems like the more natural path.
The "misunderstandings and lack of awareness" I was referring to is in regards to many people outside the semiconductor industry. These aspects are hurting our industry, by putting people off joining it. I was not referring to people inside the industry, nor the SemiEngineering article.
As for salaries: See my other comments. In addition, I think it's worth acknowledging that neither hardware nor software salaries are a flat hierarchy. Senior people in different branches of software or hardware are paid vastly different amounts (e.g. foundational AI models versus programming language runtimes...). For someone looking at whether to go into software or hardware roles, I would advise them that there's plenty of money to be made in either, so pursue the one which is more interesting to them. If people are purely money-motivated, they should disappear off into the finance sector - they'll make far more money there.
As for movement from software into hardware: I've primarily seen this with people moving into hardware verification - successfully so, and in line with what the article says too. The transfer of skills is effective, verification roles at the kind of processor companies I've been in or adjacent to pay well, and such engineers are in high demand. I'm speaking from a UK perspective. As for other territories, I hear EU countries and the US are in a similar situation, but I don't have that data.
Do more hardware engineers transition into software than the other way around? Yeah, for sure, but that's not the point I think anyone is arguing over. It's not "do people do this transition" (some do, most don't), rather it's:
"We would like more people to be making this transition from SW into HW. How do we achieve that?"
And to that I say: Let's dispel a few myths, have a data-driven conversation about compensation, and figure out what's really going to motivate people to shift. If it only came down to salary, everyone would go into finance/fintech (and an awful lot of engineering grads do...) but clearly there's more to the decision than just salary, and more to it than just market demand.
If you feel like learning SystemVerilog, then also learn the Universal Verification Methodology (UVM) to get into the verification end.
If you want to stay in software but be involved in chip design, then you need to learn C, C++ or Rust (though really, C and C++ still dominate!). Then dabble in some particular application of those languages, such as embedded software (think: Arduino), firmware (play with any microcontroller or RPi - maybe even write your own bootloader), compilers (GCC/LLVM), etc.
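If it helps to see what firmware-flavoured code looks like, here's a minimal bare-metal sketch in C-style C++. The GPIO base address and register layout are entirely hypothetical, just for illustration; on real hardware they come from the microcontroller's datasheet or vendor headers.

    // Minimal bare-metal sketch. The peripheral base address and register
    // offsets below are made up for illustration only.
    #include <cstdint>

    namespace gpio {
        constexpr std::uintptr_t BASE = 0x40020000u;  // hypothetical peripheral base

        inline volatile std::uint32_t* reg(std::uintptr_t offset) {
            return reinterpret_cast<volatile std::uint32_t*>(BASE + offset);
        }

        inline void make_output(unsigned pin) { *reg(0x0) |= (1u << pin); }  // direction register
        inline void write(unsigned pin, bool level) {                        // data register
            if (level) *reg(0x4) |=  (1u << pin);
            else       *reg(0x4) &= ~(1u << pin);
        }
    }

    int main() {
        gpio::make_output(5);  // configure pin 5 (e.g. an LED) as an output
        for (;;) {
            gpio::write(5, true);
            for (volatile int i = 0; i < 100000; ++i) {}  // crude busy-wait delay
            gpio::write(5, false);
            for (volatile int i = 0; i < 100000; ++i) {}
        }
    }

Nothing clever, but it's the shape of the work: volatile memory-mapped registers, bit twiddling, and no operating system underneath you.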
The other route into the software end of chip design is through entry-level roles in functional or performance modelling teams, or via creating and running benchmarks. One, the other, or both. This is largely C/C++ (with some Python and some Rust) software that models how a chip works at some level of abstraction. At one level, it's just high-performance software. At another, you have to start learning something of how a chip is designed in order to build a realistic model.
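To make that concrete, here's a deliberately tiny sketch of what a functional model is at its core: decode an instruction, update architectural state, repeat. The three-instruction ISA is made up for illustration; real models (RISC-V, Arm, an in-house accelerator) are the same loop with far more state and far more detail.

    // Tiny functional (instruction-level) model sketch with a made-up ISA.
    #include <array>
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    enum class Op { ADDI, ADD, HALT };
    struct Insn { Op op; unsigned rd, rs1, rs2; std::int32_t imm; };

    int main() {
        std::array<std::int32_t, 8> regs{};  // architectural register file, r0..r7

        const std::vector<Insn> program = {
            {Op::ADDI, 1, 0, 0, 5},   // r1 = r0 + 5
            {Op::ADDI, 2, 0, 0, 7},   // r2 = r0 + 7
            {Op::ADD,  3, 1, 2, 0},   // r3 = r1 + r2
            {Op::HALT, 0, 0, 0, 0},
        };

        for (std::size_t pc = 0; pc < program.size(); ++pc) {
            const Insn& insn = program[pc];
            switch (insn.op) {  // "execute" is just updating architectural state
                case Op::ADDI: regs[insn.rd] = regs[insn.rs1] + insn.imm;        break;
                case Op::ADD:  regs[insn.rd] = regs[insn.rs1] + regs[insn.rs2];  break;
                case Op::HALT: std::printf("r3 = %d\n", int(regs[3]));           return 0;
            }
        }
        return 0;
    }

The interesting part of the job is everything this sketch leaves out: memory systems, exceptions, timing, and keeping the model faithful to the RTL.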
And if you're really really stuck for "How on earth does a computer actually work", then feel free to check out my YouTube series that teaches 1st-year undergraduate computer architecture, along with building the same processor design in Minecraft (ye know, just for fun. All the taught material is the same!). [Shameless plug ;) ]
The vagaries of analog electronics, RF, noise, and the rest is another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.
Simple example: did you know that it's possible for 2 otherwise identical resistors to have more than 20 dB difference in their noise generation?[1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.
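If I've understood the effect correctly, the reason that's even possible is that only the thermal part of the noise is fixed by the resistance. Johnson-Nyquist noise, v_n = sqrt(4 k_B T R Δf), depends only on resistance, temperature and bandwidth, so two resistors of equal value are identical there. The spread comes from excess ("current") noise under DC bias, which depends heavily on construction (carbon composition vs metal film, for example) and is typically specified as a noise index in dB, and that is presumably where a 20+ dB gap can open up.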
they even made us use them in practical labs, and connect them up to an ARM chip
RF design, radars, etc... are more an art than a science, in many aspects.
I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...
Learning KiCad took me a few evenings with YT videos (greetings to Phil!).
Soldering needs much more practice. Soldering QFN with a stencil, paste, and an oven (or just a pre-heater) can only be learned by failing many times.
Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.
But as always: the better your gear gets, the more fun it becomes.
The exception was cutting edge motherboards that had to be released alongside a new Intel chipset but that project had at least a dozen engineers working in shifts.
I'm guessing this isn't part of most curricula anymore?
Definitely no ALU design on the curriculum, no interfacing or busses, very little physics. They don't even put a multimeter in your hand.
Informatics is considered a branch of logic. If you want to know how to design a computer, you should have studied EE, is their thinking.
The supercomputer thing... never happened. And I turned out to have a CE career anyway.
My sibling is a CS@UIUC grad, and they, as well as the CS+X majors, were still required to do that.
In other universities such as Cal it's a different story. Systems programming and computer architecture course requirements have either been significantly reduced or eliminated entirely in CS programs over the past decade.
I've documented this change before on HN [0][1][2]. The CS major has been increasingly deskilled in the US.
[0] - https://news.ycombinator.com/item?id=45413516
Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)
I understand that it makes sense for a blog called Semiconductor Engineering to be focused on semiconductor engineering, but I was caught off guard because I have been working on the reasonable assumption that "hardware designer" could be someone who... designs hardware, as in devices containing PCBs.
In the same way that not all software developers want to build libraries and/or compilers, surely not all hardware designers want to get hired at [big chip company] to design chips.
Also there's computer architects being like "So, are we hardware design? Interface design? Software? Something else?"...
Meanwhile, all the mechanical engineers are looking from the outside saying "The slightest scratch and your 'hard'ware is dead. Not so 'hard' really, eh?" ;) ;)
Every sector has its nomenclature and sometimes sectors bump into each other. SemiEngineering is very much in the chip design space.
I would bet that a CS guy would have similar problems switching to hardware engineering.
The former (CS -> EE) is much less likely to happen at a large scale than the latter (EE -> CS). It is much easier to teach EEs to become (albeit often bad) software engineers than to teach CS students to be good engineers.
Also, the former (CS -> EE) will not happen in academia because of (1) turf wars, and (2) CS faculty not having any understanding, nor interest in electronics/hardware/engineering.
I once proposed to teach an IoT class in the CS department of a major university in US, the proposal basically fell on deaf ears.
I learned Ada sometime around 1991. Counting assembly for various platforms, I had already learned about a dozen other languages by then, and would later learn many more.
Sometime around 2000 I learned VHDL. In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada. I wish somebody had just produced a textbook describing the additional features and nomenclatures that VHDL added to Ada -- That would have made learning it even easier. The obvious reason that nobody had done that is that I was among a very small minority of hardware people who already knew Ada, and it just wouldn't be useful to most people.
In all of my work, but especially in systems integration work, I've found that my knowledge of multiple domains has really helped me outperform my peers. Having an understanding of what the computer is doing at the machine level, as well as what the software is doing (or trying to do) can make the integration work easy.
More on-topic: I think it would be a great improvement to add some basic hardware elements to CS software courses, and to add some basic CS elements to EE courses. It would benefit everyone.
The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.
"Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".
Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).
My courses didn't get into the details of semiconductor design (particularly manufacturing), but we had one on the physical principles behind this whole thing - bandgaps and all.
We also had to design analog circuits using the Ebers-Moll transistor model, so pretty basic, but still not exactly linear.
Overall these are very different fields but at the end of the day they both have models and systems, so you could make a student of one of them learn the other and vice versa.
It just has to be worth the effort.
And CS folks should design hardware because they understand concurrency better?!
A whole lot of my coursework could be described as UML diagramming but using glyphs for resistors and ground.
Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).
Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.
I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at how much of a complete disaster the toolchain was. Crashes that would erase all your work. Strange errors.
I briefly worked at a few hardware companies and I was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by the EDA tool vendors. Software dev seemed much more interesting and portable. Working in hardware meant you would almost always be choosing between Intel, Arm, AMD, and maybe Nvidia if you were a rockstar.
Software by comparison offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100. I've always loved hardware, but the opaque datasheets and IP rules kill my interest every time.
Also, I would argue software devs make better hardware engineers. Look at Oxide Computer: they have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world, and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.
Most people that land a successful long career also refuse to solve some clown firm's ephemeral problems at a loss. The trend of externalizing costs onto prospective employees starts to fail in difficult fields requiring actual domain talent and $3.7m-per-seat equipment. Regulatory capture also fails in advanced areas, as large firms regress into state-sponsored thievery instead.
Advice to students that is funny and accurate =3
"Mike Monteiro: F*ck You, Pay Me"
tests are of course very important, but the fact of the matter is, bright, smart, and arrogant young engineers-to-be are very eager to show everyone how much better their version of the 'thing' is, and desperately want to write their version of the thing: they don't want to verify someone else's version of the thing.
if we're being honest, how many people do you really need to do the design of some hardware feature? realistically the design can be done by one person.
so you might have one lead designer who delegates each block to 10 guys, and everything else is basically 'monkey work' of writing up the state machine logic, testing it, and hooking it all up.
and now let's count the number of companies that can put up the capital for tape-out: amd, intel, arm, nvidia, meta, aws, google chips, apple, and let's say plus 50 for fintechs, startups, and other 'smaller' orgs.
so if you want to do design, you might be competing for... let's say 3 lead designers per org on avg, 3 * 50 = 150 silicon design spots for the entire globe. to add, a resource in such scarce supply will no doubt be heavily guarded by its occupants.
i did this calculation back when i was still in uni. i'll never know if it paid off, or if it was even rooted in logic, but i remember thinking to myself back then: "no way in hell am i gonna let these old guys pigeonhole me into doing monkey work with a promise of future design opportunities." arrogant, yes, but i can't say i regret my decision judging from the anecdotes i get from friends in the hardware world.
i think that, for digital design to be interesting, the cost of entry must be lowered by probably orders upon orders of magnitude.
the google skywaterpdk thing, whatever it is (or was?), did produce a great deal of hobbyist designs and proved that there really isn't anything special about rtl - in fact, it's really quite monotonous and boring.
which is a good attitude to have, really. lots of hobbyist designs got cranked out quickly on what, as i understood it, was a very obsolete pdk from two decades ago.
but it's fundamentally still too expensive and too limited. open source software 'blew up' because
1. the cost of entry was free...
2. ...for state of the art tools.
it's not enough to be free, or open source. it also has to be competitive. llvm/gcc won the compiler world because they blew the codegen of proprietary compilers out of the water. of course, being open source, it became a positive feedback loop of lots of expert eyeballs -> better compiler -> more experts look at it -> better compiler -> ...
for digital design to become interesting, you can't trick the kids: they want the same tech the 'big boys' are using. so, what scope is there to make it economical for someone like Intel to carve out some space for a no-strings-attached digital design lottery?
i get the impression that, unlike for most manufacturing processes, the costs of silicon digital electronics increase every year, and the amortisation schedule becomes bigger, not smaller.
so if anything, it seems that the more high tech silicon manufacturing becomes, the smaller the pool of players (who have the ever-increasing capital expenditure necessary) becomes, which should indicate that the opportunities for digital design work are actually going to be shrinking as time goes on.
The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.
Mostly B. Even if you work in company that does both you'll rarely get a chance to touch the hardware as a software developer because all the EDA tools are seat-licensed, making it an expensive gamble to let someone who doesn't have domain experience take a crack at it. If you work at a verilog shop you can sneak in verilator, but the digital designers tend to push back in favor of vendor tools.
Which is fair in my experience, because Verilator has serious limitations compared to the other three - no 4-state simulation (though that is apparently coming!), no GUI, no coverage, no UVM, etc. UVM is utter shite tbf, and I think they are working on support for it.
Also it's much slower than the commercial simulators in my experience. Much slower to compile designs, and runtime is on the order of 3x slower. Kind of weird because it has a reputation for being faster but I've seen this same result in at least two different companies with totally different designs.
I gave up on Verilator support in a previous company when we ran into a plain miscompilation. There was some boolean expression that it simply compiled incorrectly. Difficult to trust with your $10m silicon order after that!
It's definitely nice that it doesn't require any ludicrously expensive licenses though.
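For anyone curious what "sneaking in Verilator" looks like in practice, here's a minimal harness sketch. It assumes a hypothetical SystemVerilog top module called `top` with `clk`/`rst_n` ports, verilated with something like `verilator --cc --exe --build top.sv sim_main.cpp`; the generated header name and the port members follow from that assumption.

    // Minimal Verilator harness sketch for a hypothetical module `top`.
    #include <verilated.h>
    #include "Vtop.h"   // C++ model class Verilator generates from top.sv

    int main(int argc, char** argv) {
        Verilated::commandArgs(argc, argv);  // forward +plusargs etc. to the model
        Vtop dut;

        dut.rst_n = 0;   // hold the design in reset for the first few cycles
        dut.clk = 0;
        for (int half_cycle = 0; half_cycle < 2000; ++half_cycle) {
            if (half_cycle == 10) dut.rst_n = 1;  // release reset
            dut.clk = !dut.clk;   // toggle the clock...
            dut.eval();           // ...and evaluate the design on each edge
            // drive inputs / check outputs here, in plain C++
        }

        dut.final();  // run final blocks and clean up
        return 0;
    }

Which is exactly the licensing appeal mentioned above: the whole loop is ordinary C++ that compiles and runs anywhere, no seat licenses required.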
Which doesn't pay as well as jobs in software do, unfortunately.
The problem, I think, is that there are many competent hardware design engineers available abroad and since hardware is usually designed with very rigorous specs, tests, etc. it's easy to outsource. You can test if the hardware design engineer(s) came up with an adequate design and, if not, refuse payment or demand reimbursement, depending on how the contract is written. It's all very clear-cut and measurable.
Software is still the "Wild West", even with LLMs. It's nebulous, fast-moving, and requires a lot of communication to get close to reaching the maintenance stage.
The article was about chip design.
Not trying to stop you debating the merits and shortcomings of PCB Design roles, just pointing out you may be discussing very very different jobs.
Very specifications-driven and easily tested. Very easy to outsource if you have a domestic engineer write the spec and test suite.
Mind you, I am not talking about IP-sensitive chip design or anything novel. I am talking about iterative improvements to well-known and solved problems e.g., a next generation ADC with slightly less output ripple.
And from what I know of SemiEngineering's focus, they're talking about chip design in the sense of processor design (like Tenstorrent, Ampere, Ventana, SiFive, Rivos, Graphcore, Arm, Intel, AMD, Nvidia, etc.) rather than the kind of IP you're referring to. Although, I think there's still an argument to be made for the skill shortage in the broader semiconductor design areas.
Anyway, I agree with you that the commoditized IP that's incrementally improving, while very important, isn't going to pay as well as the "novel stuff" in processor design, or even in things like photonics.
The notable exceptions are:
* Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it).
* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
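As a rough illustration of that second point, here's a hedged sketch of the seed-driven style hardware testbenches have always used, transplanted to plain C++. The saturating adder standing in as the "design under test" and its reference model are both made up for illustration; the point is only that the seed fully determines the stimulus, so any failing run can be replayed exactly.

    // Seed-driven, reproducible random testing sketch (illustrative DUT + checker).
    #include <algorithm>
    #include <cassert>
    #include <cstdint>
    #include <cstdio>
    #include <cstdlib>
    #include <random>

    // Stand-in DUT: unsigned 16-bit add that saturates at 0xFFFF.
    std::uint16_t sat_add16(std::uint16_t a, std::uint16_t b) {
        std::uint32_t sum = std::uint32_t(a) + std::uint32_t(b);
        return sum > 0xFFFFu ? std::uint16_t(0xFFFFu) : std::uint16_t(sum);
    }

    int main(int argc, char** argv) {
        // The seed is the test's identity: log it, and re-run with it on failure.
        const unsigned long long seed =
            (argc > 1) ? std::strtoull(argv[1], nullptr, 0) : 1ull;
        std::printf("running with seed %llu\n", seed);

        std::mt19937_64 rng(seed);
        std::uniform_int_distribution<std::uint32_t> dist(0, 0xFFFFu);

        for (int i = 0; i < 100000; ++i) {
            const std::uint16_t a = std::uint16_t(dist(rng));
            const std::uint16_t b = std::uint16_t(dist(rng));
            // Reference model plays the role of the testbench checker.
            const std::uint32_t expected =
                std::min<std::uint32_t>(std::uint32_t(a) + std::uint32_t(b), 0xFFFFu);
            assert(sat_add16(a, b) == expected);
        }
        std::printf("PASS (seed %llu)\n", seed);
        return 0;
    }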
I know them. Especially the older folks. Cramming all parts onto one huge sheet instead of separating them by function. Refusing to use buses. Refusing to put part numbers into the schematic so the BoM could be exported directly, writing the BoM by hand instead.
Watching these guys is like watching the lowest-level office worker typing values from Excel into a calculator so he can then write the result back into the same Excel table.
If you want old dogs to learn new tricks, teach them. No company has the money to spend nor the inclination to even suggest education to their workers. Companies usually consider that a waste of time and money. I don't know why. Probably because "investing" in your work force is considered stupid because they'll fire you the moment a quarterly earnings call looks less than stellar.