Early classes on circuits in EE will usually take shortcuts using known circuit structures and simplified models. The abstraction underneath the field of analog circuits is extremely leaky, so you often learn to ignore it unless you absolutely need to pay attention.
Hobbyist and undergrad projects thus usually consist of cargo culting combinations of simple circuit building blocks connected to a microcontroller of some kind. A lot of research (not in EE) needs this kind of work, but it's not necessarily glamorous. This is the same as pulling software libraries off the shelf to do software work ("showing my advisor docker"), but the software work gets more credit in modern academia because the skills are rarer and the building blocks are newer.
Plenty of cutting-edge science needs hobbyist-level EE, it's just not work in EE. Actual CS research is largely the same as EE research: very, very heavy on math and very difficult to do without studying a lot. If you compare hard EE research to basic software engineering, it makes sense that you think there's a "wall," but you're ignoring the easy EE and the hard CS.
I knew a number of folks in the first year who were very good at practical electronics, having come in from a technician side, but simply gave up due to the heavy maths load.
It got more complex when doing Control Theory, what with Laplace and Z transforms, frequency-domain analysis, and the infamous poles and zeros.
Further culling ensued at that point.
However, control theory turned out to be my favorite class. Learning how negative feedback loops are everywhere was an eye opener.
Also learning Laplace transforms was one of my first “holy shit this is freaking clever and cool” moments. Just like how parity bits in data streams can be used to detect AND correct errors.
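To make the parity point concrete, here's a minimal sketch of a Hamming(7,4) code in Python: with three parity bits over four data bits, the failed parity checks spell out the position of a single flipped bit, so you can correct it, not just detect it. (The function names and bit layout here are just for the demo.)

```python
# Hamming(7,4): parity bits sit at 1-indexed positions 1, 2, 4.

def hamming74_encode(d):
    """d: list of 4 data bits -> list of 7 code bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """c: list of 7 code bits -> data bits, fixing one flipped bit."""
    c = list(c)
    # Recompute each parity check; the failing checks, read as a
    # binary number, give the 1-indexed position of the bad bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:               # nonzero -> flip the offending bit back
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[4] ^= 1                   # corrupt one bit "in transit"
assert hamming74_correct(code) == data
```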
There might be a structural issue if you have a bunch of guys coming in from the technician side, as you say, who almost all get filtered out. You might need remedial classes, a different curriculum progression, something. Or else recruitment standards/expectation-setting are whacked out.
Might just be me, but I found it all clicked when we started learning the fundamentals underneath these abstractions. For me the first classes were harder because they're about memorizing poorly understood concepts; my brain prefers logically deriving complex concepts as a learning method.
The gaps between the analogy and the real world actually make it harder to understand the fundamentals and just confuse people once you get to a deeper level of understanding. It requires more unlearning than is worth the slight benefit of making the concepts a bit more intuitive at the beginning.
At most levels, software will be in there somewhere, even those fake flickering candle LEDs have RAM, ROM, and a processor these days.
The Perseus Cluster of galaxies is estimated[0] at something like 816,592 light years in diameter, which works out to roughly 10^22 meters, and on the other end 2008 TC3[1] is an asteroid 4.1 m across.
That is largely true of academic research. A critical difference, though, is that you don't need big, expensive hardware or the like to follow along with large portions of cutting-edge CS research. There are some exceptions, like cutting-edge AI training that requires super expensive equipment or large cloud expenditures, but tons of other cutting-edge CS research runs just fine even on a fairly low-end laptop.
It is also true that plenty of software innovation is not even tied to CS-style academic research. Experimenting with what sort of perf becomes possible by implementing a new kernel feature can be very important research, but it isn't always closely tied to academic CS.
Even the more hobbyist-level cutting-edge EE research will cost more, simply because components and PCBs are not exactly free, and you cannot keep using the same boards for every project for several years the way you can with a PC.
1. Learn soldering
2. Treat circuits like black boxes. If I need X amount of Y, e.g. I need a circuit to smooth the voltage, I pick one black box with adequate attributes.
However this is pretty introductory and I have no idea how to learn to fix old consoles. Sometimes it’s just a broken capacitor, but I first need to figure out which part is broken.
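The "smooth the voltage" black box can be sized with one back-of-envelope formula: a reservoir capacitor feeding a steady load droops by roughly dV = I * dt / C between charging peaks. A quick sketch, with made-up illustrative numbers (not from any particular console):

```python
def ripple_voltage(load_current_a, cap_farads, recharge_period_s):
    """Approximate peak-to-peak ripple for a rectifier's reservoir cap:
    the cap loses Q = I * t of charge between recharges, so dV = Q / C."""
    return load_current_a * recharge_period_s / cap_farads

# 100 mA load, 1000 uF cap, recharged every half cycle of 50 Hz mains
dv = ripple_voltage(0.1, 1000e-6, 1 / 100)
assert abs(dv - 1.0) < 1e-9    # about 1 V of ripple
```

Picking the black box then reduces to choosing C so the ripple stays within what the downstream circuit tolerates.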
a) inspect for obviously damaged components. Capacitors that leaked, chips that released the magic smoke, etc.
b) confirm voltages are good
c) inspect the inputs and outputs of the ICs to see if they're doing what you expect
d) depending on the boards involved, a lot of checking if pin A is electrically connected to pin B when it should be. Sometimes traces get broken and need to be fixed up.
Fabs are specific to the manufacturing of integrated circuits.
EE encompasses more than just manufacturing of ICs, for example research and applications in radio propagation and EM/wireless, signal integrity, antenna design, coexistence/desense, advanced power electronics, control systems, simulation/solvers, etc.
I say all this as a recovering semiconductor engineer: EE is a huge field. I can’t think of a subdiscipline where we’ve run out of new ideas to explore, and most of them don’t require bucketfuls of HF. The real problem is that the financial rewards are relatively small, the math is ferocious, and there are so few practitioners, let alone experts doing research.
But aren't there a lot of actual hardware products that are "simple circuit blocks connected to a microcontroller"? Like a toaster, shaver, keyboard, etc. If that's not "work in EE" then what is it classified under? It's not CS either.
Most of the orgs I worked in building simple circuit blocks connected to a microcontroller either farmed out the actual EE work to contractors or design houses or had 1 EE for like 20 different projects.
Another commenter pointed this out, but those products take about 1-2 days of engineering time.
But, yes, probably half of my classes were a real drag to get through. It all depended on who the lecturer was, and how enthusiastic they were.
My junior project in EE was a guitar fx pedal with a shielded breadboard on top. I won’t be bashful, that was the most popular project in the room.
Then… I got divorced and never finished my EE degree. I already had a degree in CS, and had pursued a second degree because I thought software was too limiting. Now, here I am, all limited.
The reason I never subsequently finished my degree was that I didn’t really want to work on CMOS, transmission lines, or microwave, and graduating with an ECE degree from the U of Utah offered those as your career paths.
They're different jobs.
The Lego level is more like being a technician. You can slap a few ready-made building blocks together, maybe tweak them a little using basic algebra, and you've got your design.
That's fine for guitar amps and simple synth circuits and such.
But if you use that approach while designing the control circuitry for a power plant or a rocket motor, in the best case failure will be very expensive, worst case people will die.
That's where the real engineering happens. You're modelling systems from first principles and you know enough to be fairly confident that the equations you create to characterise a complex design with multiple inputs and outputs accurately predict its behaviour.
If you start with hobby electronics you have zero experience or insight into that level. So when you begin your course you're completely blindsided by how much math there is, and have no idea what it's for.
And some domains, like robotics, have even more math. You can use plain old EE control theory, but you can extend it into modelling systems using Lie groups and Lie algebras - which are more often used in quantum physics.
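The negative-feedback idea from the control theory discussion fits in a few lines. This is a toy illustration with made-up numbers, not real control design: a proportional controller driving a first-order plant dx/dt = -x + u, integrated with a crude Euler step.

```python
def simulate(setpoint, gain, steps=5000, dt=0.001):
    """Proportional feedback on the plant dx/dt = -x + u."""
    x = 0.0
    for _ in range(steps):
        u = gain * (setpoint - x)      # error times proportional gain
        x += dt * (-x + u)             # Euler step of the plant dynamics
    return x

# Setting dx/dt = 0 gives the steady state x = K*r / (1 + K): higher
# gain tracks the setpoint more tightly, but a pure proportional loop
# never reaches it exactly (that's what integral action is for).
x = simulate(setpoint=1.0, gain=9.0)
assert abs(x - 0.9) < 0.01
```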
Sounds like if you start with the math before you're old enough to pick up a soldering iron it might be a little different.
And then circuit analysis is just a big exercise in building "castles in the sky" and worrying whether your upside-down staircase has railings.
(Among other several pet peeves about EE that I could go on about)
I worked in EE for a while and building stuff was very boring.
It basically took me changing careers to SWE and working for a games company to finally use the math part of my EE degree.
I ended up building my guitar amp years after.
I had an eye-opening experience when I got my first taste of programming, taking C programming in my second year of university. What do you mean I can run a command and see instant output? Amazing! This was not the case for my electronics and power engineering lab sessions. We were using equipment that had been around since the 80s with little to no supervision, just a bunch of routine "experiments" I can barely remember any of. In my third year, I took Digital Computer Design (a CE elective) and realized I had been wasting my time learning about how the power grid in my country works. I tried to salvage as much as I could by picking more CE electives, though not many were available.
Everyday I wonder, how different would my life have been if I studied CS or even CE, I do not know. But, I appreciate the little this journey taught me, that you can always squeeze lemonade out of whatever lemons life gives you. I see my old EE notes now and they don't make sense to me, but I appreciate the happy chills solving circuit problem sets gave me. I work in software now, and I get that 1000x more, and that is how I know I made the right choice.
We're making electric super cars that still use a steering wheel and gas pedal. Just because it's old doesn't mean it's deficient. Humans haven't fundamentally changed in the past 100 years, so it is probably the most intuitive way to manually control a car.
The typewriter/keyboard is probably the most intuitive method to input alphabet characters. Of course that doesn't preclude entirely new ways to control our computers. But if you set out to simply replace the mouse and keyboard without fundamentally changing how we interact with a computer, then you're setting yourself up for failure.
Software development for the most part is extremely easy. It's one of the few "engineering" fields where you can go to a bootcamp and learn it in 6 months. You won't see this kind of thing for quantum mechanics or electrical engineering.
Also the gap between theory and application in software is minuscule. Instantaneous, even. You basically learn software via application.
A lot of software engineers take pride in their jobs and in their intelligence, but they don't fully understand just how easy software is. Like, you guys (to be accurate: most of you guys, not all) have an easy job and you learned an easy field. EE is challenging. You don't like it because it's harder and the intellectual capacity to handle it isn't there for everyone.
There's a reason why all hardware development moved to Asia. Software is just too attractively easy, and the internet boom made it lucrative. Asians took what's available while the West took what’s most fun. And now the skill gap is more evident and we can’t go back.
But oh, what a workload the 6.002x course was... I needed 10-15 hours per week for all the reading, problems, and labs, and doing that while working full time and commuting 2 hours a day was a grueling pace to survive!
But the cool thing about software is that it has paid so well despite its relative ease, it’s a well paying field that still secures a middle class life that’s resilient to inflation.
Once that’s gone, a whole lot of people are going to find themselves condemned to dropping an economic class or two.
It's definitely the case that there's a bigger jump from school project to actually useful product for EE than for CS. But now that we have affordable but decently featured FPGA boards, the barrier is much lower than before, at least for digital design.
For the same mental effort, you get orders of magnitude more "end product" from software than hardware, with greatly less overhead and greatly more flexibility.
Hardware is extremely punishing and "complexity friction" kicks in almost immediately. A multi-feature door alarm on a microcontroller is a one-hour affair that a newbie could finagle. With a pure hardware implementation it's a multi-day effort, plus another day of reworking the board to dial it in. And if you aren't copying a design, you likely need a degree as well.
There is also the fact that software pays much more than hardware, can be done remotely from just about anywhere, doesn't involve working in labs full of lead and solvents, and like the author noted, has a much higher "wow!" factor from people in general. Software makes you feel very powerful, hardware will humble you into the ground.
I started college in EE, I absolutely loved my digital logic class. I talked to one of the lab TA's in Circuits 2 and said this wasn't as interesting as digital logic. He told me to switch to computer engineering. I looked at the courses and there were only 5 classes different. I switched that week.
I've been designing computer chips the last 30 years.
> Maybe I Am Just a Software Guy
That's the TL;DR version of the article.
The TL;DR version of my post is "I like programming but I find hardware far more interesting."
A key purpose of the repeated exercises in circuit analysis is to build up the student's intuition for how electricity works. Mathematically, it's "simple" -- just systems of (possibly complex) equations and basic diff eq. But for sophomores, all that is still new, and most students don't enjoy going deep into derivations.
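Those "possibly complex" systems of equations are often just complex arithmetic. As a sketch, a one-pole RC low-pass divider has H(jw) = 1 / (1 + jwRC), and Python's built-in complex numbers handle it directly (component values below are arbitrary examples):

```python
import math

def rc_lowpass_gain(freq_hz, r_ohms, c_farads):
    """Magnitude of Vout/Vin for a series-R, shunt-C voltage divider:
    H(jw) = Zc / (R + Zc) = 1 / (1 + jwRC)."""
    w = 2 * math.pi * freq_hz
    h = 1 / (1 + 1j * w * r_ohms * c_farads)
    return abs(h)

r, c = 1e3, 1e-6                      # 1 kOhm, 1 uF
f_c = 1 / (2 * math.pi * r * c)       # cutoff frequency, about 159 Hz
# At the cutoff the magnitude is 1/sqrt(2), i.e. the -3 dB point.
assert abs(rc_lowpass_gain(f_c, r, c) - 1 / math.sqrt(2)) < 1e-9
```

The intuition the repetition builds is exactly this: which term dominates at which frequency, without having to grind through the algebra every time.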
Building kits and plugging pre-made modules into microcontroller development boards is fun, but it's not really engineering. You don't hire an EE to plug off-the-shelf components together, you hire an EE to do design work, to make sure everything is going to work under all operating conditions, and to diagnose problems when something goes wrong.
Finally, software is just easier[1] than hardware. Modern software is a mathematical idealization that runs on top of decades of high-level tools and abstractions. That's why it's so cheap and popular!
[1] This does not mean that everything in software development is easy, just that you don't need to deal with physics or chemistry or manufacturing or the procurement of physical goods in order to create new software.
edit: +1 on the "I just started bruteforcing" part of getting frustrated with everything. It was not a good way of learning, but even after switching programs I found myself preferring to bruteforce problems I had lost hope of thinking through to completion, since one mistake would force me to start over from the top when I had 200 more of the same type of problem to do. So much mental effort was wasted trying to "get" things I just wasn't getting that I started taking more satisfaction from reaching the solution without doing it "right" (ignoring that my bruteforcing probably took far more time and energy; at least it didn't hurt me spiritually on every failure).
Contrast this:
> One of the EECS professors was kind enough to offer a RC car kit to his students to program it. I decided to give it a try. Maybe the toy car wasn’t exciting or maybe I was pre-occupied with other course work during that summer, I didn’t even open the box.
With this:
> Writing web applications blew my mind. I can just write some code, click a few buttons and boom all my friends and family across the globe can just see it! This feels like magic.
Anecdotally, I saw this a lot in college. Students would start out in electrical engineering because they thought hardware was really cool, but when the time came to do the hard work they didn't have much motivation. They wanted to be a hardware engineer, but putting in the work was unappealing. Software has a wider range of job opportunities from intense FAANG-level jobs down to being the person who pokes at a company's old PHP website long enough to keep it serving pages. You can jump in and find a level that matches your motivation. With hardware, you have to clear some hurdles to begin being useful at all.
To my surprise, I think Arduino and Raspberry Pi have made this worse. I talk to a lot of people who see themselves as amateur EEs because they bought an Arduino and used some jumper wires to connect some sensors to the right pins. It's exciting. Then they lose motivation when they encounter a problem that requires doing anything more complex or custom. These people often overlap with the CS students who think the entire world of software engineering is writing CRUD apps composed of APIs connected together.
There are plenty of "applied" electronics technician or electrician's apprenticeship programs that are more like your software education. Take an induction motor, a variable frequency drive, a few sensors, and a programmable logic controller, and hook them together according to the manufacturer's instructions, and you can be off to the races operating a pump or a conveyor on day 1. But will you understand how the insulated gate bipolar transistors and filters in that variable frequency drive turn the rectified high-voltage DC bus into three-phase AC that generates a rotating magnetic field and induces a current in the motor's rotor? No, you don't need to know any of that to make the pump work.
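The three-phase output that drive synthesizes is just three sinusoids spaced 120 degrees apart, and a balanced set sums to zero at every instant, which is why the field rotates with no net DC component. A quick sketch (50 Hz and unit amplitude are arbitrary example values):

```python
import math

def three_phase(t, freq_hz=50.0, amplitude=1.0):
    """Instantaneous values of the three phases at time t (seconds),
    each shifted by 120 degrees (2*pi/3 radians) from the last."""
    w = 2 * math.pi * freq_hz
    return [amplitude * math.sin(w * t - k * 2 * math.pi / 3)
            for k in range(3)]

# A balanced three-phase set sums to (numerically) zero at all times.
for i in range(100):
    assert abs(sum(three_phase(i * 1e-4))) < 1e-9
```

The VFD approximates each of these sinusoids with PWM on the IGBTs; the motor's inductance does the smoothing.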
You wrote:
> I couldn’t imagine ... a toy CPU implemented in SystemVerilog being ... useful
No, it's really not, but your work on real CPUs depends on registers and combinatorial logic and ALUs and MMUs. End users can typically just download Python and treat everything behind the screen as a black box, but if you really want to call what you're doing "engineering" or a "science", then developing an understanding for what happens behind the curtains is incredibly useful. If you've implemented a toy 8-bit CPU with load, store, compare, jump, and a few basic math instructions, you can write some assembly or a toy interpreter for it and you will have an understanding of how real CPUs work that can enable you to write better code later. Add some interrupts to that CPU and build a multitasking operating system, and you'll understand parallelism better.
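That toy-CPU exercise fits in surprisingly little code. Here's a minimal sketch of the idea in Python, an accumulator machine with load, store, add, and a conditional jump; the opcode names and encoding are invented for this demo:

```python
def run(program, memory):
    """Interpret a list of (opcode, operand) pairs. `acc` is the single
    accumulator register, `pc` the program counter."""
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]                 # load accumulator from memory
        elif op == "STORE":
            memory[arg] = acc                 # store accumulator to memory
        elif op == "ADD":
            acc = (acc + memory[arg]) & 0xFF  # 8-bit wraparound add
        elif op == "JNZ" and acc != 0:
            pc = arg                          # jump if accumulator nonzero
        elif op == "HALT":
            break
    return memory

# Compute memory[2] = memory[0] + memory[1]:
prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)]
assert run(prog, {0: 40, 1: 2, 2: 0})[2] == 42
```

Writing even a short program against an instruction set like this makes it obvious why compilers, registers, and addressing modes exist.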
All of modern technology is a pyramid. At the point of that pyramid is just a single doped semiconductor with a P-N junction. We build that junction into transistors, and transistors into gates, and gates into CPUs, and on those CPUs we execute assembly, and we write low-level languages that compile into assembly, and build operating systems and syscalls with those low-level languages, and access those systems with high-level languages, and connect those computers together with networks, and write applications that operate on those networks, and at the broad base of the pyramid there are billions of people using those applications.
In 2025, no one human brain comprehends the full stack anymore, we all become our own individual bricks in a particular layer. But to do the best work at any point in the pyramid, you ought to know a bit of how it works above and below you!
It's a nice way of putting it. The blunt thing that everyone is sort of dodging here is this: I think it's less learning style, and more IQ. EE is THAT much harder.
I understand why a lot of people bail out of EE, and why a lot go to web dev specifically. EE relies so heavily on simple calculus that there's a distinct moment where you have to go "what the heck am I actually learning?". And seeing that software has this apparent depth (design patterns, OOP principles, Haskell, ORMs, Fieldingian REST, GraphQL, 10,000-word blog posts on vim vs emacs, etc.), they naturally get drawn there.
Maybe one day I will actually understand signal integrity but so far my experience has been "check return paths, match impedances and pray to the EE gods".
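The "match impedances" part at least reduces to one formula: the reflection coefficient at a load is Gamma = (Z_load - Z0) / (Z_load + Z0). A matched load reflects nothing; a short or an open reflects everything. A small sketch (50 ohm Z0 is just the usual convention):

```python
def reflection_coefficient(z_load, z0=50.0):
    """Fraction of the incident wave reflected at the load.
    Accepts real or complex impedances."""
    return (z_load - z0) / (z_load + z0)

assert reflection_coefficient(50.0) == 0.0    # matched: no reflection
assert reflection_coefficient(0.0) == -1.0    # short: full inverted reflection
assert abs(reflection_coefficient(1e12)) > 0.999   # open: full reflection
```

The "check return paths" part, unfortunately, still comes down to prayer and careful layout.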
Career wise and financially, its worked out great. Even most of my EE friends didn't do much in EE after the first 5 years. Everyone just migrated to where the jobs were: Software development, IT, and Cybersecurity.
You might love mechanical engineering and machines so you get into machining parts. By hand it is exactly what you thought, but once you hit CNC you're back to a desk job, spending most of your time in CAD software not even touching the machine.
For me, a senior-level circuits class using Horowitz and Hill "Art of Electronics" was the game-changer (in 1981!). 3rd edition (2015) still looks great to me (although yes it is large and expensive).
This is why I did an EE degree, didn't get paid much, went into software and used that to pay for a mathematics degree.
The analogy also works in that cognitive difficulty is most challenging at theoretical physics work --> engineering --> software, inversely proportional to paycheck size. Though a physicist can probably figure out software, whereas the other way is a tougher slog.
If you want to get paid in software don't do something utterly commoditised and popular or you're just a fungible meat flavoured work unit. Get really damn good at something with some longevity in a stable niche.
Hardware and software are called different things for a reason? I do agree that tinkering with the hardware always needs to be in step with the lesson at hand. You can't just state KVL/KCL and move on; you need to have the student build a circuit and play with it for a day or two.
Obviously yes, if you're doing heavy analog/power/RF stuff you're going to be pretty far from software, but EE is a really broad field.