* I graduated as a Computer Engineer.
* Worked as an EE/EEE my entire career (currently almost 14 years).
* Half of my class of EEs and CEs went into other careers immediately after graduation because the money was better.
* 10 years later, my best conservative estimate (based on sampling friends from my year) is that at least another quarter have left because of money.
* I have mainly worked at companies that make embedded devices, a market that is supposedly exploding. So I should be getting paid better, right? I have always been paid worse than my SE peers, even though there are 5-10x as many of them.
* I know lots of SEs who moved over from EE because the money was better (their chief complaint is that SE is easier, but they don't care, because the job is less stressful and, again, the money is better).
* Have never met an SE moving into EE.
* I am now at a juncture in my career where I will either: leave EE completely and work in something else (probably SE or IT), start my own EE business, or consult on compliance for EE products (I had one gig for a while and it paid well). Why? Because money.
And I don't want fast cars or huge houses or any of that absurdity (although A house would be nice). Just living comfortably would be nice. Having the salaries of my SE and CS friends would be amazing.
To those commenters who say that you need to be bright to be an EE, it is very flattering, but I am obviously not very bright :D
(edit: formatting because, again, not very bright :) )
I did a CE degree as well. Reading all the comments saying that it's super hard at university makes me feel like my university was really shit.
The difficulty in EE exams came from having to solve stupidly long and complex equations by hand under significant time pressure. Pretty much every single exam question was: look at the problem, identify the equations to use, solve them by hand. And all the exam questions were close enough to stuff that had been solved in lectures/tutorials.
In a way, that made it easier because everything was basically the same. I only had to get good at solving equations quickly and then I could get an A in an exam after just looking at all the tutorials/lecture notes the day before the exam. There are entire subtopics of my degree that I had 0 interest in (high voltage, electromagnetism), skipped all the lectures and then still got an A. I had/have 0 understanding of those topics, I just pattern matched questions to equations and then solved them like I solved everything else.
For some reason, the CS exams never had this sort of "difficulty". They were much harder to bullshit through with 0 understanding, but if you did understand the material, were much easier. I remember in a computer graphics exam they did force us to solve some matrix multiplication by hand, it was so absurdly trivial compared to the shit in EE exams that I almost laughed. But a lot of my pure CS peers really struggled because they didn't practice solving hundreds of much nastier problems by hand.
One thing that used to blow my mind was the difference between logical circuit design in CS vs CE/EE.
E.g. if the task was "Build a circuit to do X logic using these components AND, OR, XOR etc"
- In CS, it was " Please use the common shapes for each component"
- In CE/EE, it was "Please use the EXACT part code for EACH and EVERY component you use!"
I remember thinking: "Isn't the key lesson the logic flow design, versus being able to look up each and every part number?? Is the EE department purposely trying to get rid of people?"
You could make the argument that forcing people to do the detailed but boring work is a good filter for people who are REALLY serious about EE, but I would argue that there have to be better ways to do that.
Sounds virtually identical to my experience from high school on, across almost every subject. Thankfully I retained >0, but still not what I should’ve. Instead, what I learned is I could JIT memorize what I needed to know just to pass the test. Took me years after school to unwind that and deeply learn topics again (outside things I had an interest in learning, those were never a problem).
Even in CE, for the first 2 years we had tons of absolutely mandatory EE courses which had 0 relevance to software development. One (Theoretical Electrotechnics II) was especially hardcore because 1) the math used was tougher than in the dedicated math courses we had at the time; 2) the professor was an absolute a-hole, I mean a proper evil twist in his personality... everybody at the whole university hated him (teachers, management, everybody), and students were properly scared of him. He was so notorious that even people at other universities knew him well. But he had good expertise in some topics, so he was tolerated, and he served as the biggest student filter in the whole faculty.
He literally got people expelled from the whole school (by failing their last attempt at this mandatory course; 2 attempts per year, and if you failed you could repeat the next year) in their 3rd year at uni, over a single dot in an entire equation calculation (which was at least one A4 page per exercise). The dot just above a variable gave it a different meaning than the non-dotted ones, and pens often failed us back then (so after solving an example we all triple-checked that all the dots were visible where they should be).
He often told girls they shouldn't study EE since it's not for them, told guys with long hair that they should go back to their moms, told people with Hungarian-sounding names that they should go to Hungary, etc... He was fired eventually.
If it hadn't been for him, EE would have been an above-average-difficulty but definitely manageable subject. As it was, one had basically finished university for the CE degree once one passed him. And the best part of all this: he was consistently assigned to teach and examine CE people only. The EE people had such an easygoing professor that everybody passed.
Needless to say, I loathed anything EE-related for quite some time. Schools have ways of effectively souring even good topics for folks like me.
I also keep thinking, surely other industries will start paying more and treating folks better, so they can attract top talent, but nope!
Why is this?
I've been coding since I was 12, and I have flow days where it's just an absolute pleasure. But on a bad day where I never hit flow, it's brutal. It's so hard to force myself to focus. And often when I come back the next day, the code I wrote is absolute shit and I spend a good chunk of the day just debugging it.
So yeah, echoing those who say not easy.
The other piece of it is that I actually think a lot of software engineers are massively underpaid. I was at my last job for 7 years. My total compensation, including benefits, options and what have you, was probably less than $1 million (over 7 years, mind you). But I can draw a direct line between work I did and the enablement of millions of dollars of ARR. The company probably got anywhere from a 5x to a 10x return on their investment in me. I was paid a little under market, but not so far under market that I'm that different from the norm. I worked on some particularly high-impact features in terms of return, so that line is particularly clear for me, and not all of my peers there could say the same -- but a lot of them could, and even for the ones where it was less obvious, it's still true. As software engineers at a software company, our work is ultimately essential. The companies don't exist without us.
As a class, given that, we're still underpaid ;)
There are extremely few EE jobs available to people with only undergrad degrees, and literally zero available to those without college degrees. This is a huge limit on the number of potential job applicants.
So you'd _think_ this would drive salaries up, but...
There are also just fewer companies doing EE work, whereas basically every company on earth does something with software at this point.
So there is a supply/demand component, but there just aren't that many places people who want to do EE can actually work. If you're top-of-the-field in EE/low-level CE, you will be paid handsomely.
In most industries, the money to pay more does not exist.
American companies dominate in software, and they are also highly profitable. Because immigrating to the US is difficult and because many people don't even want to move there, there is a shortage of software engineers in the US. Compensation is primarily driven by the domestic job market, where businesses compete for talent.
In other areas of technology, American companies are not so dominant. There is also more competition, driving the profits down. If an American business pays too much, it will get less talent for the same money. Their products will be worse and more expensive, and they will lose to their competitors.
As someone from Finland, I've been familiar with this dynamic since childhood. You hear about it in the news all the time. High wages are a grave threat to the economy, because they make our businesses less competitive.
There's also a bit of a coordination effect: just as you can't replace Ronaldo with 1,000 cheaper footballers, you can't easily replace one good dev with lots of less good devs.
I think you overestimate the quality of the median software engineer. Even at a company like Amazon, I'd guess something like 20% of engineers can barely code. Add to that the industry expectation to work independently with little guidance, and there are not that many people who fit the bill.
There are lots of junior engineers who, with guidance and mentoring, could actually flourish, but your average "move fast" startup won't invest in them.
I see all these coding bootcamps that supposedly graduate people who can get right to work, but in my experience less than 20% of the bootcamp graduates I've interacted with were even remotely competent, or seemed like they could even be trained up to be competent. Many who I'd kept in passing contact with ended up going into engineering or product management within a year or so (with very entry-level roles).
That's not to say you need a 4-year CS (or related) degree to be a successful software developer, but in my experience it's more difficult than you seem to think.
I do think it's bonkers that software developers in the US (and especially in northern California) make orders of magnitude more money than software developers in western Europe, in places with more or less comparable costs of living.
I think EE/CE-related jobs are mostly harder than software-related jobs, but that doesn't make software easy.
Regardless, I think it's just a matter of supply and demand, plus where the easy VC money has been directed. Most companies these days do something with software. Most do not do anything with hardware beyond buying finished products. Add to that the fact that most EE/CE jobs have moved away from the US and Europe, so Western EE/CE types don't have much in the way of employment prospects compared to the number of people who graduate into the field.
It requires above average:
- Working memory capacity
- Tolerance to extreme frustration, persistence
- Ability to learn fast
- Capacity to deal with particularly high incidence of imposter syndrome
- I could keep going all day...
SE is not overpaid. I actually think it's underpaid. Above all, the profession pushes for an early retirement.
An electrical engineer works on physical systems, and you can't copy physical systems that cheaply. Unless you are in R&D, you can't have the level of impact that even a junior software engineer can have.
Oh, and also the scale I guess. I worked at a startup that did Hardware + Software. One hardware team (6-8 people), 5 software teams (~50 people).
I started out doing EE at UT Austin and never finished because I was a lazy child and had zero awareness of what I was getting myself into. Decided to get a computer "engineering" degree from a cheaper local institution.
Fast forward 15+ years, and I am now responsible for more than I dreamed possible and am considering an EE path as my next steps for when I burn out as a software developer. I still deeply enjoy all of the nuances of electronic devices, how they are made, why they work, etc... I used to work in a semiconductor factory, and might return with a new title some day. I also might go macro and help out with the grid. Both things are still very fascinating to me.
There's no way guys like Intel/AMD can keep this up.
"Can't afford" simply isn't true given our insane need for chips.
They can. They're aggressively expanding in Europe and Asia where the EE wages they pay are very good for the local market.
Also, Israel is a major tech hub where Intel, AMD, Apple, Nvidia and others develop plenty of cutting edge tech.
Not everything has to be done in the US with US wages and those companies know it and are taking advantage of it.
Hardware can't match the market reach of software, so your potential value can never be as high.
The fact that most software companies fail is an entirely separate matter.
For me, on the East Coast of the US, there didn't seem to be many jobs at the time, and I didn't feel like moving to the West Coast, where I assumed the jobs would be. In the end I went down the software track for my career, and it feels like it would be impossible to get back into this. I can probably get my kicks by playing with a Raspberry Pi, but I always wonder what that alternate universe would have felt like, going down the CE career path.
This more or less describes me. I switched after graduation (nearly 20 years ago) more because I realized I didn't enjoy it that much, and I already had done a bunch of programming for a while, which I did enjoy. I only realized the financial upside later.
When I'd hear over the years from friends who'd stayed in EE/CE fields, it did indeed sound way more stressful than what I had to deal with, even though there was a lot of stress in my jobs as well.
I have EE friends with master's degrees who design PCBs that are printed millions of times, and who still struggle to afford rent.
It makes no sense. Why is web software so easy to make money in? Why do we value hard engineering so little?
Interest rate intervention (holding rates below equilibrium) changed DCF valuation, the fundamental way of valuing a business (i.e. people working together on large endeavors), into one where promising massive returns far in the future (20+ years) made low returns in the near term (under 20 years) acceptable. Investment dollars chased growth, and VC and PE grew. Meanwhile, in order to compete for capital, all businesses focused on delivering value now had to adjust their income statements (grow revenue, or more likely cut expenses) as hard as possible to compete with businesses that aren't delivering much now but seem like they will change everything in the future.
Google "DCF valuation" and model out 100 years in Excel. Add a row that divides the PV of each year by the NPV so you can see the percentage each year contributes to the NPV. Then set up a bar chart so you can visually get a feel for the integral over different time periods. Once that is all set up, try a really simple assumption: 0% growth for cash flow and 0.5% for your risk-free rate. Then try it at 5%.
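The same exercise can be sketched in a few lines of code instead of a spreadsheet. This is just a rough illustration of the idea above, assuming a flat cash flow of 1.0 per year for 100 years (the function name and numbers are mine, not from the comment):

```python
# Sketch of the DCF exercise: how much of a company's net present value
# comes from the first 20 years, at different discount (risk-free) rates?
def pv_contributions(rate, years=100, cashflow=1.0):
    # Present value of each year's cash flow: CF / (1 + r)^t
    pvs = [cashflow / (1 + rate) ** t for t in range(1, years + 1)]
    npv = sum(pvs)
    # Each year's share of the total NPV
    return [pv / npv for pv in pvs]

for rate in (0.005, 0.05):
    shares = pv_contributions(rate)
    near_term = sum(shares[:20])  # share of value from the first 20 years
    print(f"rate={rate:.1%}: first 20 years contribute {near_term:.0%} of NPV")
```

At a 0.5% rate, the first 20 years contribute only around a quarter of the NPV, so "far future" promises dominate the valuation; at 5%, the near term carries well over half of it.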
If the Federal Reserve sticks to its guns and gets rates up, and keeps them there, it will be unsurprising to see "hard" engineering jobs be valuable again, while all the CSS people suddenly can't find two pennies to rub together.
All of the goldbug quantity-theory-of-money folks were screaming about Greenspan holding rates artificially low for so long -- both before the tech bubble in 2001 and up until they started to delicately raise rates after 2003.
Once the real crash starts to unfold it may get bad for awhile, but once the Billionaires start getting worried about it they'll hit the monetary gas pedal again. SWEs will mostly weather the storm, and even if it gets bad enough that a lot of them lose their jobs (particularly at the margins), it'll still all come back.
Which is to say that the Fed will definitely not stick to its guns once something significant pops.
(And your analysis kind of shows that there will be significant pain because everything has been built up around the interest rate environment that we've had post-2009 and trying to create any kind of structural change there will definitely cause a crash -- And the Fed works for the Billionaires so once they start to feel the pain then policy will reverse)
I was under the impression that "hard" engineering isn't as financially attractive to investors due to the long payoff period as compared to web software. In that case, won't high interest rates actually hurt "hard" engineering due to a greater discount being applied to future cashflows?
"Software engineering" is probably subsidized by borrowed money aimed at growing locked-in userbases and therefore valuations.
According to the Friedman rule the optimal interest rate is 0%.
I can maybe see how low interest rates could maybe increase software salaries. But I can't see anyway that would depress EE salaries.
The economy does not reward people for how smart they are, how impressive their credentials are, or how hard their skills were to learn. It rewards people who solve problems, preferably problems that rich people have. Many, many, many more rich people have website problems than PCB problems. Ergo, those who can solve website problems make more money.
Doing something difficult can be a way to make a lot of money, because difficulty restricts supply and is a competitive moat. But low supply is only a winning recipe when coupled with high demand. The supply of string theorists is low, but so is the demand.
If you’re going to do something difficult make sure that it’s something that lots of rich people need. For example being a hedge fund quant is a really well paying job because it’s both difficult, and highly in demand because every rich person on the planet wants higher returns on their portfolio.
Expert machinists using fancy equipment in a factory don’t really have the same ability to just go to a competitor, so they can’t negotiate a raise.
Not sure this is reality; or that a meritocracy for webdevs exists.
In recent years, it seems the quality of big "prestige brand" websites (not exactly big tech) is declining, to the point of being unusable on mobile browsers or throttled connections. Not because of technical trade-offs, but reliance on frameworks to achieve "demoable effects" instead of real world features or usability.
We're more than fifteen years into "ajax", and today's kids simply can't do it without burning 10x the CPU cycles and bandwidth. To me it seems there is a decoupling between the utility of webdevs and their pay, and some sort of market failure is occurring.
Machinists own their own tools, in general... and I knew a CNC machinist who quit, went somewhere else, and came back to the shop I was working in over the course of a few months... they aren't easy to replace, even in a job shop. The equipment is a sunk cost, and if you don't have someone who knows how to use it, you're wasting money while it's idle.
I noticed the trend about 20 years ago when I decided that I'd never again have a job that was mostly hardware design. The money's in software.
Also, I'm not sure about the supply and demand argument. The typical ask for a data scientist is much simpler than specialized ChemE knowledge; supply should have ramped up so quickly that comp would have fallen to typical engineering salaries long ago if that were the case.
I would question my assumptions before disputing the function of supply and demand curves.
One of my high school classmates was a mechanical engineer who was one of the top guys for some semiconductor process that failed to scale and got abandoned. He got laid off and was unemployable. He sold Herbalife to pay his way back to school and is a physical therapist now.
It happens in software too. Plenty of sad tales of geezers let go from banks and government who have legacy mainframe or middleware skills.
I've worked with many people who were realistically less than 5 years from retirement, most of them were "trapped" but this was often by design. They knew they could ride out the next 5 years getting paid pretty well, having low stress, and the stability of this ensured that they'd hit their retirement financial numbers. Sometimes younger engineers fall into this trap, and that's when it's sad, often they don't even realize it and no one tells them.
In Latin America there's also a meme that lots of engineers have become Uber drivers. I have definitely chatted with more than a handful that were engineers. All of them said they hated programming classes too much during university, so they won't try to change to SE now, or to anything really.
I graduated as EE but immediately became SE. Ironically I also doubt that I could go back into being an EE now, so I'm also a bit "trapped".
Well, a lot of people warn about those, but if you were one of the naive people who believed throwing your career at rusty crap was a way of "never ending up unemployed", or you thought banks were for some reason the best technological choice, then I guess those are the consequences of your actions (not to mention the mental toll).
It was a really basic CRUD app, but it increased efficiency dramatically. Insane amounts of value were created.
The repairers were experienced professionals. Some had advanced degrees in EE. But that silly little CRUD app made them exponentially more productive.
They could have been heart surgeons, astronauts, or anything really.
ZIRP, and the VC ecosystem, FAANG stock bubble, EBITDA-creative-accounting IPOs, etc. that grew up around it, are a helluva alphabet soup:
> Why do companies do mental gymnastics to call themselves a tech company? It's because venture as an asset class traditionally invested in technology, because that is what presented the growth and return characteristics that matched their risk profile. So you try to call a desk-rental or mattress seller a tech company.
> Then, the companies that attracted the money had to spend it. Salaries inflate. Cultures change. Consumers are subsidized. Sure, some technology is created, but overall, nothing operates as it would without that thirsty capital. It changes the economics for competitors that do not welcome in the dollars.
I’m sure it’s more complicated than that, but that being said...
Finally someone said it!
Ok, I understand that we’re in some anal-retentive age as humans, where we’re discovering our new digital medium and whatnot, but enough with the fixation on small shiny digital bullshit - it’s robbing us of real depth and wonder.
And if it's reached the point where real engineering is devalued, I wonder what carcasses lie ravaged in fields no one looks at until it's too late.
> It makes no sense. Why is web software so easy to make money in? Why do we value hard engineering so little?
I think it’s understandable if you explain it using the same reason that mushrooms are clearly the better drug, but cocaine is the more popular one (our dopamine wiring? idk), criss-crossed with an actively pushed economic bubble that’s high on the rise, and will be for a long while.
This post in no way diminishes the complexity of the hard sciences.
What makes shrooms better than coke? Never tried either one, probably not stable enough to risk it, so curious what your valuation criteria are.
Not difficulty, danger, or years spent studying challenging topics.
Changing a color... that's basic HTML/CSS. That's the lowest pay rung of front-end developers, which is the lowest-paid specialty within full-stack web app engineering. So you're grossly exaggerating. People who just change the colors of a website might make $15-20/hr in the US.
The folks making $150-200k+ have skills in front-end development, server-side development, and databases, plus often things like data engineering or setting up and maintaining cloud computing infrastructure. From there, add in specialty security knowledge, machine learning, or highly efficient programs at massive global scale and speed, and you'll begin to understand how they're making $400k-$600k+.
For example: Go have a look at kubernetes (combined with Docker, Helm, and a Cloud provider + all the various underlying technologies which involve an OS and 2+ programming languages and/or frameworks) and tell me how difficult it is compared to machining + CNCing + welding a metal part. Not to knock machinists-- I've worked on metal machines and I understand why master mechanics make $100k/year.
But I also see why they don't make $200k/year (exception for luxury vehicles)-- there's only so much to learn and only so many dimensions of complexity, especially in terms of continuing education on new technology.
ORMs and cloud DB providers have enabled people to charge ahead without understanding the consequences of their actions, and when latency starts climbing, just scale vertically!
Re: Kubernetes, again, managed services and package providers like Helm have made it easy for anyone to spin up a K8s cluster and even successfully run things on it, without having the underlying ability to fix or maintain it when things go awry.
> But I also see why they don't make $200k/year (exception for luxury vehicles)-- there's only so much to learn and only so many dimensions of complexity, especially in terms of continuing education on new technology.
I can simplify k8s down to "this is just a bunch of containers that run small pieces of software that talk to each other and autoscale up and down as you tell it to", and naturally that doesn't scratch the surface of what's going on.
In 4+ DoF metal manufacturing, there are so many things to keep in mind. Basically, those people are metallurgy-focused materials scientists. They'd be in charge of helping choose the materials for the application, billet sizes from the ingot factories, QA at all levels, impurity calculations from the ingots, grain-structure sizes, and can even be doing radioactivity measurements.
And then there's the actual machining process. If you've ever used a 6DoF mill, it's nowhere near idiot-proof. And one improperly tightened part = a damaged 6DoF head. That's a sad day indeed. And you're not done after the part completes; there's also post-processing all the way on up.
And I didn't even discuss metrology, the study of measurement. Measuring what you're doing is the difference between a passed part and a failed part. And your failures may be caught upstream. For some parts, you may also be doing X-ray spectroscopy to find voids and other subsurface defects, depending on the machining type you used.
You also mentioned welding. That's its own massive area with tons of failure modes, not all of which can be seen by the naked eye. Or imagine doing underwater welding in a water tower that sprung a leak because someone shot at it. You're going up with 200 lbs of equipment, including SCUBA gear and thermite or a thermic lance.
To be honest, I have it easy. I work remotely as a systems engineer. I've thought about switching to EE if the economy cools. But the "blue collar" roles (read: tremendously skilled roles) are looked down on because they mess with physical stuff, while EEs do physical work but are considered white collar, so they're "more acceptable". I try to see them all as the fellow professionals they truly are. At the end of the day, they can hold up their work and say "I made this". I certainly can't hold up an EC2 instance and say the same. It doesn't have the same feel.
There’s like no ecosystem for Hardware in North America. Definitely primarily software
You'd be surprised how many people have, sell, and get venture capital for ideas without a huge upfront spend attached just to try something out.
Software is the scratch tickets of "engineering."
Sometimes this creates falsely inflated salaries, like when a company insists on hiring Harvard MBAs. Often it creates lower-than-expected salaries. People forget just how many Asian EEs there are.
Imagine you make airplane engines for GE as an ME and you want a higher salary: who else are you going to go to? There are barely any companies in the country with the capacity to make airplane engines, so there's little competition for your labor. Your work is probably fairly specialized too, so if you transferred into some other engineering domain, you'd start at an entry-level salary.
Now look at a software engineer working for Google. If he wants more, there are a dozen companies in Silicon Valley, FAANG, and Wall Street that will take his services, and he and a few friends can even start their own thing with their laptops and a garage. And skills transfer relatively well between different software engineering jobs.
Because software has a low barrier to entry, many employers, and a non-specialized skillset, the competent generalist has massive leverage in the labor market. Mr. PhD in electro-optics has 3 employers in the country that can give him an opportunity to actually use his degree.
If the current supply and demand stays the way it is eventually things will even out. Programmer salaries will stagnate as more people enter the field and other fields may have salaries go up if there are not enough candidates.
This process is very slow though. It’s not a very efficient market due to opaqueness and the fact that people don’t like to change jobs too much. A huge reallocation can take a generation or more.
Only if ditches are valuable enough.
Otherwise everyone does without ditches (long drops, sump holes), finds solutions that don't need ditches (e.g. wireless, or overground cables), invents new ways of making ditches that aren't digging (ditch witches, suction vacs), or finds other smart workarounds (trenchless water pipe, fibre-optic horizontal drilling).
Engineering processes are very well developed and therefore less leadership is expected out of the ICs.
I mostly troubleshoot this dreadful old C# application these days. At this point, you almost couldn't pay me to do it. And I don't have any better prospects with the great depression 2 coming up
Also, look at a marketing company, for example: there would be a ton of need for software developers, but generally no need for an electrical engineer.
What you earn is a function of the complexity of what you do and the value you generate. Web engineers earn a lot because of the second part of the equation.
low barrier to entry, high potential impact.
Also, web development is trendy, and developers in general will make a lot of fuss about pretty much anything (tabs vs spaces? emacs vs vi? Xorg vs Wayland? "btw i use arch" is literally a meme) and about money in particular... A lot of companies finally give up and accept paying more, and then a lot more companies have to follow the trend and pay more.
The closer a business or individual is to being able to justify their salary, the more they are paid.
It's why bankers are often paid more than they are worth.
This is an industrial effect: "hardware" is moving to China because VC doesn't like long business cycles, and so the jobs shift there, etc.
Video game logic leads people to believe hard => valuable. But the real world is different. Valuable => valuable.
But that's also kind of gauche to say because it's a bit of punching down. The tradition in society is to say kind things about those who have to work hard and produce little value. "The janitors are the real backbone of America" and so on and so forth. Absent value we must feed them platitudes so that they can clothe themselves in shreds of dignity.
Occasionally, though, it's worthwhile to look at the truth. Which is most definitely revealed in our revealed preferences. How much would you spend on this occupation if you started a company?
Value is determined by quantitative exchange in a market and not by any intrinsic qualities such as difficulty of labor.
This is with 3 years of experience in Full Stack Web Dev SWE, and zero years in DevOps (but some basic experience with containers, cloud, and general linux server config).
I was hired as a Web App SWE, but was offered the chance to train in DevOps and took it, in order to grow additional skills. Full stack web apps are mostly CRUD. To learn the ecosystem around them... in terms of Cloud Computing, Kubernetes, and Customizing Containers... I consider that incredibly valuable.
My company needs DevOps but is having difficulty hiring for the role-- it's in high demand but low supply apparently. So, they're letting me become one of their interdisciplinary DevOps/Full Stack people.
I'm very happy as I am learning new things, working with a great team of enthusiastic, positive-attitude people, and upon my first promotion I imagine I'll make around $250k or so.
I don't know of many other professions where this is possible, while working 100% remotely.
Due to the nature of SW, you can reach thousands of customers with little investment, so you may as well slap a fat margin on it. There are SW products associated with pretty much any economic activity these days, from textile machines to farm management (got job offers to work on products in these areas recently), so SWEs are in high demand. And if the market is not doing well, you can always strike on your own and try building something people will want to use and to pay for, for little investment (again).
My long term plan was to switch to SE, but DevOps jobs are absolutely on fire rn.
Seriously, people, what did you expect was going to happen?
I graduated with a double degree in EE and Comp Science. My final-year thesis was a project with a local company, integrating their custom-designed GPRS module with a GPS module to demonstrate mobile tracking tech (it was 2003).
Because of the project I went straight into a hardware design job out of uni. I was designing boards for mobiles! Working on circuit design, prototyping PCBs, I was psyched!
Problem is, the company was run _terribly_. 3 months in we all got put on 2 weeks of forced leave and then given redundancies. Aside from some pretty terrible management, the economics of doing mobile hardware design in Australia just didn't stack up (we were the only company trying).
In the end there were probably 10:1 software jobs for every hardware job. Given my experience so far, I opted for software.
It has been great, I love software, but I sometimes wish I had lived somewhere that had a higher critical mass of EE work.
I still build audio amplifiers, filters, and dabble in RF, but my day job is mostly administrative, now.
I see this within regions of the US in regards to software. If you grow up in a tech hub (even just your local state one) it can seem like it was "easy" to get into computers and mod games. But the farmers' sons who are technical near where I live now go and tinker with cars and diesel engines the way I did with my dad's old 286.
This is what I was thinking. There are few fabs in North America anymore, so SMD work requires a lot of shipping and patience. You can also outsource EE work to China very easily and they have a long history doing this stuff, but outsourcing software is not so easy for some reason (yet).
I had the exact opposite experience. The CS dept. actively recruited me, and I didn't even realize what CS was until I had completed 2 years of study. I like CS, but had I known better at 17yo, I would have gone EE or CE all the way, because that is my interest, and there is little chance I could have uncovered anything on my own about electronics without the structure of university. I never wanted to be a programmer, but programming is not computer science, nor does a lucrative programming career require a computer science degree, or any degree. Everywhere I have seen, though, a computer science career requires a computer science degree or a mathematics degree, although I have little doubt there is the odd physics grad or engineering grad working as a computer scientist. Just remembered: I know a guy with a biology or botany degree who has worked as a computer scientist on the modeling side for almost 20 years. But he just does modeling, not everything CS.
E.g. if you got a job doing low level programming, like firmware development or even chip design (basically still programming) your EE knowledge would be very helpful.
Also at the time I went to school (2006-2008) there wasn't any soldering and very little hands-on anything in the courses I took at CMU and UMass Amherst. It was formulae and Verilog (more coding than a lot of CS classes).
In the Intro to CivE class I had to take for scheduling reasons we made a cardboard bridge and watched it fail to support the professor who was suspended over a pit. I can't remember a single ECE lab from the two years of classes I took. I learned a lot more from playing around with Arduino and such in the years after.
I recall numerous instances in my undergraduate career where professors, and the students passionate about these fields, would actively dissuade people who hadn't been exposed to the stuff from even trying.
I remember one professor teaching an elective class on the Linux kernel and how he would try his best to scare people into dropping the class early on. We lost 25% of the class after the first lecture (and the only girl). To be fair the class was hardcore for the typical level of CS students you'd see in my school. It was painful but fun. I just wonder about the people who got scared away, maybe some of them could have really embraced the Kernel and become contributors.
To this day I see hostility among the low level crowd in my dealings with people in the industry. They think that because what they do is more complicated than writing a bog-standard web app, they are special and should be left in their caves, not to be disturbed.
On a different note: I feel this is playing at least a small part in hindering Linux adoption. I have dealt with the community on and off for over 10 years, and just the level of negativity that comes out of that community has got to be putting off at least some people wanting to dip their toes in the water. We need to MLLGA: Make Low Level Great Again! And that starts with really welcoming normies with open arms and patience while they get over the initial hurdles.
We're looking for bare-metal developers, so basic SPI, I2C, UART knowledge is essential. But even in that realm it's surprising how many embedded devs can't work outside of an RTOS and lack basic hardware knowledge.
If it’s not paired with humour though, it can be quite horrible
I've a few unsubstantiated hypotheses why:
Zero Sum Game taught by boomer parents conscious that the party was already over but would be sustained inevitably with increasing ruthlessness.[1]
Intellectual Property laws started slicing and dicing knowledge that became temporarily more valuable by fencing it in.
Higher Education caught performance metrics (without much if any performance pay) and so especially elite schools but eventually the rest cottoned onto input pre selection as a way of surviving.
Society became an overt lottery, particularly for low-income families. This shifted the burden of responsibility onto tutors to steer students toward easier subjects.
I believe all these factors and more began compounding exponentially with the enclosures act imposition of a requirement for a degree for work with no or marginal utility for the same.
Marketing, loans, and, for a particularly insidious enclosures act, the UK's "classless society" introduction of university status: not only a tsunami of new institutions (the UK will now accredit degree-granting university status in only two years..) but also the whitewashing of the vocational, STEM/engineering-focused UK Polytechnic system (making them independent degree-awarding institutions enabled diversification and dilution). This multiplied the demand for least-effort, maximum-passing-grade tertiary education throughout the world, by the turn of this century, to lamentable result.
[0] turning up next week seems to have been the objective of one I spoke with, to demean the honest and intimidated.
[1] The last market play I laid on with my late cofounder was in reaction to the dotbomb. Bought military tech suppliers :-| Point being we need the new program before we are fully grown up so to pass it on. With massive shifts to younger population and the current generation of middle aged folk being capitalized by three bull cycles the last two payable like a Japanese 80's mortgage, by the grandchildren, this means now.
Security failure is a systematic fault of businesses and regulation: not the fault of the grunts mining at the coal face.
EE is hard. Bad EEs are unemployable. There's no demand. EE is also beautiful and absolutely fascinating. You get to:
- Do beautiful math
- Build things and get your hands dirty
- Do super-creative design
... and so on. It's just fun!
I have a Ph.D in electronic engineering from a top school, and I don't regret one minute of the program. I graduated during a downturn, and finding a job was nigh-impossible, despite being one of the best graduates from probably the best EE Ph.D program in the nation. Industry wanted experience. SE jobs were easy to find and paid better. I eventually found an EE job, but many didn't.
I noped it out of the field after that. The core problem is EE companies are no fun. Employees just aren't treated well. IC design jobs mostly have all the accoutrements of Office Space and Dilbert: cubicle farms, rigid bureaucracies, limited vacation policies, button-down shirts, and so on. None of these things make people more productive or contribute to the bottom line.
I do EE only as a hobby now.
I don't think the trick is to water down schools, so much as to make the industry less oppressive.
That said, the people I know who work in EE in aerospace seem to have an uphill-both-ways job experience, in terms of all the documentation and justification they do for their designs. There's a revolving door of new hires, and every once in a while somebody will get hired and actually stick it out.
Now the latest trend is MS/PhD-only openings, especially at IC manufacturers.
But I have two questions about those: why not on-the-job training, and did the people who joined 20/40 years ago have those qualifications? All of them?
Well, good luck filling those positions then! /s
I imagine part of the problem is there are just fewer EEs than CS grads, so the EEs who end up teaching might be the best at that in their field, but since the field of CS grads is so much larger, you generally get better teachers in CS. There's also been a huge push to make CS approachable over the past 30 years, and it shows: it's easy to find great teachers for CS, both online and in universities. It's much harder to find good teachers for EE.
Knowing EE or CS does not make a good teacher. Being a good teacher is significantly more than simply knowing the material.
I have not completely thought this through, so I'm posting it as a condensation nucleus for discussion rather than as some "truth", because I only formulated it in word form just now.
While that's obviously an extreme scenario and unlikely to ever happen with electrical engineers, it's not too far off from what happened to many trades like welding.
Despite graduating top of my class and honestly having a very good command of the material at the time, I graduated having ZERO skills in electrical engineering. I went the software route basically because it's something I could pick up, learn, and get work with. What the FUCK was I supposed to do with my "skills" solving textbook problems? As I said, it was a shit school where the courses didn't involve nearly enough project work to build real skills.
The only people I know who went on to do legitimate EE stuff got a masters. Not sure how they are doing. I realized by the end of the degree I had zero passion for this shit; I wouldn't even know what I would want to specialize in. The only courses I found somewhat interesting were control theory and signal processing. I thought the actual electronics courses blew.
Best decision of my life.
Similarly, the FAANG/MANGA folks of the world are beneficiaries of being closer to the end customer, at least in terms of always being able to track customer usage of products. And hey, some of the highest paid ones are doing some hyper-scale stuff that touches billions of users. Then there's just the general market conditions of having much more need than talent available, especially at the upper echelons.
Credentials: 20 year EE, have the largest podcast about designing electronics (The Amp Hour, check us out)
FAEs were seen as the lowest-level support, a stop-gap for our application engineers (non-field). If a big customer had a problem, we would send an AE out to show that we were taking it seriously. No idea if it's the same in other industries though. Application engineers weren't that well paid either.
Technical marketing and sales were paid well, but maybe this is just a title thing (what you called FAE is what we called sales/marketing). Their salaries were predominantly sale/deal-based bonuses though.
For people that actually developed the chips, analogue designers were the highest paid followed by digital designers. Then digital verification engineers, with verification engineer salary increasing very quickly. The salaries were shit compared to software though.
But now, talking to my friends still in the sector (doing design), competition is absolutely fierce and their salaries are increasing rapidly, along with getting a lot more shares. There was a point where a senior analogue engineer could move to a graduate software position and make more money. Those days are gone now, and salaries are pretty close.
A friend interviewed for a hardware position at Apple and the salary was definitely SWE levels of high.
Which probably explains why Apple is the only, or one of the rare few, consumer electronics companies with products most people actually want to buy and love to own.
People keep pointing out how shitty products from Apple's competitors are (Samsung, Dell, etc.), but when you look at how little they pay for talent in comparison to Apple, it becomes obvious why their products are so inferior.
Engineering great devices is expensive, and since the West offshored everything to China, and went for cutting costs as much as possible including on engineering wages, how can they expect to deliver quality?
Inflation has been brutal to gen y/z in ways that aren’t tracked. A recent college graduate in EE makes about the same they did 30 years ago, but paid 10x for college and 4x for housing.
Soon after my graduation (~y2k), I knew another, older EE who retired early because it just wasn't worth it to work (and his SO made good money). The salary he was going to get with 10 years of experience as an EE was about what I got straight out of college.
https://ee.stanford.edu/about/fast-facts
https://eecs.berkeley.edu/about/by-the-numbers
[edited: formatting]
Source: http://www.boblucky.com/reflect/may98.htm via
At the time of this comment almost all the top comments are about pay. No doubt pay is an important consideration, but it worries me when it’s the primary consideration. The best engineers I know like engineering because they have a natural curiosity to learn how things work, not because it’s the easiest route to riches. Any engineering position can lead to a comfortable life, but when everything is about the hustle to make the most money the quickest way possible, it’s worrisome.
It’s like when you look at how the career fields for elite schools tend to fall into a few select categories: law, finance, consulting. (Sometimes medicine, but that can be for or against this point, and would be a digression.) It’s not bad per se, but it can smell an awful lot like status climbing. To paraphrase a professor of mine: “there are people whose goal is to climb to the top of the world and those whose goal is to build the world. Be careful not to confuse the two.”
If EE paid better, I would be an EE. EE is interesting. Neither SWE nor my chosen career are sufficiently less interesting than EE though so it's not like I picked an uninteresting career for financial reasons.
At the end of the day it's all just problem solving. If the puzzles are all fun and don't go against my principles, I might as well go work for the highest bidder.
In uni I got offered internships as part of the course, to write my Master's thesis.
An engineering firm in Aerospace offered me a job doing something with airplanes. Pretty interesting, but paid £12K/yr, which really isn't much money, even for a student.
I kept my mouth shut (uni wanted us to take whatever came) and got offered a job at a major chip manufacturer, but in marketing. Also interesting, £15K/yr. I took it and learned a bunch about that business.
I went to visit a friend in London who'd gotten an internship working at an investment bank. £38K/yr.
What do you suppose I applied to when I graduated? It wasn't engineering or marketing.
As things happened, I'm still an engineer. I just make algorithms to move money around, and that's also got nerd value, since there are interesting problems in financial trading as well. I've literally had days where I made more money for myself than I would have gotten in a whole year at either of the other careers. I also ran into people who'd dumped those careers to work in finance, because the incentive is so strong (a colleague directly told me he saw his boss's payslip, then decided to leave).
This is not incompatible with what the comments are saying. I'm an ECE, I love hardware and working in VLSI, and I eagerly engage with the open hardware movement. I also love working with software, it scratches the same itch, hacking on multiplier layouts and hacking on SIMD accelerated code both involve the same type of analytical mindset.
And yet I work in software because it pays more. As always, these "this highly skilled occupation is collapsing" posts are linked entirely to pay. A talented EE or CE can easily transition to software, and they do.
You also may have a friend who's already made the move telling you to join.
My issue with this take is that it’s also quite likely that the jobs are completely dissimilar. How long do you think it would take a front-end SWE to develop a 480 kV electrical grid, or vice versa? Both domains require specific skills and should be respected as such.
And it is not about "hustling to make the most money the quickest way possible", if you switch to a company which pays a lot better for a similar job. And as a lot of those jobs are located in expensive places, an engineering salary doesn't easily pay for a nice house or any house at all.
The new goal is to have enough money for gas and healthcare while also paying off the student loans.
However, I would not go back to work in research for the miserable salary I was getting. If they paid the same amount I make now, maybe.
(And how good is it these days with things like, the RISC-V ESP32 with wifi and cheap as chips, unreal little modules, for example measuring temperature to +/- 0.01C, air-born particles, radiation etc etc).
This said, I doubt I could afford a comfortable life with a house and family in a major city like Sydney on half the pay. I know as I have thought about switching from SE back to EE many times.
Incidentally, I had a friend who was an EE who went on to become a Patent attorney. He told me he did so because he didn’t want to spend the rest of his life maintaining a print server design.
I don’t think there’s any disagreement here, but it’s because you layered on an important constraint regarding location. Similarly, there are very few jobs that could support living in Manhattan or on a yacht, but that’s not really the larger point. The larger point being, the posts focused singularly on money. Like your location, if we narrowly constrain the problem, of course it means there are only a few options that make sense. If all I care about is money, there are only a handful of acceptable job prospects.
Surely, there are jobs that pay double SWE. And ones that pay double that. But I think most people would agree there’s a certain point where it becomes absurd to focus solely on that.
It’s understandable when a physicist goes to work for a hedge fund, but when it becomes the prevailing track for degreed physicists, it might say something about our collective value system.
Pay is the primary consideration because capitalism forces us to make it the primary consideration. Most folks would rather it not be, but when our survival depends on it, this is the result you get.
Strangely, the EE degree seemed to trigger certain hiring managers into downplaying my software experience and lowballing my offers. A lot of "He's an electrical engineer but he can write some code too" introductions. I heard a lot of "Wow, you're really good at software for an EE!" from various people. The stereotypes out there are baffling, given how multi-talented all of my EE colleagues are.
There are companies out there that value EE experience and pay appropriately, but you have to look around. If you get stuck at a place where management believes CS = high pay and EE = necessary evil, it's time to get out.
Traditional engineering: you work on one field for X years to become a specialized expert of that field. The industry employs thousands of these "specialists" solving similar problems over and over.
CS/IT: you worked on one field for X years, codified your know-how and know-why into a piece of software/algorithm/library, and the field became so mature that minimum-wage high schoolers can use your creations. You move on to the next highly demanded blue sea.
Needless to say which industry has higher productivity that could translate to higher income.
Going into software development was a no brainer over EE. Twice the pay, "twice"* the job security, and no credentials needed beyond my degree and ability to deliver.
It was made even easier given that most of what I did during my EE project labs was doing the C coding that the other EEs didn't want to do. At that point I figured if I ever needed to go back to EE (some super-dot-com burst or whatever) I probably could.
* twice is just a vibe. Graduating in 2011, during a recession, there were NO junior EE jobs and I managed to get a coding job out of college.
Hardware or software, either way you'll still be optimizing buttons!
It’s worrying in the long term: the industry’s been built on underpaid geniuses. We need them, but you can’t begrudge them moving to the money.
Google isn't really a great example because they are one of the few places to get highly paid EE / hardware jobs.
That said, there are many more software jobs than hardware jobs at Google.
We had a broken black-and-white TV that was mined out for parts (tubes, resistors, capacitors etc) but the front shell with the CRT, high voltage anode connector and deflection coils remained.
How do we make this light up again with teenager-accessible loose parts? What can generate the necessary voltage? Oh, how about this car ignition coil. How to tickle the coil into generating this voltage, i.e. pulse its primary input? How about this "vibrator" from an ancient tube car radio (a mechanical chopper for turning DC into AC).
But how do we rectify multiple kilovolts? No suitable stack-of-diodes rectifier stick on hand. Oh well, resort to the original HV rectifier tube from the TV. But need to power its (directly heated cathode) filament. Cop-out: Do it with a battery or two.
Voila, bright spot on the CRT.
Can we make it move? Well sure, just apply a suitable-voltage (trial and error!) 60 Hz AC waveform to the horizontal deflection coil. Voila, a line.
What about the Y axis? Well, we can amplify a microphone with a stereo amplifier and drive the vertical deflection coil with the speaker output. Amplitude? Trial and error (volume control). This was getting pretty cool! Except that it was like seeing the waveform wrapped around a cylinder, seen from the side, because of the full sinusoid horizontal pattern. Which looked really neat, but...
I could crank up the horizontal deflection by increasing the voltage (I did have a Variac-based adjustable power supply - homemade, of course) until only the linear-ish part of the sine wave remained onscreen. From there it was a straightforward matter of an RC phase shift to drive one of the control grids (found by trial and error, applying small DC voltages to various pins until the beam was blanked) to blank out the right-to-left direction.
And we had a sort of homemade oscilloscope.
This was cool in so many ways but only because the technological underpinnings were still relevant at the time. Nowadays it would be as ancient as dabbling with atmospheric steam engines. Which I'm sure someone, somewhere is still doing. As an extreme niche hobby without a clear track into a profitable career.
(I'm a software developer, because the other thing, while obviously way cooler and more hardcore, sounded difficult and... I could teach myself how to program way easier than teach myself how to microprocessor design...)
In America, the business is run by "business" people, not by people who understand how to build the thing the business claims to be building.
Layers and layers of management will not produce your next small-arm-powered microwave. So even if a company starts off with success, eventually it will be taken over by "business" people instead of a committee of engineers+sales+product+marketing, all of whom work in sync.
You laugh at business majors? In America, they have more longevity than an engineer who specializes in a niche area.
But I get why:
- the pay is relatively low
- you have to be on standby at night (in shifts, but still).
you have to travel all across the country, and you’re supposed to do this in your own time (only the time on site with the customer counts as paid working hours).
I wouldn’t pick a job like that either.
I'm a mechanical, electrical and systems engineer. A hardware person. I design medical devices mostly. Yes I could make more money being a dev, a data scientist, etc. But I don't want that. I deeply enjoy, love what I do. I love that I can receive a list of requirements on Monday and sit down in CAD and design something, order it by Friday and build a medical device prototype the next week. It's so satisfying, and it's what I've wanted to do since I was 5.
I dont make the kind of salary that folks are saying here for a FAANG data scientist, but close - and I haven't sold my soul. I deeply love what I do, I have chosen to do the part I love and am deeply talented at rather than the part that pays the most. I still get paid handsomely.
Hopefully this inspires another person like me to say it's OK to not maximize your income, it's ok to do what you love and are passionate about. You'll still be ok.
I am more than happy to be a software engineer and earn 3x what I would for a much more stressful job as an EE. A software bug in production or a bad deployment? Do an RCA and you are fine. Ship a PCB to mass production with a weird design fault you somehow missed through all the Greek-letter prototype versions, and you have a nice bill to explain to management.
This exact thing happened to me at my first job out of college. I had a small error in my prototype PCB design for my first PCB design ever, and management used it to justify giving me a bad review and no raise. I ended up leaving the job after less than a year, and not long after I was doing software. I never looked back.
As far as I'm concerned, American companies simply should not do hardware. American management cares too much about short-term results and has no long-term vision, so they should concentrate on work that supports that way of thinking, like software. Leave the hardware to companies and cultures that can think long-term, like the Asian nations.
My experience in becoming a professional software engineer started when my parents got me hooked on computer programming as child. Computers weren't nearly as cheap when I was a kid as they are now (90s and 00s), but writing random software was mostly just limited by my imagination. All but the most advanced concepts could be experimented with on a laptop.
EE and hardware design is so much harder to access. The software is very expensive and the open-source versions are very limited in comparison (I love KiCad, but it's no Altium). Protocols for anything you'll find in consumer devices are locked down behind tons of fees, NDAs, etc. Parts are locked down in the same way. No one will talk to you unless you want to order tens of thousands of components at a time.
What do carpenters, electrical engineers, and grade school teachers have in common? They've all had this exact same article hit the HN front page about them in the past year. It's not interesting and any solution other than "pay more or accept the shortage" is worthless.
There are a thousand-plus companies specialized in finding talent for other companies, just because of how difficult it is. I wouldn't be surprised if there already exist companies that hire these talent-seeking companies for your company.
Regarding the article's comments on the decline of tinkering (whether causal or merely correlated): it seems students have far less physics (E&M) exposure than programming experience, and many don't want to grind through the math required (differential equations is often taught poorly, contributing to this phenomenon). So it's seen as far simpler to do as little EE as possible, finish the degree, and get the sweet SWE job.
Electrical Engineering is a very hot and in demand degree in South / South East Asia precisely because that's where all the jobs are.
A grad SF coder might make $200k, but in Europe they might make $20k, and I believe the main reason is that prices are sticky.
Like the housing market when you want to buy X you need to judge what X is worth and so you look to the market to see what other people are paying for X.
Candidates too are playing the same game.
The talent market acting like the stock market effectively.
This means SF companies have to be very profitable or grow fast.
A European company can plod along being efficient but with bad sales, etc., and survive on cheap staff costs for a long time as a zombie.
It means any well tuned geo-arb company, selling in the US and paying locals outside can make a fortune.
Someone in 2050 “when i was young you could earn a fortune as a software engineer because for some reason they didn’t have the efficient global labour market we have now. Now that all the coders work for Uber Code many have decided electrical engineering is actually a better choice as it pays a bit more”
For all intents and purposes his job is absolutely critical to society. I basically fumble around and do a few projects for my clients.
I get paid several times more than him.
I am happy for myself but it makes no sense to me how society is mispricing our work.
Some time back I hired a few MechE's (with advanced degrees) to write simulation code for a startup. I paid them what I thought was fair for a Bay Area coding role, which turned out to be a significant amount more than the same job would have paid if they had a different title (they would have been doing substantially the same work at Honda/John Deere/Ford/Boeing/Lockheed/etc.).
Strange.
The 'fair' pay in the Bay area is sustained/inflated by the vast amounts of VC money with a _huge_ appetite for risk. I am not saying this is necessarily a bad thing, but it is absolutely unique, you just don't have those finance conditions anywhere else in the world, bar maybe New York.
Also all the quoted companies in your comment will have physical goods with way way smaller profit margins compared to most Bay area products.
My point is that it is not strange at all if you follow the money.
If your cousin and all his fellow power EEs quit for software jobs, how would this affect the pay for these EE positions?
In this case a very large fraction of EEs are walking away from the career into software-related jobs, because that is where they are getting money that is "good enough for you to not walk out". So while companies working on semiconductors, power electronics, and RF design previously competed within the field, they are now bleeding grads into another 'transferable skills' field, i.e. anything that touches software.
These companies either accept a full migration of HW design to Asia, which will last them a few more years until the cost of living rises in China, India, Malaysia, Vietnam, etc., or they accept that the cost of goods just got marginally more expensive and labour costs on design will have to increase.
To quote the late Ray Liotta "Fuck you, pay me!".
Just before those magazines folded, it was more about trying to manage offshored manufacturing than doing any real engineering work. The more tenacious engineers were learning Mandarin to try to stretch their careers, but the writing was on the wall.
China graduates 7 times more engineers than the US now, and most of the people who would be suitable for an EE degree are going into finance or web dev.
So, not surprisingly, it comes down to power in the value chain. Doctors and lawyers have it; hardware engineers do not. Software engineers, on the other hand, deal with code that few people beyond a handful would really understand, so it's not as easy to swap you out for another guy. Then, the cost of producing a new instance of a piece of software that already works is very low. You make one website; tomorrow 100,000 people could use it. Hardware, on the other hand, typically costs at least a few hundred dollars per unit. Software creates much more value, especially on the web, at zero marginal cost ... except maybe a bit of marketing until it goes viral.
So, software engineers (typically) create much more economic value because of zero marginal cost, and are harder to replace because software is inscrutable. So they make more than hardware engineers, who do harder stuff for much less cash.
Not unlike comparing the salaries of NBA players vs underwater hockey players.
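The zero-marginal-cost argument above can be made concrete with toy numbers (all figures hypothetical, chosen only to illustrate the shape of the economics):

```python
def cost_per_unit(fixed_cost: float, marginal_cost: float, units: int) -> float:
    """Average cost per unit: fixed (engineering) cost spread over
    all units sold, plus the marginal cost of producing each unit."""
    return (fixed_cost + marginal_cost * units) / units

# Hypothetical: $1M of engineering effort either way, 100k units sold.
software = cost_per_unit(1_000_000, 0.0, 100_000)    # marginal cost ~ $0
hardware = cost_per_unit(1_000_000, 300.0, 100_000)  # ~$300 of parts per unit

print(software)  # 10.0  -> keeps falling toward zero as units grow
print(hardware)  # 310.0 -> floored at the $300 marginal cost forever
```

The software product's per-unit cost shrinks without bound as sales grow, while the hardware product's cost can never drop below its bill of materials, which is the asymmetry the comment is pointing at.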
In the U.K. you can go for engineering in many fields without it breaking the bank or having to compete for the top 0.1% of schools.
If India, Taiwan, and China can do better than the U.S., there's a big problem. Rebuild the system.
1. Hardware engineers should be located near manufacturing facilities, and should speak the same language as factory workers.
2. The economy in its infinite wisdom is signaling that software engineers are more valuable than hardware engineers. Assembling and optimizing the logic of human society is an extremely productive task, and results in huge profits for companies. Per unit of time, it is more productive to write backend software that controls the behavior of physical objects than to re-design physical objects to be marginally more efficient.
There are lots of engineering grads getting jobs – in East Asia, earning a fraction of the salaries here, and working 12-hour days 6 days per week. US engineering graduates are retraining en masse to be software engineers. The average newly minted hardware engineering graduate has a very low chance of finding a job, and will likely end up earning less than a skilled laborer. The lives of talented Americans are valuable, and should not be wasted learning unneeded skills.
As a data engineer I could make a base salary of $300k quite easily.
As an electrical engineer I could hardly make $80k for a lot more work.
Guess which one I picked as my full time job?
Given that, it sounds like students are making a rational economic choice to avoid an underpaid profession.
I see quite a few places doing embedded work in the UK that involves some electronics. When I recently needed something that involved elec eng, we ran a procurement which ended up with an Indian company providing some custom-built Arduino devices to do signal testing and report back to us. They were pretty good, but there was no sensible option where we could get the same thing done in Europe on budget.
Unless software margins collapse or we end up with a dev glut, there's no way EE can bid labor up to match.
https://www.theregister.com/2022/07/08/semiconductor_enginee...
https://semiwiki.com/events/314964-a-crisis-in-engineering-e...
Smallish, recentish discussions:
America's chip land has another potential shortage: Electronics engineers - https://news.ycombinator.com/item?id=32048654 - July 2022 (5 comments)
Where Are the Microelectronics Engineers? - https://news.ycombinator.com/item?id=32012660 - July 2022 (32 comments)
Very low marginal unit costs, easy distribution, faster iteration, easier to scale, etc...
Maybe when this "shortage" materializes and companies have to start paying EE the same as SW I'll try my luck. No reason to for now, though.
Back in the day, lots of products came with schematics, now you might get a lawsuit if you dare to take something apart.
It was way easier to take apart and tweak through hole components too - now everything is surface mount or on chip and it’s much harder to get into with just a soldering iron.
The first thing I ever did was rewire a stereo to broadcast instead of receive to make a little pirate radio station - really, none of those things exist anymore.
I think it’s just a harder thing to get into now.
Kicad ecosystem is very much alive and I was able to reuse dimensions from another project for the same computer.
I made a RAM expansion board for Sharp X68000. You can check it out on https://github.com/stas2k/galspanic
Here is someone else doing it as well: https://m.youtube.com/watch?time_continue=27&v=AudEdHpXTi4&f...
We stockpile old proto boards that were never used, and give the kids old motherboards and hot air to scavenge from for practice.
Another difficulty is that with the offshoring of so much EE work to Taiwan etc., many US companies only want to hire senior EEs who can direct projects overseas. This unfortunately kills off paths of growth for junior EEs in the US.
My own relationship with tech reflects in that: I despise some of the things that tech is used for and has managed to amplify in our species. I still pursue my passion because somewhere deep down I know that it could also be different: That the power of technology can also be used to magnify and infuse the world with values that make our existence worthwhile and wondrous.
Time to finally turn EE into a software discipline. Then some decent tools will be developed and normal folk will be willing to design stuff.
EE here. That's not really true. Some of the proprietary vendor tools feel dated, but dropping into Altium to design a PCB definitely feels like a modern tool set.
Care to name a couple? Because that doesn't sound right at all.
However, 99% of the time I'm using the text interface instead since I can type faster/easier than I can move a mouse accurately around, and I'm mostly running scripts anyway. The tools mostly produce textual output so it seems natural to give them textual input. So while they do suck, it's because they are not seen as worth investing in, and the vendors are probably right about that. Some new hires do miss the shiny interface at first but then they go on to get stuff done. Plenty wrong with EE tooling but text-based isn't one of them.
It isn't implying that a good workman has the skills to work wonders with bad tools, it's that a good workman will make damn sure they won't have bad tools to begin with.
This isn't about chest-beating. It's about attracting people to a discipline which is currently arcane because of obtuse tools.
Part of what's happening here may also relate to the fact that EE PhDs are dominated by international students (70-80%), compared to ~10% in CS undergrad. That gives companies a lot of negotiating power to drag down the average salary.
At least where I'm from, there is no shortage; it's all bullshit. Universities keep pumping out more EE grads than are needed. I guess it keeps supply high.
My friends in engineering are passionate about what they do which is why they stick with it. I like the intellectual challenge I get from software engineering and the people I work with. I'm glad I get paid well for it too. I don't think I would get the same level of stimulation in an "engineering" firm here and certainly not the same salary.
Now all I do is tune hyper parameters, but man I miss all the wild mind-blowing physics that photonics had to offer.
Dave Packard once asked in a famous speech, "Why do people form or join companies?"
It's not to make money or create wealth, ultimately. That might be the goal, but not the reason.
It is so they can do things together that they couldn't otherwise do on their own.
Because otherwise they would do it on their own, you might say.
So if you think about that, that says a lot about what work is.
I can safely say I am a better SWE because of the EE grounding back in the day.
Anyway, I chose a CS Bachelor’s program with just enough EE classes to qualify for a MEng in EE later. I was thinking about starting from EE (I’m equally interested in both subjects, if not a bit more in EE) but I was discouraged by comparing salary info and seeing many dissatisfied electrical engineers. If things look better in 4 years I’ll do my Master’s in EE.
I would prefer if I could do both degrees at the same time, even if it meant more coursework and slightly longer time to complete. I can do a version of this with the degree I chose, studying n subjects in CS and n subjects in EE instead of 2n in only one of these, but ideally it should be 2n in both.
The farther you are from the end customer, the lower your pay is going to be.
There are too many levels of abstraction and distance between EE work product and the end consumer.
Reduce that distance, and watch EE pay skyrocket.
Kind of funny to see software people outearn them 10x these days. That's what gatekeeping does to you.
An easier, and more forgiving entry path based on self-learning ("hack my way through") in software development as compared to the entry barriers of EE - especially in India where Oscilloscopes and other equipment were crazy expensive some decades ago.
20% of my batch has moved to software over the last 25 years, compensation being one key factor.
I'm a savage software engineer with a mutt academic pedigree, even though I'm sidling closer to the civilized EE side. But I really don't know if that's the right choice, ya know?
So, I'll spiral into further existential dread because I can't coordinate these choices with others.
"On the Brink of Extinction", That's some grim storytelling right there...
The fruit of outsourcing tastes rotten, but people still love their iPhones. ;)
How about people use some of that STEM money to train thousands of new CEOs and Fund managers. This would drive down the relative top wages, create more competition, and increase tax revenue.
There is just so much more information available and so many nicer tools available for the fledgling developer. You have a huge library of languages, frameworks, tutorials, and open source projects that need another pair of hands. This makes a difference when community and quick learning are desirable.
Huh? No, it wasn't. CS started out as a specialization in the Math department, since CS is really just an extension of mathematics.
EE is very math-heavy compared to other engineering fields, but CS never came from EE though there's a lot of overlap.
As a software dev, I can find a job seemingly anywhere. As a digital designer, there seemed to be 10 or so companies I could work at globally.
Given the boom/bust cycles the chip industry experiences, and the few employers, the pay should be way higher.
Is this distinction still made?
* You can change software much faster than hardware
* You can customize software exactly to your needs in hours, depending on the skill of the SE even in minutes
* You can use hardware in ways beyond the initial purpose by changing the software
Throughout my first several years working as a CPU designer, I watched as the software industry expanded rapidly. The potential pay seemed to be much higher than what was available as a CPU/IC designer, and on multiple occasions I seriously considered a career change into software development. The CPU/IC design industry seemed to be consolidating during this same time period.
Then shortly before COVID hit, I started learning of various large software companies moving into the custom IC space, and the number of opportunities available to CPU/IC designers seemed to be expanding. Over the last year or so my pay has increased significantly due to the apparent worker shortage our industry has been experiencing. It's not as high as what I've read an average FAANG SWE can make, but it's now high enough that I'm not feeling the same urge to make a career change that I was several years ago. I'm able to work mostly remote, have a great team, and get to solve interesting problems. That said, a fair amount of my work does consist of writing code. Still, the semiconductor industry seems to be notoriously volatile. Things are good now but could quickly take a turn for the worse. All of the software companies currently experimenting with designing their own custom ICs could decide these side projects are no longer viable.
If I was giving advice to a current college student, I would probably steer them toward software development rather than hardware. The unfortunate reality is that there seems to be many more opportunities and much higher pay potential working in software development than hardware. If I had started college 10+ years later I would have become a software developer. Is the CPU/IC design industry different from other EE fields in terms of potential pay and job opportunities? Maybe it depends on the specific employer.
2. On the other hand, the US (and/or EU) is not the only market. If engineers are being cornered by the established large firms here, then instead of raising salaries to compete, smaller firms are simply killed off and replaced by foreign suppliers. China and India have very competent engineers, and make tons of US consumer products.
3. "Companies should just raise salaries and train up talent!" is obviously true, but there's still a large time lag between raising salaries and new graduates. EE salaries have been horrible and jobs hard to find for 20+ years, and freshmen still get told repeatedly that EE is well paid and in high demand. In the meantime, people retire later and the eventual crash just gets worse.
All of the above coming from an EE/ECE graduate who immediately went into CS because the TC is easily 3x higher
I assume you do not do scrum and agile practices.
- It is a job, not a career. Because the industry is so consolidated and highly specialized, there are actually very few jobs available. This is why I believe it pays so much less than software. Don't like the pay, the management, your coworkers, the career advancement? Where are you going to go? There are only a couple of options which may or may not be hiring for your specialty and you can only hop around so much since there are few employers. Having tons of jobs to choose from isn't just about getting more pay, it is another form of job security and freedom. This industry doesn't have that.
- I get the whole argument that software is more capital efficient and so more money can be spent on salaries, and it may be part of the reason, but not the majority. Many of these hardware companies are quite profitable. They don't pay more because there are no market pressures forcing them to. I don't expect this to change even with a looming talent shortage. I know someone with a PhD and ~20-25 years of experience who has spent their entire career in a single area, which gives them deep domain knowledge, and they are making slightly more than an L4 SWE (4-8 years of experience with a Bachelor's degree?) at Google (assuming the information I have seen on Google pay grades is accurate). This is total comp, by the way, and even adjusting further for geo differences doesn't change the comparison much. Yet this is considered pretty good for our field. It is almost a sense of entitlement on the part of the semiconductor industry that they shouldn't have to compete on pay. On a couple of occasions long ago I heard official HR communications and high-level management at Intel say something to the effect of being proud of paying "around industry average". Weird how they don't want average employees or average work/life balance though.
- Salaries don't appear to have been influenced much by Google/Facebook/Amazon/Microsoft getting into the business. Disclaimer: I haven't looked deeply into this. Are their teams too small to make a difference in the labor market? Are they paying chip industry rates instead of software rates? Are the traditional semiconductor companies just ignoring it and refusing to match their pay?
- The industry like many others does not want to spend money to train people, so they only want unicorn candidates. Yet it is not enough to simply be an RTL designer, or a verification engineer, or a physical engineer, they usually want someone with the exact skills. e.g. USB experience, PCIE experience, power experience, etc. Things like tuition reimbursement are mostly a myth. It certainly was at Intel despite their claims of $50k for tuition reimbursement. The funds came out of the local discretionary budgets so managers never wanted to approve it because discretionary budget is also the same budget that pays for many other things like business travel and conferences. Discretionary was also the first thing to get hit in belt-tightening situations and was a good way to look good to upper management if a site manager wasn't using their full discretionary budget. Since tuition reimbursement is a several year commitment once approved, it couldn't be cut as easily once started.
- Managers are incentivized to promote execution over learning/growth or innovation due to schedule pressures and frequent hiring freezes. For the same reasons, managers are also incentivized to block transfers because they don't benefit from it, even if there are rules to prohibit this behavior. Combined with wanting unicorn candidates that will accept low pay, it is no surprise that it is getting hard to find employees. Yet if you outlive your usefulness or burnout they will have no problems laying people off and looking for a new unicorn candidate rather than retraining existing employees despite all the sacrifices they made. I saw this happen first hand. Layoffs one week, 2 new job reqs the next week for RTL jobs which are hard to get. Also since they were "silver bullet" hires to fulfill "critical" needs(false!) no internal candidates from places like verification would be acceptable. So not only did they not try to save anyone from layoffs, they also basically said no one inside the group was worth investing in either. Classy.
- Instead of a gradual talent pipeline, I think the industry appears to have had waves of talent progression. This leads to situations where if you are between waves, you will have a group ahead of you that is fairly young but in high level positions and which puts a real limit on how far you can advance regardless of how good you are because positions are just solidified. When I graduated most of the really senior engineers were from their mid 30s through their mid 40s. Grade level *distributions* get "maxed-out" as they say. Because there are few alternatives and people stay in jobs far longer than average, it becomes stagnant. People leave the companies or leave the industry altogether. When the current cohort that is now in their 50s starts to retire, there might not be a new wave to replace them.
- Chip design is far more demanding than software engineering because you have to get it 98-99% right the first time you build it, or you get a brick back from the fab instead of a working chip. Simulations are slow, and post-silicon debug ability, while impressive, is very limited compared to pre-silicon simulation. Unless it can be fixed in firmware, you are stuck until the next batch of samples gets made. If you "go fast and break things", you are going to fail and get fired. Schedule pressures are constant: you need product on time for back-to-school, or Christmas, or competitive threats. Sometimes the schedule pressures are artificially created by lazy management that wants to create a sense of urgency. Pushing to tapeout can be grueling because you don't want to miss your fab window by being late. During power-on, people are expected to work in shifts 24/7. Power-on can last 2-6 weeks on early samples and be in foreign countries. This grates on people. When software has fewer schedule pressures, is less difficult, makes more money, and has more employers to choose from, why would anyone want to do this if they possess the skills and intelligence necessary to do software? Even if you don't earn FAANG salaries, you can probably get paid the same as in the chip industry and have a better life, better job security, and more career options.
- There is nothing to get excited about and inspire kids to go into the profession. The startup space is minimal, so no cool new ideas or riches to be had there. When I was in high school, hardware was the show; the performance improvements were huge with every generation. There was tons of coverage of new CPUs and GPUs discussing their architectures. Now there is little coverage. Between companies trying to be like Apple and minimize the importance of hardware, and chip companies releasing less info on their designs as performance gains get smaller, hardware is relegated to being an afterthought.
- Lastly, I don't know if this has changed with the addition of Google/Facebook/Amazon/etc making chips, but around the time when I got out of college, the total number of EE jobs was actually declining in the US. This pours cold water on the talent shortage narrative that had been used for many years falsely. If jobs are going down, you should have an excess of workers unless they are choosing to retire or leave the industry. It also acts as a discouragement for people to join or stay in the field. Who wants to be in a shrinking industry that is already highly consolidated?
The EE/ECE industry has no one to blame but themselves for the looming talent shortage.
That was a lot to write in a short time and I didn't do a ton of proofreading, so hopefully there aren't too many sentence fragments or typos.
IT is the opposite. Writing code is quite simple, but once you get to bigger systems, things can get really complicated.
It doesn't? Do you have any examples of anything proven to work better? I don't remember feudalism being very good at promoting specialization of labor.
I now do design related to power, instruments, electrical, controls, equipment design for explosive atmospheres, thermodynamics (steam), lighting, functional safety etc etc - which also means I do networking, software architecture and programming (mostly industrial controllers, but also tools for my own use) and pretty well anything with a wire.
To get to the point where you are designing real stuff, stuff the public might interact with and mis-use or have it harm them, without very tight direction and supervision is:
4 years degree, 3 years graduate engineer, 3 years junior engineer
and then you are just an engineer. So that's 10 years, the same time it takes to become a specialist doctor, and easily as difficult to do well.
After that there is senior, lead, and then principal engineer. Many people never go past senior in their lives; lead and principal take qualities that become as much about personality as technical ability, though a real talent in either direction may well rise.
It's a lot of work; the annual turnover of knowledge is one of the highest there is. By the rule of 72, with 6% new or updated knowledge to acquire each year just to stand still without developing, that means 100% turnover in 12 years. Over a full career that's the equivalent of 3-4 degrees' worth of learning, often of totally new concepts, made during the working life of an engineer.
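The rule-of-72 arithmetic above can be checked with a quick sketch (the 6% annual turnover figure is the commenter's own estimate):

```python
import math

def years_to_double(annual_rate_percent: float) -> float:
    """Rule-of-72 approximation: doubling time ~ 72 / rate."""
    return 72 / annual_rate_percent

def exact_years_to_double(annual_rate_percent: float) -> float:
    """Exact doubling time under compound growth."""
    r = annual_rate_percent / 100
    return math.log(2) / math.log(1 + r)

print(years_to_double(6))                   # 12.0
print(round(exact_years_to_double(6), 1))   # 11.9 -- the rule of 72 is close
```

At 6% per year the accumulated material doubles roughly every 12 years, so a ~40-year career works out to about 40/12 ≈ 3.3 doublings, consistent with the "3-4 degrees' worth of learning" figure.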
In order to do this it probably means you spend a certain amount of your "free time" on related interests, eg ham radio, recreational computing, side projects whatever.
It's reasonably well paid, but in order to really be good at it you don't do it for the money; you do it because you like it, it is a calling. You would probably do it for free if they had good work and you were fed and housed comfortably. If that is not the case, then maybe it is not for you.
First-year EE graduates are looking at around 90k starting; 30-odd percent start at over 100k, according to our local university.
Mining engineering graduates might be looking at 120-150k first year out (remote or fly-in fly-out) right now; they are being wined and dined by prospective employers starting in their first year of undergraduate study!
So it's still probably underpaid, even though at the twenty-year mark EE might stretch to 200-250k p.a., more in high-cost centres or particular circumstances. But the span of knowledge required and the responsibility are very high.
As they say, most doctors only kill one patient at a time (and then bury them), engineers can kill a whole heap of people all at once...