Some of the smartest people I know work in other domains: biology, chemistry, and even physics. They are sometimes baffled by tasks that seem trivial to me, and I'm under no impression that I'm more intelligent than them. I simply specialized and focused only on programming, while they program to accomplish other tasks in their domain of expertise.
Can this last forever? Of course not, nothing lasts forever. But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.
Good programmers I know also overestimate the skill needed to earn a high salary in this job. You don't have to go far up the learning curve; these days you can teach yourself a little JS, go for a webdev job, write shit code, and still earn more than most people in a given country.
> But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.
Nah, that's like wondering why an ice block sitting on a hot plate is still solid. The answer: because it just got put there, and it'll melt in a moment. So too will high salaries end, as most of the low-hanging fruit gets eaten by software made by mass-produced cohorts of programmers.
Our industry has its share of cycles, but this, in my view, is largely wishful thinking. Nothing wrong with optimism, but...
Every 5-10 years there's a "technical shift" that forces everyone to reevaluate how they build software, or more importantly what they build, and the race starts all over again. The ice block is removed from the hot plate and replaced by a bigger, colder block of ice. And when these technical shifts aren't taking place, the bar for what constitutes "good software" inches upward.
If your standards for acceptable software were frozen in 1985, then with modern hardware and software toolchains you could accomplish in one day what used to take a small team an entire month. But if I delivered, say, what passed for a "good enough" database in 1985, it would resemble someone's 200-level CS final project rather than a commercially viable piece of software.
Even if you have library support to hand-hold your budget coders, even if you use a lot of them, even if you give them all the time in the world, they will produce results that are more complicated, less coherent, less stable, buggier, and harder to modify, improve, or iterate on than those of better coders who understand the problem better.
That means that no matter how little you pay up front, you end up paying more in the long run, throwing more man-hours and money at fixing the mess that was made. Good code is easier to maintain and improve and costs less over time. A mediocre or bad mess takes substantial effort to maintain and iterate on.
It's also probably impossible, in any domain, to remove a coder's ability to write bad code. If for no other reason than that in any programming environment you can never stop someone from iterating by index over a binary search tree to find a specific element, or from turning a float into a string to truncate the decimal part and then reinterpreting the result as an int. But if you don't give them the tools - the integer types, the data structures, access to the bytes in some form or another - you aren't really programming. Someone else did the programming and you are just trying to compose the result. A lot of businesses, like I said, can be sated by that, but it's still not programming unless you are in a Turing-complete environment, and anyone in such an environment can footgun themselves.
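To make the two footguns concrete, here's a hypothetical sketch (names and numbers are mine, not the commenter's) showing the linear scan over already-sorted data next to a proper binary search, and the string round-trip next to a direct truncation:

```python
import bisect

sorted_keys = list(range(1_000_000))

# The footgun: O(n) linear iteration by index over sorted data.
def find_linear(keys, target):
    for i, k in enumerate(keys):
        if k == target:
            return i
    return -1

# What the stronger coder reaches for: O(log n) binary search.
def find_binary(keys, target):
    i = bisect.bisect_left(keys, target)
    return i if i < len(keys) and keys[i] == target else -1

# The other footgun: truncating a float by round-tripping through a string...
def truncate_via_string(x):
    return int(str(x).split(".")[0])

# ...when a direct conversion already truncates toward zero.
def truncate_direct(x):
    return int(x)

assert find_linear(sorted_keys, 999_999) == find_binary(sorted_keys, 999_999)
assert truncate_via_string(3.75) == truncate_direct(3.75) == 3
```

Both versions of each pair "work", which is exactly the point: nothing in the environment stops anyone from writing the first one.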
Eventually it has to change (imo), either through companies becoming more scrupulous in their hiring, or through a massive flood of new devs.
There will be a day, but when is hard to say. Thinking it's right around the corner is akin to the belief that we're on the cusp of true AI. We're more pessimistic today than we were in the mid-80s. And non-programmers were programming with HyperCard, FileMaker Pro, VBA, etc., back in the 90s.
There are of course former well paying jobs such as old-school front end devs (html/css/sprinkle of js) that are largely commoditized, but that's a given considering the low barrier to entry.
The same highly productive nature per employee is found in virtually every other high paying industry, most of which have not seen the pay for the higher end of those in the industry fall over time.
With these jobs in particular, what I see is that the definition of seniority has shifted to 'knowing the latest tech'.
So a junior dev who's just got to grips with React has become a React Developer, and they are now relatively senior in that field. The experience isn't transferable to other parts of the software stack though, it's too heavily tied up in the browser. So they end up as a super-specialised frontend dev.
It'll pay pretty well until the tech becomes obsolete, unless that kind of person enjoys maintaining legacy code.
On the other hand, if you think the software industry has a hard time figuring out (at hiring time) who the high performers are... science is driven by serendipity. Nobody can predict who will find the billion dollar discovery. Not even past performance is a reliable indicator.
So it makes sense to me that the salary spread in science is relatively even. If they could reliably figure out who to dump money on, they would. On the other hand, the FAANG companies clearly believe their hiring practices can select out the high performers... and perhaps they are right? If they're paying 3-4X what everyone else does, they expect to get at least 3-4X the value.
The selection process seems to do a good job of keeping out the lowest tier at least, although we openly acknowledge that we miss a lot of good people as well.
I was motivated because my older brother, and my mom, had already learned how to program, and they were quite excited about it. After getting past a few familiar conceptual hurdles, it became very easy for me to learn programming myself.
People who are only motivated by the money, or under pressure from others, have a harder time, because their curiosity and drive aren't activated. There's some sort of valve that lets the knowledge into your brain, that has to be opened.
For the most part, the people I know who seem to be motivated by money itself are not so desirous of getting rich per se (many are already rich), but are actually interested and curious about money in the way that I was curious about programming.
I don't program for a living today, but my ability to program is definitely a force multiplier for my work. It has either improved my earnings, or improved the continuity and longevity of my career.
May I ask what domain you are working in? Can you give some examples of how you've slipped programming knowledge into other job tasks? I love to hear people's anecdotal problem/solution approaches. Was the programming side of it actually slipping in some VBA/Chrome extension/JavaScript, or was it more of an 'analytical' approach taken to a business decision?
But that is no definition of a bubble. A bubble, at a very basic level, means there is a lot of capital flowing in; it has little to do with how difficult your job is.
Fat compensation packages keep programmers from realizing this.
When I took the ASVAB in high school I scored a 107. That score is too low to become a warrant officer so I had to retake it a couple of years ago for my officer packet and I scored a 129 out of a maximum 130. That puts me in the top 0.1% of testers. I am not smarter or more intelligent than when I was in high school. I do write software though. Every couple of years I look back on my software and algorithms realizing how I continue to improve and see the solutions more clearly.
https://en.wikipedia.org/wiki/Armed_Services_Vocational_Apti...
The best teacher has been helping colleagues. I have been programming a lot better (fewer errors, sometimes even without running the application when I'm pretty sure), because I think and analyze more upfront than I used to.
Some things come back, but it's rarely related to me (e.g. last-moment spec changes).
I do have to watch out; I notice that basing my code on someone else's is OK, but they always have faulty code in hard-to-test areas. So making testing easier on things that are hard to test is my next motto.
Also, helping others is a pretty huge timesink :(
Ps. Being in the zone does wonders lately
PS2. There was another thread about VideoLAN yesterday, and nobody had heard the entire story about HTTPS. I gave the VLC developers the benefit of the doubt, not knowing everything about their infrastructure. A lot of the comments here on HN disagreed with me (still silently upvoted, though).
Today I saw a blog post about why..
It was infrastructure based...
I can't understand why I was practically the only one with another view on the subject in this community, where developers come together.
FYI:
Comments are in my history. Mostly on the videolan topic. It's all recent
The question the author posed was why programmers are paid that much even when some other paths would seem "harder", which seems valid. Sure, not all careers are supposed to be "harder" than programming, but they're not as easy as one would imagine either.
Though yeah at least for now I don't see the situation abating much. The demand is still going strong. Once the proverbial "flood" of the market happens from new grads, things might get worse. But still if you know what you're doing, you know all the right concepts and skills, you should be able to stay on top of the game. There has always been a saying that the irony of the CS degree is that many people who graduated with the degree can't program, while many who can program didn't need to do a degree at all. I doubt the influx of students trying to study CS would change this situation much. Coding bootcamps have been around for a decade yet they don't seem to change the market equilibrium that much.
My kids have mentioned that they might be interested in a degree in computer science, and I've encouraged them to combine that with a second area of specialization. Programmers are everywhere, but a programmer who also knows chemistry or biology or economics or art history or just about anything stands out.
Really? I have a physics degree with some experience in rocket science, but my most valuable skillset (measured by how much pay I can fetch for it) is plain old software engineering. I don't think I'd be able to leverage my area of specialization to exceed or even match what I can get from FB/LI/G as a generic software engineer.
That's how markets work. You get paid what the market will bear, not what you "should" make.
Starting your own company (not self employed contractor) gives a really good perspective on what it means to be owner and employee.
How much of the business risk of the enterprise is your top-flight programmer assuming? Are her decisions the only ones that make any difference to the increase in profitability as a result of her work? How direct is the line between back-room engineer, no matter how good, and profit?
The only case where 50% or near it makes sense is for a founder owner who is also the lead talent. Maybe. Because then they are also creating the business opportunity and assuming a big chunk of the risk.
Most SWEs I know are not making more than they bring in profit.
Agree
Consider that some of the most valuable companies in the world did not exist 25 years ago, and now they do, making real, non-bubble money, with software developers as their biggest asset. This isn't the dot-com boom. The attrition rates in CS programs are very high, and even then not everyone who receives a formal education ends up being a decent developer. Factory workers made good wages in the mid-20th century because companies made lots of money on an industrial boom. We now have a technological boom, and unlike factory work, the barrier to entry is much higher. So I don't see why it can't continue. Sure, the $300-400k salaries are high, but they are at top firms that are selective and competing for a finite pool of talented workers in very expensive areas.
This is nothing like the current environment, where there is a consistent demand for programming talent which outstrips the supply, and the industry as a whole is far more stable.
Of course, a global economic slowdown could put a damper on salary growth, but it will not be a salary "bust" like what happened after the dot com bubble.
A quick Google search finds that law school students have around a 14% chance of making BigLaw (the legal equivalent). The odds of getting into medical school are between 2-5% on average. So no, I don't think we're in a bubble; the majority in the situation described would simply end up in the elite compensation class elsewhere as well, maybe with even better odds.
For comparison, at Amazon, Senior and above engineers account for ~20% of the total, and those are the ones regularly pulling $300k+. So only the top 20% of one of the top companies are getting such compensation.
And to follow the article, this won't last forever. Whenever the next stock market crash comes, almost half of that compensation (the equity-based part) will all but vanish. But maybe in the next bull market we will see a similar situation (remember people in the 90s making $250k?).
You're saying the value of (for example) Google stock will plummet to 10% or less and then not recover at all over the following few years?
Working for a big company that pays well does not mean you're at the top 1% of software engineers. It means you're willing to do what it takes to secure that job and maintain it, including moving somewhere many don't want to live.
That's... really low. FB has around that many, Google has something like 50k, and from what I've heard Amazon has about the same number or more.
Considering that everyone I know got >= $100k base out of undergrad regardless of location I don't think it's as uncommon as you would think anymore.
Does this include the now >50% TVC "non-headcount" as recently reported?
The US has anywhere from one to four million software developers, depending on your source. The BLS lists 1.2 million US software developers, with a $103,560 median pay (excludes benefits) in 2017.
You have closer to a 5% to 10% chance of earning $300,000 in total compensation as a software developer in the US, at some point in your career. Frequently high incomes don't last, there's a relatively high turnover because peak earning power only lasts so long, layoffs happen, specialization changes, job changes, et al.
The giant caveat to this, as everyone here knows, is you have a <1% chance of earning that outside of a small group of markets (ie it's very much not evenly distributed; if you're in New Orleans or El Paso you have almost a zero shot at it; if you're in SF or NY you have a legitimate shot at it).
Sure, it's not "easy" to get into those companies, but it isn't an outlier to get into them either.
The simple reality is this:
If you are an engineer at a publicly traded tech company, it is customary to get RSUs and Refresher RSUs. These have compounding effects as their vesting schedules start occurring in parallel. By the end of your second year you will have two series of shares unlocking, and this is in conjunction with your salary increases and bonuses.
You should expect and negotiate your RSU grants to be proportional to your salary. Competing offers from other publicly traded tech companies ensures this.
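The stacking effect of refresher grants can be sketched with a toy model (the dollar amounts and even-vesting assumption are mine, purely illustrative):

```python
# Toy model of how refresher RSU grants stack: each annual grant
# vests evenly over 4 years, so schedules run in parallel.

def yearly_rsu_income(initial_grant, refresher, years, vest_years=4):
    """Value of RSUs vesting in each year, assuming even vesting
    and one new grant at the start of every year."""
    grants = [initial_grant] + [refresher] * (years - 1)
    income = [0.0] * years
    for start, total in enumerate(grants):
        for y in range(start, min(start + vest_years, years)):
            income[y] += total / vest_years
    return income

# Hypothetical example: a $200k initial grant plus $100k refreshers.
print(yearly_rsu_income(200_000, 100_000, 5))
```

With those made-up numbers, annual vesting income rises each year through year 4 as the overlapping schedules compound, which is the effect described above.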
If the share price has also increased - which is basically all it has done over the last decade - this is enough for a lot of people to quit.
The article did not talk about share price increasing.
Real estate prices are high, because the Bay Area is a very desirable place to live, there are people who can and will pay the high price, and the supply of housing is low.
Salaries are high because big companies are fighting for talent (demand), and the supply of talent is low.
You aren't being generous. The number of companies paying people that well is on the order of dozens, maybe 100.
Every single FAANG company, every single unicorn, all the hedge funds, and a few successful non-unicorn companies based out of SF and the Bay Area.
I think the reason these compensations get that high is because it does take 5-10 years to become good enough to lead a team that manages something so complex and to do it well so that the results are reliable and consistent. I think the difference with doctors and lawyers is that we're not licensed to practice. We're not a capital-P profession. However we still have to attend conferences and stay relevant but the expense and requirements to do so are on us or the companies we work for: there's no professional obligation to do so.
I don't think we're in a programming bubble if the author means we're in a compensation bubble and that programming is over-valued.
I think the real bubble is complexity. We're seeing a deluge of security breaches, the cost of software running robots in the public sphere on unregulated and very lean practices, and a lot of what we do is harming the public... though by harm I don't necessarily mean only harm to human life -- but harm to property, insurance, people's identities, politics, etc... and we're not accountable yet.
If anything I think we need to up our game as an industry and reach out for new tools and training that will tame some of the complexity I'm talking about... and in order to do that I expect compensation to remain the same or continue to spread further out and become the norm. Being able to synthesize a security authorization protocol from a proof is no simple feat... but it will become quite useful I suspect.
As a programmer myself, yes. There is no way I can continue to make this much. I'm a dumbass. (I say this having worked on, in the last year, a compiler, trading algorithms, and 3D object analysis)
The way I see this playing out is, something like behavior-driven development (BDD) where the business folks describe the functionality they desire, and programmers write up the backend logic. Then as AI progresses to AGI, a higher and higher percentage of that backend code will be generated by machine learning.
So over the next 10 years, I expect to see more specialization, probably whole careers revolving around managing containers like Docker. There will be cookie cutter solutions for most algorithms. So the money will be in refactoring the inevitable deluge of bad code that keeps profitable businesses running.
But in 5 years we'll start to see automated solutions that record all of the inputs, outputs and logs of these containers and reverse engineer the internal logic into something that looks more like a spreadsheet or lisp. At that point people will be hand-tuning the various edge cases and failure modes.
In about 10 years, AI will be powerful enough to pattern match the millions of examples in open source and StackOverflow and extrapolate solutions for these edge cases. At that point, most programmers today will be out of a job unless they can rise to a higher level of abstraction and design the workflows better than the business folks.
Or, we can throw all of this at any of the myriad problems facing society and finally solve them for once. Which calls into question the need for money, or hierarchy, or even authority, which could very well trigger a dystopian backlash to suppress us all. But I digress.
Let's say, a bug appears. If the internals are produced by machine learning, chances are it's basically un-freakin-fixable from the high mountains of the spreadsheet/lisp interface. So someone has to dive in, and do it by hand. I doubt the business folk will do it, they won't know where to look!
The result, seems to me, is a metric-ton of machine generated code that now someone has to rewrite. Better hire a team to do it...
And that Lisp code will look something like: https://groups.google.com/forum/#!msg/comp.lang.lisp/4nwskBo...
(Unfortunately, Lisp neither makes you smarter, nor a better programmer, which seems to be a very profound, ego-wounding disappointment for a lot of people who try to dabble in Lisp programming).
Now programming-by-spreadsheets, on the other hand, is a real thing, that is almost as old as Lisp, and is called "decision tables." It was a fad that peaked in the mid-1970s. There were several software packages that would translate decision tables to COBOL code, and other packages that would interpret the tables directly. I think decision tables are still interesting for several reasons: they are a good way to do requirements analysis for complex rules; the problem of compiling a decision table to an optimum sequence of conditional statements is interesting to think about and has some interesting algorithmic solutions; and lookup table dispatching can be a good way to simplify and/or speed up certain kinds of code.
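A minimal sketch of the decision-table idea (the rules and names are a made-up example, not from any of the 1970s packages): conditions map to actions through a lookup table instead of a nest of conditionals.

```python
# A decision table: each combination of condition outcomes maps to
# exactly one action. Hypothetical rules for order handling, keyed
# by (in_stock, is_priority).
table = {
    (True,  True):  "ship_express",
    (True,  False): "ship_standard",
    (False, True):  "backorder_notify",
    (False, False): "backorder",
}

def decide(in_stock: bool, is_priority: bool) -> str:
    return table[(in_stock, is_priority)]

assert decide(True, False) == "ship_standard"
```

One nice property of the table form is that completeness is checkable mechanically: with two boolean conditions there must be exactly four rows, whereas nested ifs can silently miss a case.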
What is not interesting at all is the use case of decision tables for "business rules." A few of the 1970s software packages survive in one form or another, and I have not heard anything good about them. And the problem is very simple: the "business folks" generally do not know what they actually want. They have some vague ideas that turn out to be either inconsistent, or underspecified in terms of the "inputs," or in terms of the "outputs," or have "outputs" that on second thought they did not really want, and they (the "business folks") never think about the interactions that several "business processes" of the same "business rule" might have if they take place at the same time, much less the interactions of different "business rules," etc.
AI cannot solve the problem of people not knowing what they want or are talking about. Machine learning on wrong outcomes and faulty assumptions is only going to dumb systems and people down (IMO this is already obvious from the widespread use of recommendation systems).
About a decade ago, I created a piece of software that made a department of 10 people redundant. The company actually tried to make use of them, but they were so happy with the improved productivity that they basically kept the dead weight on, for the most part.
Last year, I did this at a financial company and eliminated an entire team. They did not keep the dead weight.
I may not be so lucky to avoid a future mercenary like me. Hopefully that explains it! I rely 100% on my creativity and out-of-the-box problem solving ability to solve real problems (note: not imaginary interview riddles). So far, the living I've made doing this is good.
- Just because a value is high and may very well go down, doesn't mean it's a bubble. FAANG are making real money from those workers, not just inflating an asset and selling it to other investors.
- Just because it doesn't involve long hours, doesn't mean it's not hard. A lot of my college colleagues really struggled, and many more didn't even get in. Don't discount natural ability - in the land of the blind, the one-eyed man is king, even if seeing is effortless for him.
But will market forces correct the above average salary? I think so.
More young students than ever are learning to code, which is naturally going to increase the labor pool. The supply of software engineers is going to go up in the next 10-20 years (as will the demand, though! But I still think supply will outpace). This seems like it would mostly affect new hires, as some 15 year veteran is going to have valuable experience that (most) companies will always be willing to pay for.
It feels like finance to me. People who got there early made a killing. Then salaries, while still pretty high, fell considerably as everyone rushed there to get rich.
As a counterpoint to the author's comment on doctors (maybe not lawyers; there are still plenty of law students in the pipeline): it does appear that the number of people pursuing medical degrees is decreasing, so I would predict their salaries will jump considerably in the next 20 years.
And last, as a total aside: I have exactly one friend who skipped college and over the last 12 years worked his way up the electricians' union, and he now runs his own small business doing residential electrical work. He's making more than most of our social circle. He doesn't know many people his age doing this type of work either, so as the old guard retires, he's going to charge whatever he wants.
Former quant dev, ie straddling the industries.
The number of people who study CS or can learn it on their own is surprisingly limited. You won't get a job just by knowing how to write some if statements.
By contrast, there are loads of history majors in finance. There are plenty of ways to act like you understand it. Plenty of bullshitters. Also, the role of luck makes some of them seem smarter than they actually are.
Engineers who can't code will be uncovered sooner or later. It's hard to know beforehand at an interview, but it's a lot easier to discover with that person present for a few weeks.
It turns out the firm he went to work at was run by a family friend.
At least Tech has always seemed fairer to me, with less of the 'old boys network' than other fields like Finance and Law.
More young students than ever are being taught the utmost basics. But is it true that more people than ever are pursuing it in the sense of seeking professional mastery over the craft? A proxy for this might be population-relative CS program enrollment and graduation rates. It would also be interesting to know how this is scaling compared to the overall volume of programming labor demanded, which is surely growing as well.
I remember when I was there, the CS classes were for NERDS and now here we are, everyone wants in.
I could imagine the same thing happening for programmers over time, if it hasn't already.
How that is supposed to constitute a bubble is beyond me.
Then, payment for work isn't (or shouldn't be) about (perceived) equality, and it isn't about compensation for suffering or even mere inconvenience either. It's about the value created by the work, which is why the work being hideous, or the hours being insanely long, shouldn't be contributing factors that justify a high salary. Now, this of course is an idealistic notion. Often work is still valued in terms of time spent instead of value created. Moreover, in the case of some not-so-well-paid but still important jobs, the salary attached to them is often only tenuously related to the value created.
All that said, while probably few professions will be able to match the leverage and hence the value created through software development in the near future, perhaps the more reasonable hours and better work environments in software development should serve as a model for other work environments and industries rather than as an indicator that something's amiss.
100% agree
1. The amount that a company can sustainably pay you depends on how much value your efforts have in its industry. Software has significantly higher margins than medicine, law, or almost anything else. As long as those margins are sustained, it is possible to pay high salaries. (As Buffett says: I'd rather work for a mediocre company in a great industry than a great company in a mediocre industry.)
2. But companies don't want to pay high salaries, so they will look for substitutes. Substitutes could be technology (RDS instead of DBAs) or increased supply of quality programmers. So far, it seems like these big companies haven't been able to find good enough substitutes to force down wages.
3. If you are looking at FAANG salaries, you are looking at the top of the income spectrum. The top of the income spectrum for lawyers and doctors is quite high.
4. Market economies reward value (outcomes) not merit (hard work). Hard work is correlated with outcomes, but it is not always a perfect correlation. So, looking at programming and saying it is less 'hard' than law or medicine doesn't say much, the question is how much value the person can generate, and how much of it the company can capture.
Comparing that percentage, you would see the same pattern in every other industry. If you do something uncommonly good, you make more than everyone else in the industry. Same applies to all other skills that are mentioned in the article. If you pick the top earners in law, medicine, etc, you will easily find higher numbers.
That is absurdly false.
To be generous, let's pretend he's only talking about FAANG programmers, rather than startups. According to Glassdoor, average compensation for "Software Engineer" positions:
Facebook: 121K Base + 15K Addl = 136K
Apple: 122K Base + 10K Addl = 132K
Amazon: 103K Base + 20K Addl = 123K
Netflix: 121K Base + 20K Addl = 141K
Google: 124K Base + 20K Addl = 144K
These numbers are slightly inflated (by 1k to 20K/yr) because I counted base and additional pay independently, while Glassdoor calculates an average total combined comp, which is always lower; I'm taking the larger numbers to give the author the benefit of the doubt.
Since Google is the highest, let's drill down on them a bit: With 1yr of experience you're looking at 121K+16K (or 125K total claimed, not doing my independent math thing), with 15+ years experience you're looking at 140K+24K (or 161K total claimed).
Again according to Glassdoor, to break 300K at Google on the average you need to be a Senior Staff Software Engineer or higher, which is exceptionally rare -- bit o' Googling suggests they're around 1% of the workforce, give or take.
In other words, the author is literally calling "The 1%" the Rank and File of programmers. Which is an exaggeration, I think.
NB: What Glassdoor doesn't account for is past stock performance; since SV traditionally pays a large chunk of its compensation in restricted stock units, and since the stock in each of these companies has increased by 50% to 150% in the past 4 years, by the time those stocks vest your "additional" compensation may have grown quite a bit due to market performance. But that's not a compensation bubble, it's just the stock market.
The Netflix data for example is just flat out wrong. It only hires experienced engineers, and pays all cash salaries in excess of 300k to all of them.
-Marlo, from The Wire
As other commenters have pointed out, the data you're citing is completely wrong and/or woefully out of date. There are always four parts of compensation that you need to look at:
-Base
-Target bonus (depending on your level could be 15%-20%+)
-Initial stock grant
-Refresher stock grant (offered each year to protect you from a cliff)
An incoming SWE with 1yr experience at Google would scoff at a total compensation of $125k (this would normally be the bare minimum base salary). Same goes for Facebook. And while Netflix isn't known to hire junior engineers, it is well known in the bay area that they usually pay top of market rates fully in cash; this routinely exceeds $300k and much more so for seasoned engineers.
Edit: I'll add that all the senior engineers (lifetime position for most people) that I have worked with are indeed making 300,000+ a year.
Crowdsourcing personal data plus data from friends: median total comp (base + stock + bonus + refreshers) for a CS undergrad degree was ~$160k in high-cost-of-living areas. Competitive students (a few internships at FAANG or similar) get offers in the $180-200k range.
It's a pretty common conception within industries with high levels of pay disparity.
Inflation: a $180k starting salary is equivalent to a $100k salary in 1990 dollars. (The 1990s are probably the period when many of us 30-year-olds first started hearing about salaries and learned that a "six figure" salary was impressive, with $100k being the absolute bottom of a six-figure salary.)
Productivity: for every hour worked (on average) the American worker is about 50% more productive - when compared to 1990. That takes $180k to $270k. [1]
Scalable impact: increases in worker productivity are not distributed evenly. Let’s say a programmer is contributing double what the average worker is to the increase in productivity. This brings us above $300k and if we look at the real numbers maybe closer to $500k or more in value compared to that 1990 worker bringing in $100k of value.
Thoughts? Poke holes in this but it adds up in my mind. Now this doesn’t prove companies won’t find a way to under pay programmers but it at least makes me feel less surprised salaries are so high. And sad that for other industries workers are getting the short end of the stick compared to how workers were compensated in the previous century.
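Sketching that arithmetic as a quick back-of-the-envelope script; the 1.8x inflation, 1.5x productivity, and 2x contribution multipliers are all the comment's own assumptions, not authoritative figures:

```python
# Back-of-the-envelope version of the parent comment's reasoning.
# All multipliers are the comment's assumptions, not official data.

baseline_1990 = 100_000      # an impressive "six figure" salary in 1990

inflation_factor = 1.8       # comment's claim: $100k in 1990 ~ $180k today
productivity_factor = 1.5    # average worker ~50% more productive than 1990
programmer_multiplier = 2.0  # assume programmers contribute double the average gain

# Step 1: adjust the 1990 benchmark for inflation.
inflation_adjusted = round(baseline_1990 * inflation_factor)       # 180000

# Step 2: scale by average productivity growth.
productivity_adjusted = round(inflation_adjusted * productivity_factor)  # 270000

# Step 3: assume a programmer's share of the productivity gain is 2x average.
programmer_value = round(
    inflation_adjusted * (1 + (productivity_factor - 1) * programmer_multiplier)
)                                                                  # 360000

print(inflation_adjusted, productivity_adjusted, programmer_value)
```

Under those assumptions the "equivalent" value lands above $300k, which is roughly where the comment's estimate ends up.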
In fact, as someone (I think it was Marc Andreessen) said, we're just at the "end of the beginning". Software is in the final stages of eating the media industry and will now progress toward heartier fare. This will be accompanied by further growth in programmer demand - specifically, programmers with skillsets which are niche right now.
Even as CS enrollment reaches an all-time high, I do not believe the supply of developers with appropriate skills will come anywhere near the demand in the next few decades.
So, no I do not think this can be called a bubble - except in the specific case of FAANG workers. They might see their salaries go down in the next years - but programmers as a whole are unlikely to have a hard time finding jobs with decent pay in the foreseeable future.
That's one way. The other way is to do something that other people are unable to do. Software engineering is highly dependent on raw talent / intelligence, and hard work is unable to bridge the talent gap in our industry the way it is in other industries. The high paying companies in our industry are just the ones smart enough to pay 2-3x compensation to get 10x programmers.
2) I do think there's a partial bubble, though. What I tell people is, programmers today are like literate people in the middle ages. You can do hard things with reading and writing, but reading and writing aren't inherently hard. It's just that they weren't taught to the masses back then, so you could be hired as a scribe simply because you knew those basic skills. I think once public education catches up (or affordable education; the bootcamps you see everywhere now are starting to close the gap), the basic ability to code will quickly drop in value. There will still be high value placed on skilled programmers (the poets and technical writers of the programming world) - though their salaries will probably see a bit of a correction too - but I think the bottom will drop out beneath them, and you'll have to do more than just learn JavaScript to be valuable.
You can sort of count on most organizations paying something resembling a cost of living salary (though even that is becoming less true), but after that everything, ever-y-thing, always, is market driven.
We get to have our cake and eat it too, not because it is harder to become a software engineer than a doctor, but because it also isn’t trivial to become one AND the demand for our talents keeps outstripping supply every year.
There are still lots of divisions at lots of Fortune 500s that are just now starting their journey in adopting software to solve their problems, and they all create market pressure somewhere in the supply curve.
Look at medicine. Doctors make less than they used to because cost controls (not very effective ones, but still) have been introduced into the system. So even though demand for doctors is high, a doctor can only generate so much revenue for a provider.
The demand for a service only roughly correlates to a market's ability to pay for it. There is a ton of demand for services in our economy that goes unmet, because the market can't or won't bear the cost.
Edit:
Also, the returns software can generate for an organization are simply astronomical in a way that even other engineering fields can’t rival. If an organization is embarking on a software project that they know will either make or save them 10s of millions of dollars they aren’t really going to bat an eyelash at 300k a year.
So is peak programming just in the valley or should I be worried as well?
I don't know the laws in Sweden, but unless they have an environment similar to California the tech industry is not going to be as strong.
That said, at least for me in the U.S., the biggest bump in pay has always been from finding a new job offer.
You don't make $400k if you work in an anonymous startup, even in Silicon Valley.
If there's any bubble then I don't think it'll affect the "value proposition" that we ultimately are.
This is from payscale.com:
Average Mid-Career Software Developer Salary in Stockholm $61,262
Software engineers are the people who can do the work, and so demand goes up. The returns (value) these ventures produce warrant continued competition in winning decent talent, and the price gets pegged in some range for some region in some category / industry.
These companies have actually devised a pretty decent scheme to select candidates, you gotta be someone either smart enough off the bat or dedicated enough to study to pass their interviews -- either way it's a VERY strong signal that you're going to be a solid hire.
Tech companies are going to continue being some of the most valuable companies in the world, if they aren't we have a lot of other pressing issues (like the decline of civilization) to worry about. Even amongst a downturn and/or recession I'm very confident that programmers at the FAANG type companies will continue to be paid very well relative to the rest of the population.
I’ve met people in marketing calling themselves programmers because they once wrote a PHP script.
I’ve overheard high school-aged kids at the Starbucks talking about the differences between a developer and a designer.
I’ve interviewed self-described “programmers” who struggle to write a for loop and “DBA”s who don’t know what the normal forms are.
My family, none of whom are the least bit technical, casually throw around words like “algorithm” and “server”.
The guy who works at the deli outside my apartment has asked me what my opinion on Python is.
We are definitely in a programming/CS/IT (whatever you want to call it) bubble - the level of general interest and enthusiasm for this stuff is unlike anything I’ve seen with other fields.
You mention other companies offered same salary but could not compete on equity. I'm assuming many of the big tech companies buy back their stock and then draw from this pool to give to new employees, which other companies cannot do, giving them an advantage in getting the best talent.
So are we in a programming bubble? In some sense yes, but it's being driven by low interest rates. I would guess that once rates normalize we will see a decrease in overall compensation from the tech giants as providing equity will become much more costly. However, moving forward the demand for strong information technologists will only increase.
In fact, even with a flooded market of programmers, the production of new tools actually affords programmers even more tools to work with and even more leverage for the business by employing them. So there remains a strong case for programming.
Mind you, the silicon valley area as a specific thing may well be a kind of bubble given how much more you can be paid there as a programmer compared to just about anywhere else in the world.
Since it's down, I can't actually read the content, but I too find no small amount of irony in a claim there is any kind of bubble in knowledge work, when the site said content is hosted on doesn't even appear to work.
Market multiples will go through ups and downs, and they’re high now for the major tech companies, but it’s not entirely surprising that the major companies pay well.
Also, don't forget, FAANG employees are the exception not the rule. Those are probably the top 5% in terms of engineering compensation, so it shouldn't be used as an average case for software engineers. You have to be a top 5% engineer or at least top 5% lucky to be in one of those companies.
And, let's not forget 400K isn't so much when your employer chooses to be in a location where housing costs an average of 3M to 4M and day care costs 30K per year (if you're lucky enough to get it).
When you think about it this way, FAANG pay isn't truly a bubble, it's more of an effect of their globalization. In that sense, I don't think they would "pop" per se on a timer (although they may eventually pop if their business falls over for one reason or another, like how AOL/Yahoo/Pets.com etc. did back in the 2000s).
It's also a little bit unsettling that when thinking about it this way, it's somewhat of a transfer of wealth from people across the world, to a small set of people in Silicon Valley. True, the companies are providing something of value to the world, so maybe it's not so much a transfer, but the end result is still that money was collected from people in other countries, and paid to SV engineers/employees.
Again note that I may be completely wrong in my framework of thinking. Just something I thought up while reading the article, no reference or research on any of the claims.
And when talking about just the top-tier tech companies, of course the economy around them is bubble-like. That is the danger of allowing them to maintain nearly monopolistic reign of their markets.
A real estate bubble is visible when the apartment ads say "good ROI" instead of "nice place to live". Crypto bubbles appear when people talk about market cap more than they actually use the currency.
But right now programming is in an optimal situation. It's hard to do, making supply low. Demand is very high; while we have all kinds of startups there are more to do. That creates high prices. IIRC FAANG makes about $400k per employee. In those situations paying $200k for a senior is far more sustainable than other tech fields which operate at a much smaller margin.
The programs created make the companies that hire the programmers extreme profits, in a way that scales by machines to serve an arbitrary amount of customers.
Even small maintenance changes make millions or tens of millions, but when they are done wrong they bring down systems and cost billions. That is why skilled craftsmen are worth their weight in gold.
Doctors only book so many patients a day. Lawyers are also working for a limited number of clients.
An app I did myself (on top of a FOSS base), released independently, was downloaded millions of times within a year.
Google back in 2011 had one billion MAU. In that context, a $150k+ a year salary (plus equity etc.) for helping provide the content for those one billion MAU is plausible.
Also Google is a web intermediary for web sites with content for consumers. So Google benefits from the work of anyone who puts up a web site, contributes to Wikipedia etc. as well.
1. Equity: Most high "salaries" are more a sharing of the profit or stock than mere salary. Lawyers become partners and thus share in the profit the firm makes (same goes for consultants).
2. Supply and demand: There are way more positions than people willing to do the work.* So the ones who do can pick 'n choose their employer. Thus work conditions improve, as do salaries. Most other professions mentioned are high-stakes jobs, but with more people willing to do them. * * So once you're in, it is still possible for the employer to demand hard work, because you're more replaceable.
3. Scalability: The work of a doctor is limited to the number of patients. Lawyers can do large cases (then they get paid much), but most of them will do work that affects a limited number of people. Code (especially at the big 5) reaches literally billions of people. So the value you can possibly add is huge.
* I think it is still kind of weird how many people don't even try to learn how to code (or scripting) while working in excel all the time, it's a magical barrier, few dare to cross.
* * I don't know why, but I would guess it's because (i) the sciences are viewed as arcane or simply unreachable for many people (self-claimed "I just can't do math"), and (ii) those jobs tend to have more social status (think Suits for lawyers & Grey's Anatomy for doctors, to name a few).
I don't think Mechanical Engineers back then were managed by Project and Product Managers though. They had much higher status.
On the one hand, I have outrageous impostor syndrome - nothing I do seems terribly complicated to me and I definitely don't put in 80 hour weeks unless I'm on call; many people I work with seem to be smarter than me.
On the other hand, I do usually acquire a reputation of being very productive everywhere I work, and having freelanced before I've seen loads of outrageously bad code - literally every other project I was "rescuing"/"optimizing performance" with an existing codebase could supply thedailywtf with material for a month or two, and some could just be zipped and published there. So it seems like the bar is (was? that was a long time ago) actually pretty low.
Also, it's not necessary to be 3x as productive to make 3x as much - if there are 1500 NFL players, then the best (more or less) 1500 football players are going to get NFL salaries, whether they are "3x" the next guy or 1.3x the next guy. If there are 100000 openings in companies that can make $1m per engineer, 100000 engineers are going to be rather well paid, even if the IBMs of the world can only make $500k per engineer and thus don't pay so well.
On the fourth hand, I cannot see the market not responding, like it demonstrably did with lawyers (I have a couple of lawyers in the extended family), with an oversupply of labor for such lofty salaries - especially as, unlike with doctors and to an extent lawyers, there are no medieval guilds in place artificially restricting said supply.
On the fifth hand (sortof a duplicate of the third), as some have said above, and as with lawyers, it's still possible to have elite earners with oversupply of labor.
On the meta level, I'm not good at predicting the future (and I grew up in a chaotic country), so I'm considering this a unique streak of luck and treating it accordingly. In particular, saving a lot for thinner days ahead should they materialize; not buying as much house as I can afford; and postponing the two-year sabbatical I've planned to take, while the going is good. The way I see it, if the luck never runs out I can retire early; if it does, at least I will be somewhat prepared.
I'm coding for almost 20 years now and used to think the same: "when will the bubble pop?", then it was "how long can this go on?", until I ended up with "ok, that's weird, what the heck is keeping this bubble from popping?"
I was told throughout these two decades that I’ll be replaced by some offshore developer from India within the next 5 yrs max. It never even came close.
To me it was against common sense, to climb up so much the social ladder like I was fortunate enough to do, just by being exposed to programming.
Then, I saw a talk given by Uncle Bob where he spoke about the history of software development. It was interesting throughout (like that the rate of female programmers was almost 50% before the dawn of CS degrees), but he also touched the “bubble” issue briefly.
He turned my perma-bearishness into ongoing curiosity: that we're not in a bubble but in an ongoing "software crisis". A crisis existing basically since the invention of programming languages. The theory has it that there aren't enough developers available and maybe never will be. Is it demand-side economics on steroids that's behind it? I don't know.
But what struck me maybe the most is that I had never even heard about this theory, and neither had any of my much more experienced and better programming friends and colleagues. The fact they didn't know kind of underlines the theory: there is so much demand that there is simply no time nor need to even tinker with the history of our craft.
Whether or not the best programmers actually get paid a lot depends on how efficient the market is, and I have no idea about that. When crap programmers are getting paid a lot, then it's probably a bubble.
* Good to great pay both attracts more people and makes employers supportive of policies to increase the labour supply.
* Cultural acceptance.
* Widespread access to computers and the internet. This was not the case just ~15 years ago.
* Low risk of automation gets it promoted by governments all over the western world.
* Low barrier of entry. I mean come on, let's not pretend that work on most apps/websites today necessarily requires a CS degree.
* Remote work, in all its glory, may however increase competition for jobs.
I'm just worried about how far this will go given the currently rather widespread anti-union sentiment - even after years and years of game industry bullshit.
The longer people treat it that way the more money they will continue to leave on the table for these Google/Facebook engineers.
You don't have to be a master programmer, just like you don't need to do four years in college studying English, but unlike English, most non-technical professionals still have absolutely no idea how a computer works even though the world revolves around it. If any universal second language is important, it's not English, it'll be a programming language.
It might be a huge bias but I would imagine a significant amount of productivity gained in the economy, and thus new wealth created, over the last couple of decades has been software driven. So it _should_ pay well, right? It doesn't really matter that it's considered easy or hard, just scarce.
I've heard it, perhaps jokingly, stated that more than half of software that gets built fails; it never finds a market or never meets completion. In that case high-salaries are also a good thing as it increases the funding, and thus social proof, required to start a new software project.
So, perhaps we have a way to go yet!
Being the engineer on-call when the website/service/app is down and the company is losing $xMM/minute while you debug the problem is the very definition of stress.
Some companies like Google cordon this responsibility off to SREs. While Amazon has started developing a similar job family, the burden of supporting critical services still largely falls to software engineers.
While software engineering is certainly not the worst job from a stress perspective, my experience (particularly with operations at Amazon) is far from the zen-like state the author describes. It can also be pretty exciting.
Of course, the precondition is that you should be worth your salt, but finally I have come to realise that salaries are dictated by market dynamics. Your compensation is based on how difficult it is to replace you.
Meh. People have been claiming that will happen since the first LISP machines.
Writing code isn't a technical challenge, it's a social one. Until self-writing code can figure out how to extract business requirements from the mind of a hungover MBA and turn them into a technical design that can be repurposed to meet what turn out to be the real requirements a year later (with zero overlap with what the MBA came up with), I think my job is safe.
The Web bleeped that up. Everyone hoped HTML5 would fill in the CRUD gaps, but it didn't. Thus, we still waste lots of time because the Web was not designed for CRUD, and shoehorning CRUD into it is like backing an 18-wheeler truck into a parking slot. In the old days you just pointed your sedan's steering wheel into the parking slot and DONE. JavaScript-centric UIs have proven too fragile. They're great for eye-candy, but not reliability.
Yes, I know there are some good Web stacks out there that can mostly overcome this, but they are rare and/or hard for managers to recognize. Their urge is to "keep up with the Joneses" even if the Joneses are doing something that doesn't help typical CRUD. Nobody can tell fads from good stuff.
One good invention and/or new standard could wipe out half of CRUD coders. I propose the industry experiment with a standard GUI Markup Language designed to do desktop-ish things out of the box and be more stateful. Mobile-friendly UI's are nice for mobile devices, but not good for regular office productivity. Desktops and mice still rule work.
Now, "is there a bubble"? Again, no. If we look at those numbers as hard deadlines, then let's see how many students are leaving college with CS degrees. The National Center for Education Statistics reports that only 60,000 bachelor's degrees were conferred in 2015 [2]. Enrollment is still at its highest point, but nowhere near the amount Sargent says we'll be needing by 2026.
I think the biggest concern to a programming bubble is the capacity challenges we face in training these new developers. If you look at historical trends, there is no dot-com crash right now. Maybe people getting scared because of data analytics, but nothing as severe as 2003's web companies over-estimating their worth.
Instead, I think this looks more like the bubble from the 1980's [3,4]. Roberts and Henn try to explain the cause for the drop in enrollment. Henn focuses predominantly on gender representation in the media making females feel like CS was "boy's only". Female enrollment is only now beginning to return. Roberts, on the other hand, looks at how institutes handled the increase in CS enrollment. Since there were only so many qualified instructors, colleges began to make harder and harder qualifying requirements for enrollment. Students, as a whole, got the point and sought out other degrees.
So "is there a bubble?" Maybe, maybe not; but if there is, it would be because there are not enough qualified instructors to meet the need.
[1] https://fas.org/sgp/crs/misc/R43061.pdf [2] https://nces.ed.gov/programs/digest/d16/tables/dt16_322.10.a... [3] https://cs.stanford.edu/people/eroberts/CSCapacity/ [4] https://www.npr.org/sections/money/2014/10/21/357629765/when...
If this continues unabated for the long run and we don't hit an intrinsic automation plateau, it can logically have two possible outcomes: either programming will become the main employment of humans and source of income, which seems unlikely, or the capital owners with the help of programmers will manage to make large swaths of the population unemployable.
Another feature of technological capitalism is that it highly conducive to monopoly rents, walled gardens and strong barriers of entry. The early mover advantage of Microsoft on the operating system market still earns it billions in pure rents. So competitors who would like to change and equalize the earnings distribution, for example by employing programmers in another country, will be in general unsuccessful.
I know it doesn't seem like that now, on the contrary, the tech industry seems like a wild west of endless possibilities and opportunity for all, but it's worth pondering if the apparent "tech boom" isn't really a massive economic shift for a very different, highly polarized and almost feudal society, with the programmers acting as the samurai, the sell-swords used by capital-owning rulers.
Programming is different from medicine or law. Programming, in terms of value, can scale. Doctors can only see a certain number of patients or perform a certain number of surgeries. A lawyer can only take a certain number of cases. They produce value based on their time spent on each of these tasks. That amount of value is somewhat fixed.
However, a skilled software professional can complete one project that produces a large amount of business value, or the coder can develop a new product that generates a long-tail stream of revenue over time.
For example, a coder that built the Google Ads Platform has helped generate a tremendous amount of revenue for Google, and that revenue keeps pouring in.
The bubble is real, but only for IT/SD professionals who get complacent with their skills. Traditional IT is getting abstracted away entirely. SD is getting more and more complex, as more frameworks and knowledge are needed every year to do modern software development. Building a simple monolith is no longer acceptable for many companies. One must learn modern distributed systems.
The sad truth is that it will continue to take fewer and fewer resources to accomplish the same business goals in the future. One must stay on top of the latest technologies to climb to the top of the coding pyramid. Luckily, this pyramid is starting to segment based on Front-End, Back-End, DevOps, etc., which means the truly skilled professionals in those domains will rise to the top and command salaries much higher than currently seen.
Self-driving taxis and trucks are another example of the potential of software to replace non-technical jobs. As automation and technology increase, it appears that the need for non-technical workers will decrease and the need for skilled engineers will continue to increase. The future is coming, and software / hardware engineers are its architects. Unfortunately, the chance that non-technical workers will lose jobs goes hand in hand with that.
*There are some caveats to that: usually people experience a slight drop in total compensation in their 5th year, and FB/G refreshers are much, much better than Microsoft's and Amazon's (Amazon won't even give you refreshers if the stock is doing well, like it has, because you are "overpaid"). Netflix doesn't do any stock stuff at all and instead pays everyone in straight-up cash, and yes, they do offer senior engineers $300,000+ a year in base salary.
Now when you get a promotion, you'll get a new set of stocks vesting over 4 years. If you get the promotion before the 4 years expire (usually the case) it will overlap with your initial grant.
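A toy sketch of how those overlapping grants stack up year by year; the grant sizes and the even 4-year vesting schedule are made-up assumptions for illustration:

```python
# Toy model of overlapping 4-year RSU grants, as described above.
# Grant sizes and timing are invented for illustration only.

def annual_vesting(grants, years=8):
    """grants: list of (start_year, total_value); each vests evenly over 4 years."""
    vested = [0.0] * years
    for start, total in grants:
        for y in range(start, min(start + 4, years)):
            vested[y] += total / 4
    return vested

# Hypothetical: initial grant at year 0, a promotion grant at year 2.
schedule = annual_vesting([(0, 200_000), (2, 160_000)])
print(schedule)
# Years 2 and 3 overlap both grants: 50k + 40k = 90k vesting per year.
```

The point of the overlap: during years 2-3 you collect tranches from both the initial grant and the promotion grant at once, which is why total comp can climb well above the headline offer.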
If an ML system is able to extract the intended structure from a block of code and somehow translate that into a known-good pattern you could see things shift. I see it as being somewhat akin to human language translation.
This is absolutely false. There was a lot of talk about "Microsoft Millionaires" in the 80s and 90s.
How many Apple Engineers received their stock refreshers in Q3 at a $220 per share strike price? (now at $150 share)
Like all equity components, the answer is market driven.
What may end up happening is the funnel narrowing at entry, that is the salaries for brand new developers collapsing, while only a select few make it to the top. It reminds me of professional sports.
Give it 10-15 years and the bubble is going to burst whether we like it or not. If you're not into machine learning or deep learning at that point you're screwed.
Change my mind.
Could this run on big companies be because of the end of Moore's law? That event brought some kind of stasis that benefits big established players, as the big guys are now less afraid of disruption (it was easier for upstarts to plan ahead when they knew they'd have two times the horsepower within a year). Is that right?
It is my opinion that we are not in a programming bubble, but in a venture capital bubble. The latter causes the appearance of the former. Let me explain the meat of how I think it works.
1) A well-to-do person decides to start a VC fund. This person recruits a few high net-worth friends and convinces them to invest a sum of money in the new fund. Let's say, hypothetically speaking, that this person gets $1m to play with in total from 10 individual investors.
2) This new fund does around 10 unpriced seed investments with its $1m (convertible notes). 50% of these companies do not go on to raise more money, and thus the money spent on them is written off. The other 50% go on to raise a priced A round, whereupon some bigger funds lead the rounds and mark up the price of each company's equity by between 50% and 150%.
3) Our seed investor is now sitting on equity roughly worth $1.25m. This person has made a 25% return on the capital under their management. They take a 5% fee off the top - for a handsome payday of $62,500. The investors are pleased.
4) Then, this thought crosses our brave new fund owner's mind: "Boy, I'm really good at this." So our owner goes out to a wider network and solicits $10m this time, with the intent to participate in A rounds instead of seed rounds. They cite their successful 25% returns in seed stage companies, and people scramble to hand them money to manage.
5) Goto step 2 (but increase the numbers and change up the preferred investment round occasionally)
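The cycle in steps 1-5 can be sketched with the comment's own hypothetical numbers (50% write-off rate, 150% average markup on survivors, 5% fee); all figures are illustrative, not data about any real fund:

```python
# Sketch of the fund cycle in steps 1-5, using the comment's
# hypothetical numbers. Purely illustrative.

def run_fund(capital, markup=1.5, writeoff=0.5, fee=0.05):
    """One cycle: half the portfolio is written off, the rest is marked up."""
    surviving = capital * (1 - writeoff)
    # For the equity to be worth 1.25x the fund, survivors must be
    # marked up 150% on average: 0.5 * 2.5 = 1.25
    paper_value = surviving * (1 + markup)
    fee_take = round(paper_value * fee)
    return round(paper_value), fee_take

# Step 2-3: $1m seed fund -> $1.25m paper value, $62,500 fee.
value, fees = run_fund(1_000_000)
print(value, fees)    # 1250000 62500

# Steps 4-5: raise a 10x larger fund on the back of those paper returns.
value2, fees2 = run_fund(10_000_000)
print(value2, fees2)  # 12500000 625000
```

Note that the "25% return" exists only on paper (the marked-up A-round prices), while the management fee is real cash, which is what keeps the loop spinning.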
The sums get bigger, the paper returns get bigger, and the management fees get bigger. But what brings this all to a crashing halt? Where does all that VC money come from?
I believe that this is all a consequence of the zero-interest rate environment in the United States throughout the last decade. People with assets have largely had no attractive places to put their money, so they were forced to chase riskier and riskier investments. Venture capital was the perfect target - the compelling narrative of technological progress makes for a feel-good investment avenue, and the general opacity of tech concepts to non-technical people makes the mystique that much more compelling. Plus - there's a mathematical strategy here! Why buy bonds and get a lower rate of return than inflation when you can chase unicorns? Why not take the probabilistic and "scientific" approach by investing in 100 companies - expecting 90 to die, 9 to do okay, and 1 to be a mega-success that makes your investment worth it?
The hunt for the fabled "unicorn company" is this economic cycle's equivalent to "housing prices are always going to go up." But the models work! you say. The success and failure probabilities are accurate! So were the models that led to mortgage-backed securities - if we bundle enough of these loans together, on average they will have to be profitable - right? The problem then, as now, is that such a model is only accurate when the broad macroeconomic conditions underlying it remain true. When you run out of money coming in, the music stops. Loan defaults started to spike and the MBS model fell apart. Likewise, I suspect that less money entering the VC world will presage the whole thing falling apart - and as interest rates climb, it's only a matter of time until debt is more profitable again and the easy-money faucet turns off.
I've ranted for a bit, so you're probably wondering - how in the hell does this relate to the OP's article? Let's now address the other side of the equation here - where does all that VC money go? Well, that money that was invested in all those failed startups (or the successful ones) gets spent on something. But what? It clearly doesn't all get spent on catered lunches and ping pong tables (much to the chagrin of our industry's critics). But it does get spent somewhere - and I'd hazard a guess that the most popular targets for that spending would be Facebook Ads, Amazon hosting, Apple hardware, Google Ads, Microsoft software, etc. The VC money gets spent on stuff the tech giants are selling, fueling the dramatic increase in their share prices over the past ten years.
Given that OP's article makes the point that compensation is huge and primarily driven by share price increases, it's easy to see how VC money could be actively impacting this situation. But as a gut check - if all of this talk of "US economic conditions fueling a domestic venture capital bubble" does have a grain of truth to it - what would you expect economic conditions for software engineers to look like in other parts of the world?
London: https://www.glassdoor.com/Salaries/london-software-engineer-...
Paris: https://www.glassdoor.com/Salaries/paris-software-engineer-s...
Berlin: https://www.glassdoor.com/Salaries/berlin-software-engineer-...
Interesting, right? Proximity to the nexus of Sand Hill Road does appear to have an impact. I'm no data scientist, but I imagine there's enough publicly available data about venture rounds and salaries that a more rigorous assessment of this general thesis is possible. If anyone knows of existing studies, please let me know.
The last push failed but businesses will adjust their conceptual models and try new things. Eventually, something will stick.
Heck, maybe the last round failed because the technology culture needed to develop more in different areas. If so, how long before local geeks do for their area what has apparently already happened in the States?
Demand is super high because of 2 things: 1. Tech companies are very profitable 2. There is a lot of cash in the economy (thanks to QE) a lot of which got invested in big tech and startups.
And supply of high quality senior engineers is not there yet.
When the wave of young kids who chose to do MIT instead of Harvard Law School gets older, things should get more in line.
At least that's how market theory is supposed to play out
In terms of job difficulty, I see no argument there: I can't imagine the stress a surgeon has during an operation; especially compared to a computer scientist such as myself.
Could the economy itself go down? I'm betting it will.
My take is that we currently have bullshit economy, and that the software industry is a big part of that bullshit economy, almost funded entirely by it.
Of course there's a lot of essential software, but if the bubble of bullshit software pops, then the industry as a whole will suffer a lot.
It is definitely the new normal.