I'm not sure programmers are much better. There's a long history of security vulnerabilities being reinvented over and over. CSRF, for example, is just an instance of an attack first named in the late '80s (the "confused deputy"). And why are buffer overflows still a thing? It's not like there's insufficient knowledge about how to mitigate them.
And blaming this on the market is a cheap attempt to dodge responsibility. If programmers paid more than lip service to responsibility, they'd push for safer languages.
If programmers paid more than lip service to responsibility, the whole dumb paradigm of "worse is better" would not exist in the first place. As it is, we let the market decide, and we even indoctrinate young engineers into thinking that business needs are what always matter most, and everything else is a waste of time (er, "premature optimization").
I used to think like this but I've come to realize that there are two underlying tensions at play:
- How you think the world should work;
- How the world really works.
It turns out that good technical people tend to dwell a lot on the first line of thinking.
Good sales/marketing types, on the other hand, (are trained to) dwell on the second line of thinking, and they exploit this understanding to sell stuff. Their contributions in a company are generally easier to measure than an engineer's, since revenue can be attributed directly to specific sales efforts.
"Worse is better" is really just a pithy observation about how the world works, and its acceptance is crucial to building a viable business. Make of that what you will.
How many hacks, data breaches, and privacy violations does it take for consumers to start giving a shit?
Also, any programmer will tell you that just because an issue is tagged "security" doesn't mean it will make it into the sprint. Programmers rarely get to set priorities.
There's a quote by Douglas Adams that pops up in my mind whenever this subject comes up:
> Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.
This is the only explanation there can be. Every time there's a breach somewhere (and there are obviously plenty), there's a big outrage. But those who should ask "could that happen to us, too?" choose to ignore it, usually with hand-waving explanations of how the other guys were obvious idiots and why the whole thing doesn't apply to them.
This goes for consumers and producers alike.
https://en.wikipedia.org/wiki/Say%27s_law
In other words, it takes a better alternative existing. "Better" can mean cheaper, faster, easier, a lot of things. That can be accelerated by the economic concept of "war" (i.e., any situation that makes alternatives a necessity).
The incentives for someone to break into a major retailer, credit card company, or credit bureau are much different from those for Widget Co.'s internal customer service web database. What I think the article is missing, even though it makes a lot of good points, is that if there's a huge paycheck at the end of it, there will always be someone trying to exploit your system no matter how well designed it is. And if they can't hack the code quickly, they'll learn to "hack" the people operating the code.
You are oversimplifying. Dunno in what programming area you work (or if it's software at all) but "we work with languages X and Y" is something you'll find in 100% of all job adverts.
Tech decisions are pushed as political decisions by people who can't tell a Lumia phone from an average Android. That's the real problem in many cases.
That a lot of irresponsible programmers exist is a fact as well.
It used to be that RandomBusinessApp would hit this stuff; now most of it ends up in Java, so it might still crash, but the failure is usually better contained.
Most programmers want to do their job quickly and easily, and go home.