I suppose you could make the argument that these programmers did not
know what they were doing. That they were simply given some specs, and
they implemented those specs, and didn't know that they were
accomplices in a case of massive fraud.
I think that argument is even more asinine than Michael Horn's. They
knew. And if they didn't know, they should have known. They had a
responsibility to know.
I agree with all the points in the article except for the point that the programmers should have known. For me it is a plausible scenario that the programmers were told that this feature was needed for some good reason (probably testing).
When I was a young engineer I had a mentor. He was a war baby and a strict pacifist. He was also very good and his advice was much sought after so he could afford to refuse all offers from the defense industry.
He once told me that for his whole life he managed to never design anything that could be used to harm people - except for one thing. When he was young he was hired to design a gear rim for a crane. He told me he was given the load specifications but never saw a drawing of the actual crane. That was a bit unusual, but nothing he worried about.
It turned out that the gear rim was actually for a Howitzer. He never worked for that client again.
There are all kinds of reasons why a car has to behave differently while on a dynamometer and there are all kinds of special code branches that are executed only during test. For the programmers it probably was just another special case among many.
Don't be evil and don't be a fool, but you can't be expected to do a full ethics check for every feature you are supposed to implement.
EDIT: Spelling, style and removal of some superfluous chatter.
That makes it quite a bit harder to believe that whoever implemented it thought it was for some legitimate testing. For testing you want a trigger that is hard for anyone to hit accidentally, but easy for people who know about it to hit. You would not include barometric pressure, because that narrows the ability to get into the test mode way too much.
An ideal sequence would be some nonsensical sequence of inputs, like a specific sequence of left and right steering inputs, with a specific sequence of turn signals (often opposite of the direction turned) if the ECU has turn signal data available, interleaved with a specific pattern of taps on the brakes.
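The kind of "nonsensical trigger" described above can be sketched as a small state machine that matches a fixed sequence of discrete driver inputs - easy to perform on purpose, nearly impossible to hit by accident. This is purely illustrative; every event name and the pattern itself are made up:

```python
# Hypothetical test-mode trigger: a fixed sequence of driver inputs.
# Steering one way while signalling the other, then three brake taps.
TRIGGER = [
    ("steer", "left"), ("signal", "right"),
    ("steer", "right"), ("signal", "left"),
    ("brake", "tap"), ("brake", "tap"), ("brake", "tap"),
]

class TriggerDetector:
    def __init__(self, pattern):
        self.pattern = pattern
        self.pos = 0  # how much of the pattern has been matched so far

    def feed(self, event):
        """Feed one (kind, value) input event; return True once the full
        pattern has been entered in order."""
        if event == self.pattern[self.pos]:
            self.pos += 1
            if self.pos == len(self.pattern):
                self.pos = 0
                return True
        else:
            # mismatch: restart, letting this event begin a new attempt
            self.pos = 1 if event == self.pattern[0] else 0
        return False
```

The point of a trigger like this is exactly what the comment says: barometric pressure narrows access to the test mode, whereas a deliberate input sequence widens it for insiders while staying invisible to ordinary drivers.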
Hypothetical example: The Lane Keeping Assistant can actively adjust steering. Turning the wheels during a test on the dynamometer can make the car jump off the rolls and harm people. The dynamometer is a highly artificial environment that can potentially confuse the Lane Keeping Assistant.
Do you ensure safety through testing guidelines or through safety measures in code? Would this be a plausible reason for a developer to write the dynamometer testing environment detection code?
I think the point is that someone deliberately did this and they had their hands in the code. Yes, there are variants of the tunable parameters for various regions and tests. As part of design and validation these can be used interchangeably on the test beds. However, someone, somewhere wrote the emissions defeat device.
There's a reason that teams designing these sorts of systems consult with lawyers who are experts on the relevant law. The programmer's job is to program. Expecting them to also deal with details of legality and morality (beyond grossly obvious things like hard coding dosage limits into medical equipment) is just wishful thinking.
That's the kind of talk that people want to hear: "Oh, the developers were given shitty instructions, they shouldn't have listened." But talk is cheap. Stop to consider the implications of that sort of second-guessing. Obviously things get wacky at both extremes, but when you give someone a spec to meet, you need to have an expectation that it will meet that spec. Our industry is built upon millions of black boxes that meet an I/O spec. Having the developers turn around and say "we changed your spec because it was killing polar bears" comes with a much larger can of worms than just implementing what you're told to implement, accepting that it might not be morally agreeable, and getting on to the next thing.
There's a reason people aren't all generic worker bees. It's more efficient to have the lawyers worry about laws, the coders worry about code, and the managers act as the interface between them - accepting the blame if what the lawyers say isn't properly translated into the programmers' instructions - than it is to have all three groups worry about all three subjects.
I think law is interesting and has a lot in common with software development, but I don't want to have to go looking up case law as required research before coding a windshield wiper controller.
We're humans, not robots. People can be expected to think about things and participate in society. It's generally held that we should expect pretty much everyone to concern themselves with details of legality and morality as part of being a good citizen... "I'm just a simple automaton doing what I'm told" is generally not a valid excuse.
Do you actually know programmers who literally just take specs and implement them and have no thoughts or opinions about the larger context of what's going on? In my experience, programmers have a lot to say about non-programming aspects of work.
The other issue here is, what constitutes "grossly obvious"? You just drew a totally arbitrary line based on your own opinion of what can be expected and what can't. Your argument is a bit of a strawman; nobody is expecting coders to go read up on case law.
Ultimately, we don't know anything about what happened at VW. We don't know who was responsible, or who knew what, and we're all just crafting up scenarios and speculation ("you see, the specs were such that the engineers couldn't possibly have known what was going on") based on our own experiences and biases.
Sadly, yes. I've found this to be the case with most outsourced developers I've managed. They follow the spec to a T even if there's a glaring issue staring them in the face.
I think an engineer has more responsibility than following orders. Especially a German engineer should be aware of this. "Ich habe es nicht gewusst" ("I didn't know") is only an excuse as long as it is true.
Also, I fully agree. There is well-known historic precedent for "I just followed orders" not to be a valid excuse. Plus, the consequences would likely not have been even remotely close to that for a soldier in WW2 (i.e. unlikely to be shot for treason).
OTOH using the programmers as scapegoats is wrong. Yes, those who carry out the orders are guilty. But the entire chain of command that led to it is even more guilty. And thanks to corruption, they'll likely only feel a fraction of the punishment the scapegoats will face.
I can see many reasons why software might be written, or maybe even configured, in a way that could be lethal when deployed to an actual customer, but have completely valid and sane reasons for existing (all manner of testing comes to mind).
Unless it can be proven that the developers had intent and did follow through, there is no particular reason why the blame should fall entirely on them.
Additionally, if he is so intent on having a "profession" that punishes wrongdoers, he should first call for one that protects its good members.
From the excellent Metafilter thread:
> i mean, how do the product managers rationalize this feature to their colleagues? what do they write in the spec that isn't all-out incriminating?
Modularity
Department 1:
Req 1: Software should enable emissions controls upon receipt of control signal A.
Req 2: Software should disable emissions controls upon receipt of control signal B.
Department 2:
Req 1: If EPA testing device is detected, send signal A.
Req 2: If EPA testing device is not detected, send signal B.
http://www.metafilter.com/153117/EPA-Accuses-VW-of-Emissions...
Without being able to see the actual code and requirement documents, all claims about them are pure, idle speculation.
Maybe it's ok in the Volkswagen software to have a knob that controls the amount of NOX in the exhaust, for testing purposes and for adapting the car to various markets. Maybe it's ok for the software to provide heuristics for the driving conditions (highway, city, dynamometer) for some future telemetry application. But the wise future programmer does realize it needs to be someone else than himself who makes the decision to configure the system to couple those two things together, and make the car reduce pollution only when dynamometer mode is active.
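That scenario - a legitimate NOx knob, a legitimate driving-condition heuristic, and an incriminating line of configuration that couples them - can be made concrete with a sketch. All values and names here are hypothetical, invented for illustration only:

```python
# Hypothetical emissions targets by detected condition (made-up numbers).
NOX_LIMITS = {"test_cycle": 0.08, "normal": 0.50}

def detect_condition(speed_kmh, steering_angle_deg):
    """Plausibly innocent heuristic: wheels turning but steering dead
    straight suggests a dynamometer rather than a road."""
    if speed_kmh > 0 and steering_angle_deg == 0:
        return "test_cycle"
    return "normal"

def nox_target(speed_kmh, steering_angle_deg):
    # The coupling below is the decision the comment says someone else
    # must own: the knob and the heuristic are each defensible alone.
    return NOX_LIMITS[detect_condition(speed_kmh, steering_angle_deg)]
```

Each piece has a plausible standalone justification; the one-line lookup joining them is where the ethical question concentrates.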
Good old shifting of blame works for the bad guys as well as the good guys. It may not be pretty but it works well enough if only you're willing to draw the line of responsibility somewhere for yourself.
It's not that easy. Sure, what they did can be considered "evil"... but what if they had refused to do it? They would most likely have lost their jobs, and they would have had no chance in a court trial. Volkswagen has an army of lawyers and is tightly connected with every relevant government agency in Germany.
The problem here is Uncle Bob thinks he's in the trenches when in fact he's armchair quarterbacking.
But the other possibility would be to "blow the whistle" anonymously.
The chances of getting away with that still aren't great (if VW put some effort into flushing out the snitch, I think only a practised liar could get through it...), but it's another way forward.
And actually: it's possible this actually happened, and the official story of how this was discovered is just a cover for an anonymous engineer who managed to get a warning to the right person.
What says the programmers knew?
That said, it is likely they did know, but this comes from above. There are a few psychology experiments showing that many humans will do things they know are wrong or immoral when an authority figure tells them to, even though they don't want to. The Milgram experiment, for instance, comes to this conclusion, among others.
Peer pressure and obedience to authority are real phenomena, and that starts with the leadership that needs to be held accountable. Hearing an authority figure pass the blame to someone at the bottom is disgusting and barking up the wrong tree, I believe.
While the "nice" and "obedient" always did as they were told? Then we're told never to hire "assholes".
"Assholes", in your context, would be people that don't fit the "culture" of the company's "group". Possibly off topic, but it's one of the reasons I always get a bit nervous when companies define their "culture".
"...it wasn't so long ago..." What?
https://en.wikipedia.org/wiki/Therac-25 -- This has been a thing since at least 1985 and probably far longer.
Even if the programmer was fully aware of what they were doing, VW would still be the only party that's legally and morally responsible for this.
(And may God help whoever made the Therac-25 mistake, just imagine making a bug like that)
Lacking more objective standards, the most likely result of attempting to regulate at this stage seems to be regulators who talk a good talk -- such as the author of this article. Those people will not necessarily be the ones with either the best ideas currently available for building good software or the most useful experience and/or data to advance the state of the art in the future.
I sometimes work on software that really does have to behave properly because significant failures in production really could be very damaging. The idea that some of the careful, successful processes used on some of those projects might be required by regulation/legislation to give way to the kind of junk that a lot of consultants peddle is quite scary.
Only artificially so. Modern software development is mainly about re-inventing wheels from the 1970s with slightly different syntax and more bugs in. If we had settled on a language - doesn't matter which: Ada, ML, C, Lisp, FORTRAN, they're all Turing-complete after all - and gotten on with, y'know, actually building things, software engineering would be a mature discipline by now. Instead, all the accumulated experience gets chucked out the window every time fashion changes.
That doesn't mean they are going to be the only person at the company that is responsible, but they signed off on it and are responsible for it in the context you speak of.
A software engineer would most likely not know the intricate details of the law required to know whether it was legal (in all nations) or not.