They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd, and target them for execution - unguided, of course.
I quit the next day. Those of us who make technology need to be very sure that it is not used destructively against the human species. The responsibility is very, very high, and the danger is extreme. These people were revelling in the fact that they could develop targeted assassination drones and sell them to any country in the world.
Heinous.
After about two months of work, once the production line had been parallelized and its speed drastically increased, I went to see it in operation.
What I saw shocked me, and I immediately quit. It was a production line for processing young and grown chicks, male and female: debeaking, throat slitting. I was absolutely shocked that none of my superiors had told me exactly which "product" was being handled.
After seeing the horrific product of my work I quit.
Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
I doubted the meaning of my work at university. What did I do? Spend four years at college to create killing machines? I never thought I'd do that.
I sometimes think this is why we don’t see any intelligent civilizations out there. Intelligence gives rise to deceitfulness and eventually, one selfish actor can bring down an entire civilization intentionally or unintentionally since the weapons get so powerful.
It always amazes me that those big entities exist, because they require such a huge pool of highly educated and skilled people. What are all those geniuses at the NSA thinking? I can't imagine somebody smart enough to work there is not smart enough to understand the consequences of working there. So why are they not quitting?
Social pressure and money are part of the equation, certainly. I remember when I turned down a Google interview, my close circle thought it was weird that I did that, even more so for ethical reasons.
Having the best toys, budgets and projects certainly helps as well.
But still, I wonder.
Maybe they don't have a problem with working on such projects, because they agree with their end goal?
“There are hardly any excesses of the most crazed psychopath that cannot easily be duplicated by a normal kindly family man who just comes in to work every day and has a job to do.”
— Terry Pratchett, Small Gods
Frankly the broad majority of people in the US benefit materially in some way from their existence in ways that they might not even be fully aware of.
I am not arguing this is a good thing, but I think it's a reality. Military power abroad means wealth accumulation at home.
Because not everybody thinks the way you do. That's why we have elections.
I'm sure disclosure would be illegal per your employee contract, but are there any other steps concerned employees can take?
This is never true. No government will allow its companies (and in this context that's how it's seen) to sell weapons to countries they don't approve of. That said, apparently the defense industry is often irresponsible at best.
What the poster meant was "just about any country" in the world. Which we've seen happen time and time again - the restrictions may be nominally in place (for a while), but eventually they get selectively lifted, and/or the companies find a way to circumvent them.
At times (and not at all surprisingly) with the assistance of certain government agencies chartered with the purpose of not only enacting precisely this kind of subterfuge - but perfecting it as an art.
I'm much more inclined to believe it's the latter.
Sure, you can argue that it's different because those are things the common man can get, whereas only the state can afford surveillance dragnets and drones, but it wasn't that long ago that only the state could afford computers.
Edit: Apparently I struck a nerve.
Technology transfers between military and civilian applications all the time. Propeller technology that helped now-obsolete submarines stay quiet is fine-tuned in a different manner to yield more environmentally friendly watercraft. A drone that can disperse insecticides on only the crops that need it can deliver chemical weapons with some slightly different fine-tuning.
A 1984-style (or 2018 UK, if you like) surveillance and law enforcement system could be used for tracking down corruption in government, suppressing dissidents, identifying insider trading, identifying human trafficking, etc. It all depends on who's using it. (I personally don't trust any government to properly wield that kind of power.)
The technology doesn't care. It's all how you use it.
Like it or not the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.
Shit, just look at the time Osama bin Laden could have been hit with a Tomahawk missile during Clinton’s presidency. Clinton didn’t do it because of the potential to kill a Saudi prince bin Laden was meeting at the time.
Would those angry Googlers be against surgically killing Osama? I think not.
Better drone software might help track a potential target and present the optimal window in which the target could be shot with reduced civilian casualties. It could also provide better intel to enable a surgical ground strike, which would put more American soldiers at risk but would allow for better intel and, again, fewer civilian deaths.
Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping. Are the Googlers opposed to rescuing the hundreds and thousands kidnapped by Boko Haram and company?
Who is going to go into the African heart of darkness to rescue those people? Is it the armchair Googlers who pretend to know better?
It would reduce collateral casualties per target attacked, which would make the drones easier to use with looser target selection criteria, which might both increase number of targets attacked and increase the number and ratio of incorrect-target-selection casualties.
The law of unintended consequences is most likely to sneak up and bite you when you only bother to consider first-order effects.
This is just throwing in an unfounded qualitative thought, not an actual empirical argument against precision weapons.
But I think it's also true that carrying out military operations (even precise ones) in unstable parts of the world helps violent extremists gain support among the broader population and does nothing to help alleviate the instability that gives rise to these extremists in the first place.
When I consider how few deaths there actually are from extremist groups operating in Western countries it makes me wonder if the scale of our response is really appropriate to the severity of the issue, and whether our actions aren't helping to perpetuate the very issues they're intended to address.
I think we should have gone down there after 9/11, punished (killed) everyone that was remotely affiliated with the terrorists, and left as soon as we did that. But that is a political action, not a military one. Our military budget would be far smaller if all the random bases across the globe were closed down and we brought the troops back home. If anyone wants to mess with the sleeping giant, they can quickly pull up records on what happened to Japan and the Axis back in the 1940s.
The article here is talking about developing military applications for better drones. The potential gains to be had from this are more than just better strike capabilities. Those who don't see the potential humanitarian and other activities that might benefit are being short sighted.
Can you guarantee our government won't initiate illegal aggression?
Won't subvert democracies?
Won't have another Gulf of Tonkin?
Won't target the families of terrorists, as our current President has suggested?
No, you can't guarantee those things.
Guns don't kill people, people do. And people are sometimes evil, and sometimes break the law, and sometimes make mistakes. And sometimes the gun is stolen. So maybe some engineers don't want their company to make any guns.
You don't get to ask, "Would those angry Googlers be against the technology always being used the way they intended?"
Instead you have to ask, "Is it possible for this technology to be used in ways those Googlers would object to?" And of course the answer is yes.
Many on the Manhattan project thought that bombing Nagasaki was completely unnecessary. Some probably thought Hiroshima was unnecessary, that a demonstration of the power would be sufficient.
Barring catastrophe, it’s close to 100% inevitable that militaries will become largely autonomized in the coming decades.
But I've never seen any good guys in my history books or in the news.
Hence I always assume, when given a power to somebody, that the person doesn't have my best interest in mind.
Let's all remember that it's possible for any of our countries to become a dictatorship one day. Just because we have enjoyed a lot of freedom for the last decades doesn't exempt us from acting like we could lose it at any moment. Because we definitely can.
More pragmatically, with powerful AI, giant communications nets, huge databases on everything and everybody, and cameras with facial detection and wiretapping everywhere, do you really want to add drones to the collection of what the powers that be can do?
Look at MRI imaging. It is a downstream invention that came from the development of nuclear weapons (nuclear magnetic resonance). How many lives do you think it has saved and improved in the past 70 years?
The resistance team sank a ship with 18 civilian casualties; the previous unsuccessful air raids killed more than that.
You wouldn't characterize the Allies in WW2 as good guys? No better or worse than Germany or Japan?
They fear that this current administration and future administrations may resort to, as Secretary Clinton put it, “droning” people we simply disagree with rather than actual military adversaries.
The main question is effectiveness of the system, given some baselines.
Other people don't ask if the ends justify the means.
Other people recommend first-strike nuclear attacks.
https://www.wsj.com/articles/the-legal-case-for-striking-nor...
"Full of"? The world is more peaceful than it's ever been. Extremists do hurt innocent people, and we should not ignore them. But with each choice that we need to make, we should carefully consider pros and cons. Is there really a net benefit here?
I agree with you that the world is much safer and there have been fewer deaths from military conflict post-WWII. I think that is largely due to the massive military power of a largely benevolent country like the U.S. I think if you magically removed the U.S. entirely from the picture, other nations would be thrown into conflict to be "top dog".
They most likely don't want to work for a company involved in any killing.
Probably.
I can't find the quote, but I was reading something about the troubles with the IRA, and the response, that really stuck with me. The author said something like, "When you use lethal force against terrorists it lets them feel that it's fair to use it against you."
We won't defeat violence by violence.
I think we need to challenge ourselves to become less bloodthirsty as we become more technologically capable. (If only to set a good example for our progeny... https://en.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Sc...)
Once you make the mental flip it becomes really easy to think up defense systems that work without causing any casualties or deaths at all, not our guys, nor civilians, nor the enemy. And if we just can't live with that, we can always kill them later: https://en.wikipedia.org/wiki/Saddam_Hussein#Execution
I can think of a few examples that prove that statement false. WW2?
Here's an article on that very thing. It didn't mention anything about a Saudi Prince though.
http://www.latimes.com/nation/nationnow/la-na-nn-bill-clinto...
But your points are valid. Like anything, it could be used for unobjectionable purposes. How morally objectionable do you think the whole of the use of this technology would be, given the country using it, etc.? Do you think it would help the US and allies become more or less authoritarian? How long until you think it would be given/sold to local police departments like other military equipment?
As far as your other points, I think they are all valid. In this country we still have the press to bring up rampant abuses, and other branches of the government to keep the executive in check. I suppose as long as the other branches are willing to keep each other in check, it's better for everyone, including US citizens.
You also have to remember that the people in the military aren't robots (not yet, at least). They are regular citizens joining a volunteer army for a multitude of reasons, many of which are to serve honorably, along with, of course, the steady paycheck and college benefits. Most of the military does a 4-year stint and then goes back to being civilians like you and me.
[0] https://www.washingtonpost.com/news/fact-checker/wp/2016/02/...
Guess how it's been used? It's been used to "target" everyone. Why? Because it's gotten cheap and easy enough to use on many more people at once - just how automated drone strikes will be soon.
I think you're naive if you think this will "improve" war conditions. Here's one story that may bring you back to reality, and about how these automated drone strikes are more likely to be used in the future:
https://www.middleeastmonitor.com/20180501-journalists-chall...
Do you think this guy was a victim of "inaccurate" drone attacks and "collateral damage"?
And some of them are parts of governments.
What you seem to be saying, and what we hear all the time, is that spending money and doing work that increases the power and capabilities of the US military will make us all better off.
You also seem to be saying that Osama bin Laden is an extremist.
But in the 1970s, as Afghanistan was working towards becoming a more secular society, the US military and intelligence agencies were arming Osama bin Laden and his fellow jihadis, the proto Taliban and proto Al Qaeda. Who wanted, among other things, for the secularization of Afghanistan to stop, and for an Islamic dominated government to come in. An effort which the US succeeded in, along with their partner Osama bin Laden.
This being the case, I am not exactly sure when bin Laden and people like him became extremists. I suppose it was after the US began its military occupation of Saudi Arabia. Osama bin Laden opposed the US military occupation of his country.
This may sound equivocal about bin Laden, but the US is more equivocal about bin Laden. I think he never should have been armed by the US. People of a like mind said as much then. Others disagreed.
In other words, if the US is making political errors (or is not making errors and is pursuing negative goals), more power and capability to carry out those erroneous policies will not help matters.
For example, Trump just escalated the conflict in the Middle East this week, which only satisfies religious fundamentalists. Handing him more power to do so will not help things, it will just mean more 9/11s in response to the blow he just landed against Arabs/Muslims.
I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD/US military has made virtually zero effort to even try to reduce civilian "casualties".
Maybe I am too cynical, but I genuinely think that the military is only willing to invest in technology that would help it expand current and future operations, disregarding the impact these will have on the civilian population of foreign territories... probably because it's orders of magnitude cheaper to just pay someone to write a public statement denying every statistic published by neutral NGOs around the world.
They are also told by a lawyer during those briefings, essentially: "If you break any of the Geneva Conventions, or outright any of the things we just told you, we will swiftly punish you."
????
That's not even close to true. On any level. You are confusing the inevitable willingness to tolerate civilian casualties with an outright disregard. Civilian casualties have consequences, and they are avoided. Not strictly, but it is a cynical fantasy to imagine that there is a complete disregard for civilians.
I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
I say fuck that.
It's hard to argue how effective this tactic is, given that: (a) almost everything relating to this is classified, and (b) it's very difficult to assess how many terror operations were prevented by those actions, even if you have the classified data above.
An amazing book on this subject of state-sponsored assassination that I advise anyone to read is Rise and Kill First, by Ronen Bergman, detailing Israel's assassination policy from operational, political, and societal perspectives. Truly fascinating.
Pakistan is a country with which no formal military conflict with the US exists, but it is not one in which no formal military conflict involving the US exists.
The 2001 AUMF is a (ludicrously open ended) conditional exercise of Congress' Constitutional power to declare war, and the parties targeted in Pakistan are parts of groups to which the executive branch has determined that the conditions in that act apply.
It's a complex issue that can't be summed up by saying "fuck that".
I'm trying to find this principle on their website, and it doesn't seem to be there in their "values" section.
Did they take it out?
If you are at G and thinking about whether you should resign or not, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
They'll find jobs quickly, that's for sure.
But "work pushing humanity forward"? There's precious little of that in any skill sector - not at FAANG salary levels, anyways. The vast bulk of the work that the vast majority of us do is simply about pushing the investor's balance sheets forward - not "humanity".
There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.
Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.
Ultimately, I don’t think this changes anything.
I don't know if my experience is typical or not but I do know that several times in my career I have encountered or been put into a position where it was clear that 'success' was tied to doing something which I felt was also wrong. I have always chosen not to set aside my principles for that success.
And still I know people who have made the immoral choice and reaped the rewards, and then they have used that success to step into places of higher influence or control. They would no doubt argue that they were in a much better place to do good now, because they chose to do something wrong once before.
It is not surprising that this conflict is the underpinning of many dramatic stories.
This is true if the engineer is in such a position of power within the company that he can effectively influence things towards the ethical goal. Most of typical for-profit corporation engineers who care about that ethical goal have no such position. If they strongly disagree with what they contribute to, the best decision for them and the ethical goal is to quit.
Only a small percent of engineers at google are AI experts. Sure, more people use it, but they probably just make a service call and get some magic results back.
As far as I can tell, this is the original: https://www.youtube.com/watch?v=9CO6M2HsoIA
Thinking that the military of your own country is "evil" seems a bit puerile to me. Watching too many movies and TV shows can have that effect.
"Every independent investigation of the strikes has found far more civilian casualties than administration officials admit. Gradually, it has become clear that when operators in Nevada fire missiles into remote tribal territories on the other side of the world, they often do not know who they are killing, but are making an imperfect best guess." [1]
"Leaked military documents reveal that the vast majority of people killed have not been the intended targets, with approximately 13% of deaths being the intended targets, 81% being other "militants", and 6% being civilians." [2]
"strikes have killed 3,852 people, 476 of them civilians. But those counts, based on news accounts and some on-the-ground interviews, are considered very rough estimates" [1]
Not only that, we bomb inside of countries that are (sort of?) our allies, without informing them and without their consent:
"Pakistan's Prime Minister, Nawaz Sharif, has repeatedly demanded an end to the strikes, stating: "The use of drones is not only a continual violation of our territorial integrity but also detrimental to our resolve and efforts at eliminating terrorism from our country" [2]
[1] https://www.nytimes.com/2015/04/24/world/asia/drone-strikes-... [2] https://en.wikipedia.org/wiki/Drone_strikes_in_Pakistan
Yeah, I have no problem calling our military evil.
Doing one evil thing doesn't necessarily make someone evil. Doing a few evil things doesn't necessarily make someone evil either.
If a doctor who has saved thousands kills one person, are they evil? What if that person was a convicted child rapist? What if it was in self defense? What if killing them would save a thousand more? What if killing them would save 10,000 more but the doctor doesn't care about that and would have killed them anyways?
Deciding whether a single person is good or evil is a very complex process. Deciding whether a country or a military is good or evil is enormously more complex and needs to take a lot more into account that you just did.
Arguably, it is precisely the inability to consider international conflict from a vantage point outside a nationalist worldview that is the root cause of all external wars.
The US military (As have all other militaries that have done anything of note) has done some ridiculously evil things. The drone strike program isn't the worst of them by any means, but it's not its brightest moment, either.
Since they were first developed: NATO partners: Belgium, Denmark, Netherlands, Norway. Other European countries as early as the mid-80s: Croatia, Greece, Italy, Poland, Portugal, Romania. Middle East since the mid/late 80s: Bahrain, Egypt, Israel, Iraq, Pakistan, Turkey, Jordan, UAE. Africa: Morocco. Asia: Indonesia, Singapore, South Korea, Taiwan, Thailand. South America: Chile, Venezuela.
How true this is in a specific case depends on what that country’s military is doing at the time; categorically dismissing the idea that a nation's military can reasonably be seen as evil by a citizen seems more than a bit naive to me.
At the very least, most people agree that killing in self defense is not evil.
The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
... This is a pandoras box that has already been opened I am afraid.
But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes he's on the ethical high ground) declines these types of jobs.
Someone is going to take the job. Perhaps someone less skilled than me, perhaps someone less ethical than me. What is the result of that?
As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.
But to each his own. We need to be able to sleep at night as well. And that, to me, also seems like a really good reason to decline.
I'm kind of on the fence about wanting to work in that industry.
Whenever I find myself in this line of thought, I always remember: I am not that special. The impact that anyone has on a workplace is small and mostly inconsequential--more important is group momentum and culture. The likelihood that you'd do bad in a bad workplace is much higher than that you'd be able to stand fast and do good--the world just does not work that way.
On that note, I jumped back off the fence. Not for me.
I would be curious to see the list of employees who quit for this purpose. They are making a statement. They might as well publicly disclose their identities to inspire more people.
Also, I wonder: are most of them financially independent enough to have made this decision? When money is not a worry, people have the freedom to truly align their external actions with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
Do you think USA should have no military?
Would you vote to dismantle it?
Of course, there would be no conscientious objectors if it were a liberal event