Having read Singer, I'd say that the motivation is to take the sense of "doing good in the world" and apply reason to it. There's a kid drowning in a pond in front of you and there's a kid drowning on the other side of the world. Why do we act differently, Singer asks? From there he goes on to build an ethical case that it's our _obligation_ to give significantly more to charity.
I don't recall Singer ever advocating for earning ever more money, nor certainly doing so at the expense of others. And I'm fairly certain Singer would strongly object to deferring giving to some uncertain future point.
Setting aside this article, where's all the hate for giving to charity coming from? Guilty consciences? Shouldn't we as a society celebrate and encourage giving? It seems the alternative more often than not is to accumulate.
I agree with your interpretation of Singer, for what it's worth: I don't recall him ever encouraging maximum personal income in any of his books; only observing that someone could do more good (in the Utils sense) with more resources.
> Setting aside this article, where's all the hate for giving to charity coming from? Guilty consciences? Shouldn't we as a society celebrate and encourage giving? It seems the alternative more often than not is to accumulate.
I don't think anybody really hates charity. What people (rightfully) identify is the "hazard" of motive in charitable giving, particularly public giving: when someone is known publicly to donate, it becomes impossible to distinguish truly benevolent motives from self-interested ones (even if those self-interested motives don't "really" matter from a Utils perspective).
Separately: charitable giving on the scale performed by billionaires demonstrates latent injustice. Even if not intended as such, it effectively represents the conversion of a just action (giving to the poor is right) into a whimsical or motive-driven one (I give to the poor because I want to).
> when someone is known publicly to donate, it becomes impossible to distinguish truly benevolent motives from self-interested ones (even if those self-interested motives don't "really" matter from a Utils perspective).
Who cares why someone donates, as long as they donate? Is there something called "truly benevolent" donations that keeps more people alive longer, or has some other measurable impact? I'm actively having trouble parsing this. It seems like you realize how ridiculous this sounds by adding the bit at the end about how it doesn't matter, but then why even bring it up? What is special about "truly benevolent" giving?
It's not as though these motives don't "really" matter, they ... just don't matter at all?
2) As far as I understand it, EA actually is a very, very simple concept that ultimately states that 'doing good in some ways is better than others', aka some things make us 'feel good' but may not be effective. Literally, 'be effective'.
3) EA in practice can be a secular kind of moral signalling and personal brand washing. You can't use religion to be a huckster so much anymore (FYI, I support religion; it's just that it can be easily appropriated). Or rather, the 'My Pillow Guy' still does, but to a more limited audience. So 'EA' can be used for the same purposes among the white-collar, culturally secular masses. Which is how it was used at FTX.
Like anything, we should be thoughtful and skeptical about all of it: try to put things in context, don't accept wild claims at face value, and be perennially wary of people who just 'talk' about things as opposed to 'doing' them.
Narratives, idol-making, and lack of skepticism are the problem here.
Incidentally, there's an individual who brought evidence to Bloomberg about FTX months ago and they avoided it partly due to fear of lack of access - and - a conflict of interest with advertising. Such is the power of money with tentacles. Same thing for major geopolitical powers with money and leverage.
You're misreading it.
EA, on its own terms, reads something like what you describe. But many people do not believe that it plays out that way in the real world, seeing it instead as something like "prosperity gospel for agnostics": a thin rationalization for gobbling up as much as possible.
The hate isn't directed at "giving to charity"; it is directed at rich people advancing their own interests while calling it charity. (For instance, SBF convincing ProPublica to burn their reputation publishing weak, politicized studies for $5M.)
To try to mind read, if everyone's simply saying: "it's wrong for someone to do immoral things and justify them with the promise of future charity" then I suspect there's not much to say except, "Yes, duh."
People don't hate it when people give to charity.
People hate it when people earn money in ways that have negative side effects and justify it by promising to (at some point in the distant future) give (some portion of) their money to (select) charities.
Nobody hates giving to charity. Some people are skeptical about the utility of donating millions of dollars to an anti-skynet foundation founded by a guy whose primary accomplishment is writing a Harry Potter fanfic.
You can see the contempt he has for people in this report: https://www.gawker.com/money/sam-bankman-fried-is-so-stupid-...
Because, and this may not come as a shock after the SBF revelations, those people may not be sincere and honest, to others, themselves or both.
With something like EA the interesting part isn't what's being said out loud, it's what philosophic and in particular aesthetic goals aren't literally spelled out. This is the case for almost any form of activism, ideology or lifestyle. The interesting parts are never the banal things people advertise.
For example, take an adjacent belief, also practiced by the EA folks: veganism. Superficially they will tell you this is all about animal welfare, but what they won't tell you (they may not even be aware of it) is that it's also an upper-class signal of purity, their own modern recreation of the Brahmin class. When you look at things that way, it suddenly becomes less surprising why a 28-year-old effective-altruist Potterhead also seems weirdly fascinated with Indian caste systems and imperial Chinese social dynamics.
Really, the people you should be preaching Peter Singer to are those inside of EA who might be misguided by promises of fat stacks of cash. Criticizing atheists for not knowing "the true teachings of Jesus" really doesn't do you much good; it just makes you look defensive and responsibility-avoiding.
I'm confident that there are lots of people who are part of EA for the reasons the EA community advocates. My expectation is that the dollar-weighted average of participants' motivations would be more cynical. I think the parallel to religious participation kind of writes itself.
When Effective Altruism started gaining traction in the early 2010s, Earning to Give was a poster child of EA recommendations, hailed as a counterintuitive yet rational outcome of utilitarian analysis. It has long been considered mainstream in Effective Altruism and was advocated heavily by the organization 80,000 Hours, which provides career advice for people wanting to do good in the world.
It's not a bad thing to earn money in ethical ways and then also donate money to good causes. Unfortunately, one of the specific examples that gets the most attention is to go into high-frequency trading (instead of another career) so you can earn money to donate.
https://80000hours.org/career-reviews/trading-in-quantitativ...
Although 80,000 Hours no longer emphasizes Earning to Give, it has never taken public responsibility for the damage it has done by steering young people toward career decisions that prioritize Earning to Give and for broadly legitimizing Earning to Give in EA circles.
If this is the argument, then the criticism should focus on HFT, not on EA.
Most HFT employees aren't earning to give, and donate little to none of their income to effective charities. If HFT is harmful, then those non-EA HFT employees deserve at least as much criticism as EA HFT employees. So if you're worried about HFT, you should start with criticism of HFT, and if someone tries to defend HFT by bringing up EA, _then_ you can debate whether the benefits of earning-to-give outweigh the damage done by HFT.
But this article, and your comment, are doing the exact opposite -- you're _starting_ with criticism of the EA movement, and only mentioning HFT in passing! It comes across as looking for an excuse to criticize EA, rather than being genuinely concerned about harm caused by HFT.
The criticism here is of the claim that _earning to give_ is charitable, not a claim that HFT is charitable.
Earning to Give as a specific strategy for college-age students isn't terrible either. Of course one can pervert it by maximizing earnings at a tobacco company, but people aiming to do good would most likely steer away from these types of things anyway.
I think the reason 80,000 Hours moved away from the Earning to Give strategy was because of the numerous new (and better) opportunities they were able to find as they put in more research time and as the movement grew in its size.
I think ETG basically advocates for a kind of central planning, replacing a large bureaucracy with a class of donors who believe they have special knowledge about what would help the lives of others.
ETG encourages people to think of the positive side of what they're doing (donating money) without considering the negatives (harmful careers). Being rich, and focusing on getting rich as a goal, distorts your thinking.
My understanding of ETG is that people have the goal to earn as much money as possible so they can donate more. But if you take the idea of rationally deciding how to do the most good, of course you have to balance the good you do with your donations against the bad you do in your job. And you are the worst person to evaluate that rationally.
I'll admit it's tricky to figure out when this really matters. On the margins, it is clearly the case for plenty of people, maybe even most, that they can do more good by earning to give than by directly providing charitable services. But by earning to give, you're relying on and presupposing the existence of other people doing the direct service provision. If everyone becomes a stockbroker and no one becomes a nurse, the world does not end up better off. And the line where that happens is nebulous and not necessarily at the 100% mark, or even at a majority mark. A world with more stockbrokers and fewer nurses gradually gets worse.

I don't know where that point is, but maybe at least when a charitable organization reaches some threshold where the bottleneck to providing more service is labor rather than money? "Pay more" to solve the labor problem doesn't necessarily work, either, because if you pay your employees a high salary, then GiveWell will rank you as less efficient and all those charitable people will stop giving you money. This isn't true of all non-profits, of course. Employees of the University of Alabama Athletic Department and the New York Metropolitan Museum of Art can earn 7 and even 8 figures and still receive massive donations, but real charities usually can't do that.
To be clear, I'm sympathetic to the marginal focus. I don't vote. Why? Because obviously, one vote has never made a difference in any voting precinct I've ever lived in for any election. The vast majority of the time, a single vote will never make a difference in any election. But obviously, someone has to vote. And we're probably at a level of civic engagement in the US where the percentage of people who vote arguably leaves the country worse off, with an unrepresentative government and citizens who feel disconnected and ineffective and don't identify with or agree with the goals and mission of their nation.
For all those reasons, even though I don't vote, I'm not going to start a campaign trying to convince people in general that they also shouldn't vote. Earning to give seems in a similar category. Go ahead and do it, but it feels dangerous to advocate for it in a general way, implying that it is what all sufficiently intelligent and high-skill people should do. You're creating a class-divided society where anyone who directly cares for others is looked down on as low status and ineffective. This is unfortunately much harder to quantify than "how many malaria cases can I prevent in the next ten years," but we know at some point it becomes a real problem. Inability to quantify doesn't make it go away.
* Finance professionals who gave less to charity?
* Altruistic folks who gave to poorly validated charities?
* Changemakers who decided instead to go for the money and thereby reify the social structures of their lives?
* Regular people who thought less hard about ethics, e.g. giving vegetarianism or veganism a go?
* Regular people who never considered becoming megalomaniac crypto scammers?
And whether EA is good or bad as a whole depends on the characteristics and composition of this counterfactual group. IMO, that's pretty hard to get a good grip on.
In general, for hard questions like this, I take a "know them by their fruit" approach, i.e. seeing how folks do on easier-to-evaluate ethics tests, like whether someone is vegetarian or vegan. By that light, the movement comes out looking pretty good. If you apply a more anti-capitalist test, the movement does not look so good.
But let's say people do get rich and donate to charities to have a clean conscience... is this a bad thing? EA is not a reason to do outright immoral actions to donate to charity, but out-competing a colleague for a promotion? Leaving a job after 6 months to take double the salary elsewhere? These are all decisions I've personally taken for EA-based reasons, and I donate a fixed 10% per month. So yes, ultimately I've ended up richer, but I fund more charities, and that seems like a good thing.
Building ponzi schemes to redirect money from investors to charities is never okay.
Similarly, if SBF claimed to follow EA that would not be as big of an issue. The problem happens when you consider that he was also championed by EA as the poster boy, and now everyone is realizing he has been a scammer all along.
Moreover, the way I saw the value proposition of EA is that "we are better at doing math to calculate the effectiveness of charities, finding out which ones are scammy, and also to assess risks to humanity, so you should trust us to do that." That value proposition really blows up when top-line EA couldn't do their due diligence and figure out an eight billion dollar hole in the ground even when their whole existence depended on it.
You're really going to get mad at the EA movement for not realizing FTX was a fraud when literally, and I actually mean literally, no one knew until a couple of weeks ago? He bamboozled every government and financial institution in the world, but the humble EA movement should have known.
https://www.effectivealtruism.org/faqs-criticism-objections has a small paragraph on the differences. Having been involved with EA since 2017 in my own small way, I can say with absolute confidence it is not the same. Ends do not justify the means. It is simply meant to be a guiding philosophy for how you donate: treat all lives equally and donate focusing on lives saved per dollar. This should be backed by research proving its effectiveness, and can be unintuitive. There is a vocal longtermist viewpoint that lives in the future should be valued the same, resulting in some pretty weird ideas, but I ignore most of that.
Meh, that's a weird take. If it were, it would ask people to act only to maximize outcomes and ignore other constraints, treating them as secondary. As far as I'm aware, it doesn't; it only suggests that, if you want to do something altruistic, you might want to look at what's most effective and not just do anything that vaguely feels or sounds like it might be helpful.
Effective Altruism is a large movement; just because some behave a certain way doesn't mean all subscribe to those ideas.
The idea of doing bad to do some good seems to be a theoretical concept, but not likely to hold up in reality.
(It says nothing about the believers sacrificing for the greater good; that's a moral good in and of itself).
It feels like someone optimizing for "power"
To a first approximation, everyone on this site regularly participates in threads talking about how to maximize their compensation, without anyone feeling a particular need for a fig leaf.
If someone does hassle you about your justifications for your wealth, it seems really easy to write that off as jealousy.
I am sure that somewhere in the world, there is someone so simultaneously ambitious yet wracked with guilt about their ambition that they need an excuse to satisfy their ambition. I just doubt that there are enough such people that Effective Altruism can use such people as a foundation for its existence.
Is there anything bad about wanting to be wealthy or having power? I don't think so.
I think the actual problem is when you are already rich enough and still want more and more.
That is the cancer that is ruining our society.
It's like trying to defend against criticisms of religious fundamentalism by saying "What about the teachings of Love and Kindness by Jesus? Have you ever read the Bible?"
The literal words of the Bible don't matter as much when people who claim to follow it in earnest are also doing the most harm. You need to have a conversation internally first to figure out what your ideology is; otherwise it just looks like avoiding responsibility to the "outsiders".
You can't assess a philosophy by its worst adherents. However, you could attempt to quantify the net good/bad done by its adherents and make an argument about the effect the philosophy has on human personality. But then you'd be doing science and statistics in pursuit of understanding how to make humanity the best it can be, and you'd find yourself in trouble: You don't want to be associated with EA, but they keep emailing you to hear about your findings. :)
Earning to Give is to EA like Copyleft is to Free Software. It's on the more radical side of the philosophy, and it definitely leads to some internal contradictions. It is, at times, contentious within the movement. There are a lot of people who care more about harm minimization who may oppose Copyleft and Earning to Give, and there are people who are more focused on net-good maximization who may argue in favor of both. There's no getting around those arguments because there's no getting around complexity when it comes to broad new philosophies.
I can not possibly imagine an idea which has done more harm to humanity than that one.
Without getting into a flame war not fit for HN, the same cannot be said for the supernatural foundations and epistemic claims of religions - the foundational ideas of EA are not the problem in this case.
I think Sam Harris breaks it down decently here: https://www.samharris.org/podcasts/making-sense-episodes/303... though that's a little out of date with how fast news on the topic is moving.
It's also another reminder why it's good to keep your identity small [0] and deal with the ideas directly.
Said another way:
"Go three-quarters of the way from deontology to utilitarianism and then stop. You are now in the right place. Stay there at least until you have become a god."[1]
[0]: http://www.paulgraham.com/identity.html
[1]: https://twitter.com/ESYudkowsky/status/1497157447219232768?s...
"You have been quoted as saying: "Killing a defective infant is not morally equivalent to killing a person. Sometimes it is not wrong at all." Is that quote accurate?"
"I did write that, in the 1979 edition of Practical Ethics. Today the term “defective infant” is considered offensive, and I no longer use it, but it was standard usage then. The quote is misleading if read without an understanding of what I mean by the term “person” (which is discussed in Practical Ethics). I use the term "person" to refer to a being who is capable of anticipating the future, of having wants and desires for the future. As I have said in answer to the previous question, I think that it is generally a greater wrong to kill such a being than it is to kill a being that has no sense of existing over time. Newborn human babies have no sense of their own existence over time. So killing a newborn baby is never equivalent to killing a person, that is, a being who wants to go on living. That doesn’t mean that it is not almost always a terrible thing to do. It is, but that is because most infants are loved and cherished by their parents, and to kill an infant is usually to do a great wrong to her or his parents."
"Sometimes, perhaps because the baby has a serious disability, parents think it better that their newborn infant should die. Many doctors will accept their wishes, to the extent of not giving the baby life-supporting medical treatment. That will often ensure that the baby dies. My view is different from this, but only to the extent that if a decision is taken, by the parents and doctors, that it is better that a baby should die, I believe it should be possible to carry out that decision, not only by withholding or withdrawing life-support — which can lead to the baby dying slowly from dehydration or from an infection — but also by taking active steps to end the baby’s life swiftly and humanely."
- Own your name
- Excommunicate those who claim to be of your movement that you don't like
Otherwise you will lose control over the term you invented and it goes to hell. Witness Buddhism (Pure Land is basically anti-Buddhism), Christianity (where to even begin with all the splitting?), Islam (two big sects trying to murder each other), communism (Pol Pot, Mao, Stalin).
Protect your brand or see your name be dragged through the mud.
Is EA actually... bad? It would seem at worst it's not as good as advertised.
If 10% of the followers donate 10% of their income to charity, isn't that a massive win?
I'm not part of EA myself, like many I'm mildly annoyed by the arrogance but also feel like that's not enough to substantially criticise a movement.
It’s the “do well while doing good” philosophy that underlies EA that is quietly grinning in the shadows, eyes glowing, sick with greed. Doing well while doing good means turning a profit (or gaining/maintaining power, after the fact) by fixing important problems. The issue here is that you’ve got a principal-agent problem. Jeff Bezos has $150B and says he’ll give it all away in his lifetime. But he’s not going to give it away. He is going to invest it. You’ve got the Bezos Earth Fund, which is looking to bootstrap solutions to climate change. The end result of just one winner? Bezos's interest group has a huge foot on the thing that controls climate change. This is a lever, for them to pull as necessary. The same way Twitter is a lever for Musk.
EA is fine, but the problem is that in America “the best of us are by definition the richest” and so it’s pretty hard to engineer a situation where the rich don’t get richer much less agree to concede power to the next party.
Without EA he might still have some BS foundation like many before (I'm not actually taking a stance on his work, I have no idea about it). He might use one to push his weight around and have an outsized influence.
Maybe without the label of EA there would be more pressure to justify the scope and impact of charitable work? I'm not sure.
It's not as if you get a bunch of free stuff the moment you say I am doing something for EA purposes.
Any activity can be described this way. "Working in a soup kitchen?" "I bet you are doing this because it makes you feel superior to all the others who do not volunteer, right? And also, you feel less guilty seeing homeless people now; after all, you are already doing something, so there is no reason to feel obligated to them as well."
I am sure that there are many entirely sincere EA enthusiasts, wanting nothing more than to see their money used to better the lives of others. Spending time organizing, fundraising, optimizing charity, doing outreach: some of it might even be "hard work", done gladly without receiving any praise or compensation. On the other hand, a friend of mine is regularly getting up at 2am to drag the injured, freezing, decrepit, and mentally ill into a place where they can get some semblance of help. To me it is clear who is actually more deserving of respect and who does more to improve the lives of people.
If everything you have is money, all problems look like they are solved by spending.
What disturbs me is crypto scammers and the uninformed spouting off about “AI alignment” buying influence within the government, with stolen money
Also people who are either dishonest or comically aware of conflicts of interest, pretending they’re above it because they’re “rational”.
E.g., the AI alignment stuff is mostly funded by corporations and is part of their marketing arm.
I think it's normal charity that should be considered conscience laundering. You give money to the Red Cross and feel good about yourself, ignoring the possibility that the money is embezzled or squandered without helping anyone. It's people who don't care where the charity money goes, or its long-term consequences when it does at least reach someone in need, who should be criticized.
The article doesn't accuse them of doing anything bad exactly, but it's kind of... in the background? And even if the article writer didn't intend that, a lot of people, when they discuss this type of thing do intend it. That being rich is somehow bad, or means you did something bad, or reflects poorly on your moral fiber. Which I want to point out is a huge assumption and should be drawn out and pinned to the ground as something worthy of discussion.
My own (more extreme) take is that not only have none of them done anything to feel bad about, but that most of them are living hugely net social positive lives and would have been even if they never gave so much as a dime to charity.
If your philosophy tries to contradict that, you need to figure out why your philosophy is wrong, not double down on counterintuitive reasoning as to why it is right.
If you can't see why and don't have your eyes open as to why this is obvious, I can't help you.
Humans really are afraid of being punished; they've all been kids at some point, and the fear of being bad is deeply ingrained.
In practice, the more important and beneficial part is to be seen as good by others. If someone can convince enough people, they will have little doubts about themselves.
As a rule, I consider any advertisement of virtue as suspicious.
How you got rich is important, because damages as well contributions must be accounted for.
The notion of 'the ends justify the means' probably just means you end up being a douchebag instead of contributing anything meaningful.
It seems a little hypocritical to use him as ethical cover without adhering more closely to his specific ethical prescriptions (which don't, to my knowledge, include Internet Rationalist stuff like AGI and colonizing the galaxy).
If you got pushback, then I guess it must be controversial what "EA" refers to at the moment. I hope it shakes out in a sensible way.
I feel like there's a disconnect between your perception of the EA movement and mine. It could totally be mine that's off! I know that a lot of the EA movement has been slowly shifting focus to longtermism and X-risk. Is this your main criticism of the EA movement today? Is it your sense that it's no longer about actually supporting charities à la GiveWell, but rather this other stuff, and that's what you dislike about it?
(Genuinely curious here, since you seem to be super anti EA and I don't understand why.)
On the Overwhelming Importance Of Shaping the Far Future
If you are right, I am confident that, at best, there's a small comment about how it might be the case that in some situations, particular people could do more good with resources, and not (as you put it!) that "wealthy people in developed countries deserved aid more than poor people".
The headline seems to imply that gaining wealth is inherently a slimy thing to do. I just don't buy it. And if those people choose to donate some of that wealth, that seems strictly better than just being wealthy on its own.
Maybe it's just so difficult for people to get ahead these days that they start to focus more on what other people have and become bitter?
What a rude and erroneous way to summarize a complex movement comprised of so many different activities and points of view.
"Effective" by what standards? "Altruism" according to who? One person's altruism is another's abuse. Safe Injection Sites for drug addicts are a good example, on one hand they are "altruistic" in trying to prevent deaths of drug addicts and hopefully steer some toward rehab programs. On the other hand they bring crowds of junkies to whatever neighborhood they're in, likely making life more dangerous for any residents/businesses.
Is the use of eminent domain to build public infrastructure "effective altruism" when it bulldozes people's homes for a new road?
Most of the time calling something "effective altruism" is just the time-honored tradition of projecting a veneer of righteousness onto an action, and sometimes an attempt to shut down counterarguments that are pointing out negative impacts.
It's just the new, trendier "making the world a better place".
What if two EAs making the same amount give the same 10% to two effective but opposing charities (say one donates to a charity promoting veganism and the other donates to a charity that provides free meals, including meat, to starving populations), is their altruism still effective or does it cancel out?
Like I said, it's "making the world a better place" 2.0. Better for whom? With what trade-offs? Well that seems to be up to the EA in question, with vague guiding principles of "be good, honest and focus on helping as many neglected people as possible" (https://www.effectivealtruism.org/articles/introduction-to-e...). Who decides who's being neglected? Can a group of people, say a subsistence farming community be considered "neglected" if they're satisfied with their lives, even if they're technically living in extreme poverty?
I could go on in detail, but reading that site's "values" is largely just a collection of left-wing-ish tropes where it's assumed the reader already knows what otherwise vague terms mean and agrees with their definitions. Which is ironic given Value 3:
"Rather than starting with a commitment to a certain cause, community or approach, it’s important to consider many different ways to help and seek to find the best ones. This means putting serious time into deliberation and reflection on one’s beliefs, being constantly open and curious for new evidence and arguments, and being ready to change one’s views quite radically."
Even the prescribed 10%+ of income donated to charity: so what? That's no different from a tithe. Only here you get to choose the church's values, assuming you pick anything in the general direction of "good". Ultimately it suffers from the same problems as other attempts to replace religion with secular humanism. The belief system is so abstract that, despite all the words it and its advocates spend describing it, it doesn't really stand for anything more sophisticated than the lyrics to the Power Rangers Wild Force intro: https://www.youtube.com/watch?v=8Y3Ib0YNFaQ
It was my own country that severed the last ties that the Western world had with medieval society's focus on the afterlife and overruled the Bible on economics. To quote Madonna we live in a material world.
That being said, EA has clearly gone completely off the rails, and FTX should be the final nail in the coffin. It's become a way for nerdy people to do stats and thought experiments and pretend they're doing something altruistic. EA will never be net positive given the massive damage done by SBF and FTX.
Can someone explain to me what happened here? The last I knew of EA was from a JRE podcast in 2017 with MacAskill, where his ideas were to spend money where it had the greatest impact and with the greatest efficiency while presenting the least friction to give, and I recall these ideas and the charity being supported by thinkers like Sam Harris and others. Now I'm reading about connections with SBF and Future Now, and you'd think MacAskill had gone full dark side. MacAskill's wiki says he wrote a book about longtermism, which seems fine on its face, but I've only read negative things about it here on HN.
Without knowing more, it seems to me like people are being reactionary. What's the scoop?
What I suspect might be happening is that "do gooders" (as research has shown) are often attacked by others because those that do good are a "threat" to the status quo. Rather than living your life as you do, when you encounter a vegetarian, you are reminded that your meat consumption causes suffering in animals. When you encounter an EA (Effective Altruist) giving 10% of their income to charity, you are reminded about how you too could do the same but are choosing not to. So for many people, it is easier to find some shred of hypocrisy or fault in those doing the good and thus have the clean conscience to ignore the good aims.
In the case of SBF, if he essentially stole from a bunch of retail investors and then went about doing altruism (this is not yet verified, of course), did he ultimately create a net positive? That only works if the investors in FTX did not lose out, which they did.
Anything that goes into the 'ends justifies the means' territory ends up badly. Sometimes immediately, sometimes over time, but always.
He did donate and fund many people through his FTX Foundation and through donations via Giving What We Can. I know this because 1) people were funded with real money, even if some who have not yet received money are of course screwed, and 2) I did some volunteer data analysis work for some EA orgs and saw the figures: SBF donated.
In the case of EA, we now have exactly one instance where someone engaged in this ends-justify-the-means behavior in a morally problematic way, and this is for some reason supposed to show that EA is irredeemably flawed. This seems a bit absurd.
“Ya. Hehe. I [SBF] had to be. It’s what reputations are made of, to some extent. I feel bad for those guys who get fucked by it, by this dumb game we woke westerners play where we say all the right shibboleths and so everyone likes us.”
https://www.gawker.com/money/sam-bankman-fried-is-so-stupid-...
EA doesn't support this. It's absolutely not okay to do immoral things to make money, and even things like earning to give via legitimate quant roles were, and are, pretty hotly debated.
It's basically a license to be a charming scoundrel. Everything good you do can be balanced by some vice of yours, and that's how we get powerful people abusing their employees and other relations.
The MO is not new, you might have heard of the Sacklers donating to art. Should it buy them any sympathy for what they've done? Doubt it.
If you want to do good things, do good things. As soon as you do a good thing and get publicity for it, I'm going to think you're buying publicity. I think there's an episode of Curb Your Enthusiasm about this.
If you want to turn money into publicity, then giving locally to high profile causes is optimal. EA discourages both in favor of more global, more impactful causes.
Consider learning more https://www.effectivealtruism.org/
This is why Jesus said:
> Therefore when thou doest thine alms, do not sound a trumpet before thee, as the hypocrites do in the synagogues and in the streets, that they may have glory of men. Verily I say unto you, They have their reward.
> based on […] some quick reading I’ve done on the topic
and is dominated by a quote from Hitchhiker’s Guide.
There are much better considered critiques of effective altruism than this article.
The good delusion: has effective altruism broken bad?
https://www.economist.com/1843/2022/11/15/the-good-delusion-...
Edit: this article is discussed here https://news.ycombinator.com/item?id=33618156
I wrote a long piece about the features Twitter needs, and I'd say that's a nice long article. This one is definitely just a quick note, but hopefully with an insightful connection that resonates with the HN crowd.
Long before the internet gave us the means to make up language and spread the most nonsensical ideas to millions of people across the globe in milliseconds, people, some of whom became wealthy, acted altruistically.
One does not need wealth to practice altruism. Many people do it instinctively.
To suggest or even imply that altruism not backed by wealth is insignificant or even less significant is not a novel idea, nor is it non-obvious. It is, of course, deeply cynical. We can expect such unoriginal "thinking" from those entranced by the so-called "tech" industry.
Quoted from https://ssir.org/articles/entry/the_elitist_philanthropy_of_...
How could anyone take an article that includes a sentence like that seriously?
It's written like a paper and I quoted from "the abstract".
To understand the conclusion in the abstract, you obviously need to read the paper.
"But", they say, "that's got to be the most valuable way of knowing what to do, basically by definition". No, you're just doing that thing again: you're measuring your measuring sticks using the same system, and then measuring your way of measuring measuring sticks using the same system again, ad infinitum.
"But isn't that better? Your moral system must be worse than that." Yeah, mine's worse _in your system_. But it's better _in mine_, and to me you seem like... well... a self-absorbed buffoon who found a way to fetishize utilitarianism and wealth-maximization and reason themselves out of any imperative to feel empathy or consideration for, like, the people or communities around you.