My goal in life is not to maximize financial return; it's to maximize my impact on the things I care about. I try to stay comfortable enough financially to have the luxury of making the decisions that let me keep doing those things when opportunities come along.
Deciding whether something new is the right path for me usually takes a little time: I need to assess where it's headed and what the impact might be.
In the vast majority of cases, financial returns help maximize your impact on the things you care about. Arguably, in most cases it's more effective for you to provide the financing and direction but not be directly involved. That's why the EA guys are off being quants.
The only real exceptions are things that specifically require you personally, like investing time with your family, or developing yourself in some way.
I've not found this to be true at all, for a variety of reasons. One of my moral principles is that extreme wealth accumulation by any individual is ultimately harmful to society, even for those who start with altruistic values. Money is power, and power corrupts.
Also, the further from my immediate circle I focus my impact, the less certain I am that it's achieving what I want it to. I've worked on global projects, and looking back, those are the projects I'm least certain moved the needle in the direction I wanted. Not because they didn't achieve their goals, but because I'm not sure the goals set at the outset actually had the long-term impact I wanted them to have. In fact, it's often due to precisely what we're talking about in this thread: sometimes new things come along and change everything.
The butterfly effect is just as real with altruism as it is with anything else.
People don't become quants because they are EAs; they become EAs to justify to themselves why they became quants.
Your first paragraph is just a standard response to utilitarianism, although a poor one because it doesn't consider expected value (EV).
Nonetheless I'm not quite sure why merely mentioning EA draws out all these irrelevant replies about it. It was incidental, not an endorsement of EA.
The EA guys aren't the final word on ethics or a fulfilling life.
Ursula K. Le Guin wrote that one might, rather than seeking to always better one's life, instead seek to share the burden others are holding.
Making a bunch of money to turn around and spend on mosquito nets might seem to be making the world better, but on the other hand it also normalizes and enshrines the systems of oppression and injustice that created a world where someone can make $300,000 a year typing "that didn't work, try again" into Claude while someone else watches another family member die of malaria because they couldn't afford meds.
Yes, there are flaws in the system, but smugly opting out of it and declaring yourself morally superior isn't helpful. Instead, you need to actually do the work of understanding the system, its virtues and its flaws, before you can propose changes that would actually improve things.
So, the things that matter the most for most people?
Studies pretty consistently show that happiness plateaus at relatively modest levels of wealth.
Or in prison for fraud.