Corporations need profit to survive because tomorrow's costs must be paid out of today's surplus.
There is one factor that heavily influences (perhaps even controls) how people act to achieve that goal, and sometimes even twists or adds goals: is the corporation publicly quoted on the stock market, or is it private?
Look at how Valve (the company behind Steam) behaves: it's private and more ideological. Compare that with many publicly quoted companies, whose CEOs often sacrifice their own corporation's long-term survival for the benefit of short-term profiteering and some hedge fund manager's bonus.
Both need profit to survive, but the publicly quoted company is much more extreme.
When people say corporations only look to profit, what they really mean is that publicly quoted corporations will do everything possible to maximise short-term profit at any cost. Is there a CEO who cares about the long term? Either he will be convinced to change or kicked out. It's almost impossible to resist these pressures in a publicly quoted company. It's just how Wall Street works, and if that doesn't change, neither will corporations.
The people running the world of finance, and their culture, are what cause enshittification and push a zero-sum game to its extremes.
While public companies are more likely to be short-term focused, even that is not universally true. There are plenty (i.e. thousands) of executives and public companies that are long-term focused and tell investors to pound sand and sell the stock (or mount a shareholder challenge) if they don't like it.
Elon Musk is the most extreme example of this. He wants to go to Mars. He is turning Tesla into a robot company and discontinuing or curtailing the growth of some of his most profitable products.
Mark Zuckerberg is another one. He is losing $20 billion a year on VR, and even with recent cuts, will still be doing that. He's spending $50 billion on AI. None of that has anything to do with short term profit. Don't like it? Sell the stock.
Wall Street doesn't necessarily force companies into short-term gains: it holds you to the performance you said you would deliver. This is often the trap that leads to poor management decisions: executives overpromise and underdeliver, leading to the enshittification spiral.
All of this depends on the governance structure and ownership structure, and how competitive the business is.
Many public companies, for example, sell only common shares to the public while a family or an individual retains a separate share class with more voting power. This is how Zuck, Larry Page, Larry Ellison, etc. can do whatever they do. Elon just has a reality distortion field, so the board gives him a trillion-dollar pay package.
How come the board hasn't eliminated him?
Most boards defer to management on most topics, and most shareholders do not vote on anything substantial: they proxy vote, which defers to management. Thus management nearly always does whatever it wants, as long as the company isn't a dumpster fire of losses. It usually takes a shareholder activist threatening a hostile takeover or a proxy battle to change this dynamic.
It comes back to people. The people (employees, management, board of directors, shareholders) determine what a company does and how it acts. "Numbers go up" isn't always the motivating factor, and I'd wager that the majority of privately held corporations (i.e. small businesses) are fine with "numbers go up modestly" because they are lifestyle businesses, not growth businesses.
I hate that, by the way, but what I hate even more is that this is somehow the most effective way to run economies that we've found so far, and it ends up this way because instead of unsuccessfully trying to safeguard against greed and sociopathy, it weaponizes them outright.
Companies exist to create customers. Everything else follows from that. There is no value, no profit, no growth, no action whether moral or immoral, unless you have a customer.
Market incentives by themselves don't push management decisions towards immorality, unless you've created immoral (or amoral) customers, or you've accepted capital from immoral (or amoral) investors.
It always comes back to people. If your customers or investors are some level of evil (or some degree of amoral), then you as a corporation probably are going to wind up being some level of evil or amoral.
It's up to management and majority ownership to steer those as appropriate... or are you willing to take money from anyone? There's a useful but dangerous veil of ignorance that arises with scale and ubiquity, such as in commodity or public equity/debt markets. The resulting anonymity requires diligence from the company, such as Know Your Customer (KYC) checks and clear statements of the corporation's principles in its prospectus to attract the right fit of investor... and a backstop of government regulation to encourage or require these minimum standards of behaviour.
But if most people in a society find something "wrong", generally they will organize to prevent it (even if it has value for part of the society). I think it is simpler for everybody if economics (how and what we produce) is separated from morals (how we decide what is right and wrong).
The way we organize in a society is by having governments, usually elected ones, to represent what "most people in a society" actually think and to serve as an arbiter of applied morals in our interactions, including business. To that end, we codify most of them in laws with clear definitions to prevent things like unfettered monopolies, corporate espionage, poor working conditions and hiring practices, etc. This generally works, though it depends on how well a given government and its constituent parts do their jobs, and whether they use the power they have to serve the entire society's interests or the interests of the elites that drive decisions. We can see it failing in real time right now, for example.
Morals don't have to be evaluated "objectively" (whatever that is) every time to be observed. Humanity has agreed on many things that make up the UDHR, international law, and other related documents. That's not the hard part. Making independent actors conduct their business in accordance with these codes is the hard part. Somehow, even making them follow their own self-imposed principles is crazy hard. When Amodei claims Anthropic develops Claude for the benefit of all humanity but greenlights its use for surveillance of non-Americans, that's scummy. When Amodei claims to be terrified of authoritarian regimes gaining access to powerful AI but seeks investment from them, that's scummy. The deal with Palantir, the mass-surveillance business, is scummy. Framing the use of autonomous weapons as disagreeable only insofar as the underlying capabilities aren't reliable enough is scummy. You don't need a PhD in ethics to notice that.