Agency is overrated. The AI does not have to be an agent; it just needs
a degenerate form of 2): a selection process. Any kind of bias
creates goals, not the other way around. The only truly goal-free thinking system is a random number generator - everything else has goals, you just don't know what they are.
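A toy sketch of that claim (my own illustration, not anything from the linked pages): the only thing written into the system below is a pairwise preference, a bias in which of two variants survives. No objective function, no agent, no plan. Yet it behaves exactly as if it had the goal "maximize x".

```python
import random

def select(a, b):
    # The entire "intent" of the system: a pairwise bias, not a stated goal.
    return a if a > b else b

x = 0.0
for _ in range(10_000):
    candidate = x + random.gauss(0, 1)  # undirected random variation
    x = select(x, candidate)            # biased retention

# x drifts steadily upward: from the outside this is indistinguishable
# from a system pursuing the goal "make x large", even though no such
# goal appears anywhere in the code.
print(x)
```

Swap in a different bias and a different "goal" appears; remove the bias entirely (return a random one of the two) and you get the pure random walk, the one genuinely goal-free case.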
See also: https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...
See also: evolution - the OG case of a strong optimizer that is not an agent. Arguably, the "goals" of evolution are the null case, the most fundamental ones. And if your environment is human civilization, it's easy to see that money and compute are as fundamental as calories, so even a near-random process should be able to fixate on them too.