Then by that definition AI training has no goal; it's simply a process defined by calculations. But whether you want to call them goals or not, the fact remains that they look very, very much like goals. "If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck."
> It's plausible that AI "goals" emerge evolutionarily as well
AI training is vaguely similar to evolution, except more efficient and more directed.