Because a minute is a huge amount of time. (And, by the way, minutes don't exist here; I'll take it that you mean 60 seconds.) There is also no such thing as "leap milliseconds", because UTC is literally just counting seconds. Basically, UT1 is already an implementation of what you call "leap milliseconds": not literally, but it achieves the same thing. And it is way too complicated to use in practice outside of astronomy-specific tasks.
So, to sum it up:
- TAI is a real thing, it has a concrete meaning, and it's "leap infinity".
- UT1 is a real thing, but it is unusable in practice; you could think of it as "leap ms".
- UTC, until yesterday, was a real thing with a concrete meaning: a timescale whose seconds equal TAI seconds, but which never drifts from UT1 by more than 0.9 s. Since today it's broken, and I'm not sure what it even means anymore; not in practice, I mean, but "platonically".
- Nobody has yet introduced a standard meaning "a timescale whose seconds equal TAI seconds, but which never drifts from UT1 by more than 59 s". I guess you could be the one to do it, but I'm not sure it would get wide adoption.
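To make the "UTC counts whole seconds" point concrete, here is a minimal sketch of how TAI − UTC behaves: it is always an integer number of seconds, because UTC only ever steps by full leap seconds (unlike UT1, which drifts continuously). The table below uses real historical values but is truncated, and the function name `tai_minus_utc` is just illustrative; the authoritative table is published in IERS Bulletin C.

```python
from datetime import datetime, timezone

# Truncated, illustrative leap-second table: from each UTC instant below
# onward, TAI - UTC equals the given whole number of seconds.
# (Real values; full history is in IERS Bulletin C.)
LEAP_TABLE = [
    (datetime(1999, 1, 1, tzinfo=timezone.utc), 32),
    (datetime(2006, 1, 1, tzinfo=timezone.utc), 33),
    (datetime(2009, 1, 1, tzinfo=timezone.utc), 34),
    (datetime(2012, 7, 1, tzinfo=timezone.utc), 35),
    (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
    (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
]

def tai_minus_utc(t: datetime) -> int:
    """TAI - UTC in whole seconds at UTC instant t.

    Always an integer, because UTC is adjusted only in full-second
    steps; there are no fractional "leap milliseconds" in UTC.
    """
    offset = None
    for step, value in LEAP_TABLE:
        if t >= step:
            offset = value
    if offset is None:
        raise ValueError("instant predates this truncated table")
    return offset

print(tai_minus_utc(datetime(2020, 6, 1, tzinfo=timezone.utc)))  # 37
```

Note that the function returns `int`, not `float`: that type is the whole point. A hypothetical "leap ms" scheme would need a fractional, continuously updated offset, which is exactly what UT1 (via the published DUT1 = UT1 − UTC value) already provides.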