That wasn't the case before coronavirus and won't be the case afterwards.
The best robot teacher that exists is Duolingo, and I don't know a single person who has learned a language to a conversational level that way.
As for why this wasn't the case before coronavirus: there was no reason to. The market wasn't big enough. Now it is, so perhaps some ideas from this situation will stick around for longer.
(Then again, I've heard numerous reports that remote school classes are, in general, a huge dumpster fire. I don't expect any of that to stick around once schools are allowed to reopen. But maybe the postmortems will yield some nuggets of insight about effective remote schooling.)
Kids grow, and every person's body is different: even players with 10+ years of experience who have reached the most advanced levels still ask their teachers and fellow musicians to "debug" their posture, because subtle changes can have an outsize impact. This is why master classes exist, for instance.
There are all kinds of tools in this debugging toolkit. Some of them are rather tactile, such as feeling how much force is being transferred from the hand to the tip of the bow, feeling how "soft" the bow-hand is being held, feeling how much the bow arm's elbow is being allowed to sink under its own weight, or how much the shoulder is being tensed - because muscle tension can be a huge issue for string players. Not to mention the huge number of minute adjustments that can be made to various angles and might involve tweaks to shoulder rests, chin rests and so on.
Right now, teachers have to do this debugging without being able to use their hands. A tall order for anyone. Just imagine being asked to debug software without being able to touch the system or poke at it: no attaching a debugger, no inserting print statements. All you can do is give verbal inputs and observe outputs. And it's a huge responsibility, because failing to debug, or getting it wrong, can have serious consequences (life-long problems with pain, for example).
Some teachers are much better than others at such "debugging", so yes, from that perspective it sure would be nifty to invent some technology that guarantees more consistent results. But is it realistic? Every body is physically different, and human touch and expertise play a major role.
So, I can't help but think it would actually be a simpler problem to create an AI that can reliably debug software than one that can reliably debug a musician's physical relationship to their instrument.
As for debugging in general, I agree, and offer an analogy that might be intimately familiar to everyone here: debugging your parent's/neighbour's/friend's computer over the phone. I can handle it for about two minutes before my head hurts and I start boiling with anger. These days, I flat-out refuse to help this way; if it can't wait, I only explain how to install TeamViewer and which numbers to read me over the phone.