Models don't learn when you tell them something; the model doesn't update itself. A human updates their mental model when you explain how something works to them, and that is the main way we teach humans. Models don't update themselves when we explain how something works to them, because that isn't how these models are trained, so the model isn't learning, it's just evaluating. It would be great if we could train models that way, but we can't.
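A minimal sketch of that distinction, assuming PyTorch purely for illustration: at inference time a forward pass leaves the weights untouched, and only an explicit training step changes them.

```python
import torch
import torch.nn as nn

# A tiny model standing in for an LLM.
model = nn.Linear(4, 2)
x = torch.randn(1, 4)

before = model.weight.clone()

# "Explaining" something to the model at inference time is just a
# forward pass: no gradients, no weight update.
with torch.no_grad():
    _ = model(x)
assert torch.equal(model.weight, before)  # nothing was learned

# Learning only happens in a training step:
# forward, loss, backward, optimizer update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss = model(x).sum()
loss.backward()
optimizer.step()
assert not torch.equal(model.weight, before)  # now the weights changed
```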
> Humans learn vast amounts of information from examples.
Yes, but to understand things in school, those examples come with an explanation of what is happening. That explanation is critical.
For example, a human can learn to perform legal chess moves in minutes. You tell them the rules each piece has to follow, and then they will make legal moves in almost every case. You don't do it by showing them millions of chess boards and moves; all you have to do is explain the rules, and the human then knows how to play chess. We can't teach AI models that way, which makes human learning and machine learning fundamentally different still.
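To make the contrast concrete, here is a hypothetical sketch (the representation and function name are my own): a piece's rule can simply be stated directly, the way you'd explain it to a person, with no training data involved.

```python
# The knight's rule, written down directly rather than
# learned from millions of example games.
KNIGHT_OFFSETS = [
    (1, 2), (2, 1), (2, -1), (1, -2),
    (-1, -2), (-2, -1), (-2, 1), (-1, 2),
]

def knight_moves(file: int, rank: int) -> list[tuple[int, int]]:
    """All legal knight destinations from (file, rank) on an empty 8x8 board."""
    return [
        (file + df, rank + dr)
        for df, dr in KNIGHT_OFFSETS
        if 0 <= file + df < 8 and 0 <= rank + dr < 8
    ]

print(knight_moves(0, 0))  # from a corner: only [(1, 2), (2, 1)]
```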
And you can see how teaching rules creates a more robust understanding than just showing millions of examples.