I think it boils down to the (intellectual) effort needed to synthesise your thoughts, which is a skill that leads to deeper understanding.
I'd say ChatGPT is a bit better as a tool than copying from a colleague, if only because you still have to double-check its output for correctness. However, does it rob students of the opportunity to think deeply and therefore learn? I'm not sure to what extent, but I'd imagine there's an impact to _some_ extent.
ChatGPT is also going to get better, and will eventually be pretty accurate (or at least as accurate as a student of the subject would be). My stance is that in school, you should be learning to think rather than to memorise. If the goal is memorisation, however, then using ChatGPT isn't that harmful (you'll still fail if you memorise the wrong information).
Overall, I'd treat ChatGPT as a tool and give students the facts: first gain a deep understanding of a subject so you can learn to verify, then use the tool later in life.
I'm also mindful that as long as an AI isn't guaranteed to be correct, which is the case right now, students have an incentive to _understand_ the subject so they can verify its output, which might actually be a net good?