I don't think that my logic allows us to conclude that. The idea of mathematics is that it is possible for humans to create and apply a system whose correct application makes the genesis of errors, howsoever subtle or undetectable, impossible.
Given this, and the likelihood that a mathematical error (of the conceptual type "any convergent sequence of continuous functions converges to a continuous function", rather than of the computational type "1 + 1 = 1") will not be found, it is especially important that practitioners of mathematics know how to apply their tools correctly, which they probably will not have learned by cheating in school; and, if they are able to apply those tools correctly, then they will not create errors.
(I grant that the weasel word 'correct' and its derivatives risk making this argument circular. I grant also that human mathematicians collectively make an awful lot of errors, although I hope that we make fewer professional errors than many other professionals.)
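(For anyone curious why the quoted statement is an error: pointwise convergence does not preserve continuity, though uniform convergence does. The standard counterexample, sketched here, is the sequence $f_n(x) = x^n$ on $[0,1]$:

```latex
% Each f_n(x) = x^n is continuous on [0,1], but the pointwise limit
% is discontinuous at x = 1:
\[
f_n(x) = x^n \quad \text{on } [0,1],
\qquad
\lim_{n\to\infty} f_n(x) =
\begin{cases}
0, & 0 \le x < 1,\\
1, & x = 1.
\end{cases}
\]
```

So every $f_n$ is continuous, yet the limit function jumps at $x = 1$.)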
> However, I wasn't really talking about people wanting to become actual mathematicians - those probably wouldn't use Wolfram just because they actually love crunching those numbers manually.
This comment suggests to me the source of our disagreement in the first paragraph. As a mathematician, I don't crunch numbers professionally, and, when I have to do so outside of my profession, I certainly don't love crunching them manually. I suspect that most mathematicians are in the same boat.