Let's set aside for the moment whether it's even needed, whether devs will just write code that doesn't require much math anyway, and so on, as that's a different topic. I want to focus on whether autodidacts reach the same level of CS knowledge as university graduates (with good grades).
My CS education did have a lot of difficult math that I enjoyed, but in a running-a-marathon way, not in an eat-a-piece-of-cake way. Having regular lectures, homework, and topics laid out by experts in a logical order matters a great deal for people like me: it leaves no gaps and gives a well-rounded overview of CS fundamentals. Designing your own curriculum is also often how you end up a crank (see "autodidact physicists").
And it's not just calculus and analysis, but also graph theory, complexity theory, abstract algebra, linear algebra, complex numbers, formal languages and automata, information theory, the theory behind compression and encryption algorithms, signal processing (e.g. Fourier analysis), control theory, optimization and machine learning, and various other things you pick up here and there, such as Petri nets, quaternions, etc.
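To make one item on that list concrete, here is a minimal sketch of my own (not from any particular curriculum, and the task names are purely illustrative) showing how graph theory turns up in everyday tooling: build systems and package managers topologically sort a dependency graph before running anything.

```python
from collections import deque

def topo_sort(deps):
    """Order tasks so that every dependency runs first (Kahn's algorithm).

    `deps` maps each task to the set of tasks it depends on; every task
    must appear as a key. Raises ValueError if the graph has a cycle.
    """
    # Count unresolved dependencies and build a reverse index (task -> dependents).
    indegree = {task: len(d) for task, d in deps.items()}
    dependents = {task: [] for task in deps}
    for task, d in deps.items():
        for dep in d:
            dependents[dep].append(task)

    ready = deque(task for task, n in indegree.items() if n == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in dependents[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(deps):
        raise ValueError("cycle in dependency graph")
    return order

# A tiny hypothetical build graph: compile before link, link before package.
print(topo_sort({"compile": set(), "link": {"compile"}, "package": {"link"}}))
# -> ['compile', 'link', 'package']
```

Knowing why this works, and why it cannot work when the graph has a cycle, is exactly the kind of detail a structured curriculum drills into you.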
Besides, barring extraordinary life circumstances, why not get admitted to a university anyway, if you're so enthusiastic about all this theory that you'd learn it to good-grade level purely on your own motivation? In Germany, for example, university is free, and if you don't think the lectures give you value you can skip them entirely, study in your preferred autodidactic fashion, and just take the exams.
My default assumption, unless convinced otherwise by evidence, is that self-taught devs can put together functioning code and have familiarity with CS terminology but only a vague folk understanding of the details.
Most people who argue that university is superfluous are those who could not pass the exams due to a lack of interest and/or talent (or who live in a country with tuition fees they cannot afford, such as the US).