How could you get a BS in CS without taking calculus courses? Which school did you go to?
(Although maybe this will change with machine learning.)
By the end of that summer of 1983, Richard had
completed his analysis of the behavior of the
router, and much to our surprise and amusement, he
presented his answer in the form of a set of partial
differential equations. To a physicist this may seem
natural, but to a computer designer, treating a set
of boolean circuits as a continuous, differentiable
system is a bit strange. Feynman's router equations
were in terms of variables representing continuous
quantities such as "the average number of 1 bits in
a message address." I was much more accustomed to
seeing analysis in terms of inductive proof and case
analysis than taking the derivative of "the number
of 1's" with respect to time. Our discrete analysis
said we needed seven buffers per chip; Feynman's
equations suggested that we only needed five. We
decided to play it safe and ignore Feynman.
The decision to ignore Feynman's analysis was made
in September, but by next spring we were up against
a wall. The chips that we had designed were slightly
too big to manufacture and the only way to solve the
problem was to cut the number of buffers per chip
back to five. Since Feynman's equations claimed we
could do this safely, his unconventional methods of
analysis started looking better and better to us. We
decided to go ahead and make the chips with the
smaller number of buffers.
Fortunately, he was right. When we put together the
chips the machine worked.
[1] http://longnow.org/essays/richard-feynman-connection-machine...

It's true that in a lot of cases, deeply understanding discrete numerical algorithms is a lot easier if you can analyze the continuous versions, which of course cannot be executed directly. But you can get really far with just the discrete versions, and you can understand useful things about the continuous versions without knowing what a derivative or an integral is.
And I don't just mean that you can use Unity or Pure Data to wire together pre-existing algorithms and get interesting results, although that's true too. You don't even need to understand any calculus to write a ray-tracer from scratch like http://canonical.org/~kragen/sw/aspmisc/my-very-first-raytra..., which is four pages of C.
You could maybe argue that it's using square roots, and calculating square roots efficiently requires using Newton's method or something more sophisticated. But Heron of Alexandria described "Newton's" method 2000 years ago, although he hadn't generalized it to finding zeroes of arbitrary analytic functions, perhaps because he didn't have concepts of zero or functions.
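Heron's iteration is simple enough to state in a few lines. This is a minimal sketch (not code from the linked raytracer): average your guess with n divided by your guess, and repeat. It happens to be exactly what Newton's method gives for f(x) = x² − n, but nothing about stating or intuitively justifying it requires calculus.

```python
# Heron's (Babylonian) method for square roots: if `guess` is too big,
# n / guess is too small, so their average is a better guess. Repeat.
# No derivatives needed to state or motivate this, even though it
# coincides with Newton's method applied to f(x) = x*x - n.
def heron_sqrt(n, tolerance=1e-12):
    guess = n if n > 1 else 1.0
    while abs(guess * guess - n) > tolerance:
        guess = (guess + n / guess) / 2.0
    return guess

print(heron_sqrt(2.0))  # ~1.4142135623
```

Convergence is quadratic, so even a crude starting guess settles down in a handful of iterations.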
You could argue that it's using the pow() function, but it's using it to take the 64th power of a dot product in order to get specular reflections. People were taking integer powers of things quite a long time ago.
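In fact an integer power like the 64th needs no pow(), logarithms, or exponentials at all: 64 = 2⁶, so six repeated squarings suffice. A hedged sketch of the general trick, exponentiation by squaring (this is a standard technique, not something the raytracer is claimed to do):

```python
# Exponentiation by squaring: compute x**n for a nonnegative integer n
# by scanning the bits of n. Each loop iteration squares the base and
# multiplies it into the result when the corresponding bit of n is set.
def int_pow(x, n):
    result = 1.0
    while n > 0:
        if n & 1:      # this bit of the exponent is set
            result *= x
        x *= x         # square the base for the next bit
        n >>= 1
    return result

print(int_pow(2.0, 10))  # 1024.0
```

For n = 64 only the top bit is set, so the loop is just six squarings and one multiply.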
Even using computers for really analytic things, like finding zeroes of arbitrary analytic functions, can be done with just a minimal, even intuitive, notion of continuity.
Alan Kay's favorite demo of using computers to build human-comprehensible models of things is to take a video of a falling ball and then make a discrete-time model of the ball's position. A continuous-time model really does require calculus, and famously this is one of the things calculus was invented for; a discrete-time model requires the finite difference operator (and maybe its sort-of inverse, the prefix sum). Mathematics for the Million starts out with finite difference operators in its first chapter or two. You don't even need to know how to multiply and divide to compute finite differences, although a little algebra will get you a lot farther with them. A deep understanding of the umbral calculus may be inspirational and broadening in this context, and may even help you debug your programs, but you can get by without it.
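The finite-difference version of the falling-ball demo fits in a few lines. This sketch uses synthetic positions sampled from y = 4.9t² as a stand-in for measurements read off video frames (the numbers are illustrative, not from Kay's actual demo): first differences give the per-frame velocity, and the second differences come out constant, which is the discrete signature of uniform acceleration.

```python
# Discrete-time model of a falling ball. Positions are sampled at
# dt = 0.1 s from y = 4.9 * t**2 (a stand-in for video-frame data).
dt = 0.1
positions = [4.9 * (k * dt) ** 2 for k in range(6)]

def diff(xs):
    # Finite difference operator: successive changes in a sequence.
    return [b - a for a, b in zip(xs, xs[1:])]

first = diff(positions)   # per-frame change: velocity * dt, roughly
second = diff(first)      # constant: g * dt**2 = 0.098
print(second)             # four values, all ~0.098
```

That constant second difference, g·dt², is all the "derivative" the discrete model needs, and summing the differences back up (a prefix sum) recovers the positions, which is the discrete shadow of integration.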
I agree that calculus is really powerful in extending the abilities of computers to model things, but I think you're overstating how fundamental it is.
Calculus does usually build some mathematical maturity for those who haven't encountered it. And it's useful as an introduction to sequences and series, and for anyone interested in numerical analysis or physics simulation (e.g., computational science, modeling, game engine development, etc.).
Not to mention having it is useful if you find that you'd rather do computer engineering or EE halfway through your undergrad career (though this last point is tangential at best).
I do wish linear algebra was a more commonly required course in CS programs.
I spent a year in a community college making up for what I should have learned in high school: basic math up through advanced algebra. Sure, I applied myself more, but the teachers, and even the textbooks, seemed better.
Once I learned the basics, it made math enjoyable, and I didn't fear courses that were heavy in math.
By the way, most medical doctors never sat in a calculus course. Here in the U.S., there have always been two tiers of physics courses: hard and easy. The easy physics courses don't require calculus; the hard ones do. Most med students took the easy courses and aced them. It's all about the GPA when trying to pretty yourself up for med school.
I worried way too much about grades in college. I look back and wish I had taken the courses I was interested in.
My interests are completely different as I've aged. It's tough in college because so much rides on getting into that certain graduate program or professional school, graduating, and getting a job.
Heck, even last year I talked to another university to look into electrical engineering, and only 1, ONE, class would transfer. All the others wouldn't count because, since they were a private school, their curriculum was different from most. That's not something they put in brochures.