At least start with a rigorous, well-specified, robust, high-performance, complete implementation of Common Lisp, if you're going to recurse infinitely.
The issue is either:
* whatever the parser is doing is already trivially tail recursive, but is expressed in C in a way that consumes stack anyway. In this case it's trivial, if perhaps tedious, to convert it to a form that doesn't use unbounded resources.
* whatever the parser is doing uses the intermediate results of the recursion, and hence is not trivially tail recursive. In this case any reformulation of the algorithm using different language constructs will continue to use unbounded resources, just heap instead of stack.
Either way it is not the fault of recursion.
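The first case above can be sketched in C (names hypothetical -- count_digits is a stand-in for whatever the parser is doing): a tail-recursive function where the recursive call is the last action, mechanically rewritten as a loop so it runs in constant stack space.

```c
/* Sketch only.  The recursive form is already tail recursive, but a C
   compiler is not required to turn the tail call into a jump, so it
   may still consume stack per call. */
#include <assert.h>
#include <stddef.h>

/* Tail-recursive form: the recursive call is the last thing done. */
static size_t count_digits_rec(const char *s, size_t n) {
    if (*s < '0' || *s > '9')
        return n;
    return count_digits_rec(s + 1, n + 1);  /* tail call */
}

/* The same algorithm with the tail call rewritten as a loop:
   identical result, constant stack space. */
static size_t count_digits_loop(const char *s) {
    size_t n = 0;
    while (*s >= '0' && *s <= '9') {
        s++;
        n++;
    }
    return n;
}
```

The rewrite is purely mechanical: the accumulator parameter becomes a local variable, and the tail call becomes the next loop iteration.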
The whole point of the article is about the dangers of unbounded recursion, because of its resulting unbounded memory allocation, no matter at what level or by whom that recursion is implemented.
Yes, the unbounded memory allocation is the fault of recursion, even if you're implementing the recursion yourself with dynamic memory allocation, via malloc, std::vector, continuations dynamically allocated with CONS, or whatever. It's still recursion, still unbounded dynamic memory allocation, and still dangerous.
If the ALGORITHM requires recursion that can't be optimized away as tail recursion, then it requires memory allocation. You HAVE to store the state SOMEWHERE. You can't take the recursion and unbounded memory allocation requirement out of the algorithm by using a different language, compiler, or writing it in assembly language. It's still a recursive ALGORITHM no matter how you or the compiler choose to implement it.
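To make that concrete, here's a sketch in C (tree_sum is a hypothetical example): summing a binary tree needs the intermediate state of the traversal, so it is not tail recursive. An explicit malloc'd stack moves that state from the call stack to the heap, but the amount of state the algorithm requires is unchanged -- it still grows with the shape of the input.

```c
/* Sketch only: a non-tail-recursive tree sum rewritten with an
   explicit heap-allocated stack.  The state the algorithm needs is
   stored SOMEWHERE either way -- here it's on the heap. */
#include <assert.h>
#include <stdlib.h>

struct node {
    int value;
    struct node *left, *right;
};

static long tree_sum(const struct node *root) {
    size_t cap = 16, top = 0;
    const struct node **stack = malloc(cap * sizeof *stack);
    long sum = 0;
    if (!stack)
        abort();
    if (root)
        stack[top++] = root;
    while (top > 0) {
        const struct node *n = stack[--top];
        sum += n->value;
        if (top + 2 > cap) {                 /* grow the heap "stack" */
            cap *= 2;
            stack = realloc(stack, cap * sizeof *stack);
            if (!stack)
                abort();
        }
        if (n->left)  stack[top++] = n->left;
        if (n->right) stack[top++] = n->right;
    }
    free(stack);
    return sum;
}
```

A deep enough input tree exhausts the heap here exactly as surely as native recursion would exhaust the call stack.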
It's so ironic that I have to explain this so many times, because you keep recursing every time, trying to blame the problem on something other than recursion, when it's coming from the ALGORITHM, not the IMPLEMENTATION or the programmer's ability, and it's an essential part of that algorithm.
You're just playing a recursive game of whack-a-mole, trying to erroneously shift responsibility away from recursion onto something else. It's like kiting checks between bank accounts to pay off one credit card bill with another, just to defer paying the first bill. And you can't shoot somebody and get away with it by declaring "people don't kill people, guns kill people". Recursion kills -- the title of the article!
Yes, I know you can keep recursing infinitely, trying to blame something other than recursion each time, but the only time you're going to bottom out and stop recursing is when you realize that recursive algorithms that can't be simplified to tail calls always require memory allocation, and unless you explicitly bound them with some limit, they require unbounded memory allocation. There is STILL no such thing as a free lunch, and continuations are not magic unicorns.
Continuation passing is NOT some magic bullet that can automatically optimize any program so it doesn't need to allocate memory for non-tail recursion. You STILL have to allocate the continuations, they don't come from nowhere, they consume memory too. So triumphantly bringing it up as if it solves the problem or indemnifies recursion from blame is a non sequitur.
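A minimal sketch in C of that point (fact_cps and struct kont are hypothetical names): factorial with its continuations reified as heap-allocated frames. The recursion is gone from the C call stack, but every non-tail step still allocates a continuation frame, so the memory consumed is proportional to the recursion depth, exactly as before.

```c
/* Sketch only: the continuation of each step of factorial ("multiply
   the result by n") is reified as a heap-allocated frame.  One frame
   per step: the continuations cost memory proportional to the
   recursion depth, just like the implicit call stack would. */
#include <assert.h>
#include <stdlib.h>

struct kont {
    unsigned long n;      /* the pending "multiply by n" step */
    struct kont *next;    /* the rest of the continuation chain */
};

static unsigned long fact_cps(unsigned long n) {
    struct kont *k = NULL;

    /* "Recurse": allocate one continuation frame per step. */
    while (n > 1) {
        struct kont *f = malloc(sizeof *f);
        if (!f)
            abort();
        f->n = n;
        f->next = k;
        k = f;
        n--;
    }

    /* Apply the continuation chain, freeing frames as we go. */
    unsigned long acc = 1;
    while (k) {
        struct kont *f = k;
        acc *= f->n;
        k = f->next;
        free(f);
    }
    return acc;
}
```

The allocation hasn't disappeared; it has only moved from the call stack into the chain of `malloc`'d frames.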
It's no comfort to the user that the program they're running exploded because the programmer chose to use malloc or explicit continuation passing, instead of the built-in compiler support for recursion. Recursion is still to blame, and it comes from the ALGORITHM not the IMPLEMENTATION.
When an algorithm requires x bytes of memory for a given input, it shouldn't matter whether a given implementation of that algorithm uses x bytes of implicit stack from recursion, x bytes of heap allocated with malloc, or something else. Unfortunately, in various poor programming language implementations this does make a big difference, as in the case described in the article. The problem is neither recursion nor the algorithm; the problem is the poor implementation of recursion in their C compiler, and as such it can be avoided by expressing the same algorithm without that C recursion (which is trivial). Even in the unlikely event that such an expression exhausted available memory, it would not be a security issue.
So every recursion in a language that supports programmer-managed state and memory allocation can be transformed into a tail recursion. That covers any recursion in all serious programming languages, and most joke programming languages too.