Adding more nested loops is bad, but it's not exponential. It's polynomial.
As you nest more and more loops, the big-O complexity goes from N to N^2 (quadratic) to N^3 (cubic) to N^4, and so on. N^(any fixed number) is polynomial. Exponential would be 2^N or 3^N, or any constant raised to the N.
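To make the distinction concrete, here's a small sketch (plain Node.js, the numbers are just illustrative) comparing a fixed polynomial, n^3, against an exponential, 2^n:

```javascript
// Compare a fixed polynomial (n^3) against an exponential (2^n).
// Any exponential eventually overtakes any fixed-degree polynomial.
for (const n of [1, 5, 10, 15, 20]) {
  console.log(`n=${n}: n^3=${n ** 3}, 2^n=${2 ** n}`)
}
```

At n=10 the two are neck and neck (1000 vs 1024); by n=20 the exponential is already more than 100x larger.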
  let defines = 0
  let tests = 0
  let increments = 0

  defines++
  for (let i = 0; i < 5; i++) {
    tests++
    increments++
  }

  console.log(`defines: ${defines}, tests: ${tests}, increments: ${increments}`)
For the sake of space I'm not going to copy nested versions of that, but imagine nesting it two and then three levels deep. 1 level will increment 5 (5^1) times
2 levels will increment 25 (5^2) times
3 levels will increment 125 (5^3) times
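That pattern can be checked without writing out the literal nesting. A small recursive sketch (my own helper name, `innerRuns`) simulates `depth` nested 5-iteration loops and counts how many times the innermost body runs:

```javascript
// Count how many times the innermost body runs for a given nesting depth,
// using recursion instead of literally writing out the nested loops.
// Each simulated loop runs 5 iterations.
function innerRuns (depth) {
  if (depth === 0) return 1 // no loops left: the body runs once
  let count = 0
  for (let i = 0; i < 5; i++) {
    count += innerRuns(depth - 1)
  }
  return count
}

console.log(innerRuns(1)) // 5
console.log(innerRuns(2)) // 25
console.log(innerRuns(3)) // 125
```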
More interesting are the numbers of definitions and tests:

  levels  tests  defines
  1       5      1
  2       30     6
  3       155    31
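Those numbers can be reproduced with a recursive tally (my own sketch, following the counting model of the snippet above: each loop statement is one define, and each iteration is one test and one increment; the final, failing loop test is not counted):

```javascript
// Tally defines, tests (iterations entered), and increments for nested
// loops of a given depth, matching the counting model above.
function tally (depth) {
  const t = { defines: 0, tests: 0, increments: 0 }
  const run = (level) => {
    if (level === 0) return
    t.defines++ // the `let i = 0` of this loop
    for (let i = 0; i < 5; i++) {
      t.tests++
      t.increments++
      run(level - 1) // the next loop in, if any
    }
  }
  run(depth)
  return t
}

console.log(tally(1)) // { defines: 1, tests: 5, increments: 5 }
console.log(tally(2)) // { defines: 6, tests: 30, increments: 30 }
console.log(tally(3)) // { defines: 31, tests: 155, increments: 155 }
```

Note that under this model tests and increments are equal; defines grow as 1 + 5 + 25 + ... (one per loop statement reached).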
But what I don't know is how the interpreter works and whether it has tricks to shrink those numbers; those just reflect my naive mental model of how things work.

N^2 is polynomial.
2^N is exponential.
https://stackoverflow.com/questions/4317414/polynomial-time-...
If you have a sequence of programs with polynomial runtimes that grow as N, N^2, N^3, N^4, ..., that's a textbook example of exponential growth: the runtime is exponential in the number of nesting levels. It is not generated by running one program on increasingly large inputs, so there is no single program to put in the exponential runtime complexity class, but it's exponential nonetheless.
No, it isn't. N, N^2, N^3, N^4, ... is polynomial, not exponential. Exponential would be X^N. Look at the graph on the Wikipedia page I linked to.
1000^depth is not a constant: the function here is "f(depth) = number of times the inner loop body executes". f(depth) = O(1000^depth).
And, "times" here is serving the same role as "comparisons" in "Mergesort takes O(n log n) comparisons": it's referring to the thing that f is actually counting.
And usually when you encounter O(some constant), it's meant as "of the order of magnitude of", i.e. somewhere between some constant/10 and some constant * 10. That isn't the definition used here, but seems to be the cause of most complaints about asymptotic analysis being misapplied when no asymptotic analysis was being done in the first place.