I've seen things some people wouldn't believe. mod_perl 1.4 on Apache 2.0 handling 350,000 hps dynamic traffic on twenty archaic 1U's. Five tiers of caching, serialized objects on local disks, NFS in production. C-beams glittering in the dark near the Tannhäuser gate. Who cares about inefficiency if it scales?
CS is about far more than runtime complexity, but complexity is one thing that can bite you.
Also, all the components you mentioned were likely written, at least in significant part, by people knowledgeable about computer science.
Say your app is using bubble sort. Egads!!! What a shit design. Clearly this is going to become a nightmare in real-world scenarios. But wait: bubble sort theoretically gets to O(n) if the list is already nearly sorted. How can we achieve this? By monkeying with the dataset, queueing and batching operations, invalidating operations that take too long, artificially limiting the number of requests per second, or just passing the request to a completely different app depending on the use case. It sounds insane, but if you can do any of these things faster than redesigning your app, so that it can continue performing under load, that's an example of "scaling" despite the application's poor performance.
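The "nearly sorted" escape hatch is just bubble sort's early-exit best case: a pass that performs no swaps proves the list is sorted, so nearly-sorted input finishes in one or two linear passes. A minimal Python sketch (the `bubble_sort` helper and its pass counter are illustrative, not from the thread):

```python
def bubble_sort(items):
    """Bubble sort with an early exit: a pass that makes no swaps
    means the list is sorted, so nearly-sorted input finishes in
    O(n) instead of the worst-case O(n^2)."""
    items = list(items)  # work on a copy
    n = len(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # nothing moved: the list is sorted
            break
    return items, passes

# Already-sorted input: one linear pass and we're done.
_, p = bubble_sort(range(1000))
print(p)  # 1

# Reversed input: the quadratic worst case.
_, p = bubble_sort(range(1000, 0, -1))
print(p)  # 1000
```

This is why keeping the data stream close to sorted (batching, pre-filtering, capping request rates) can paper over the quadratic worst case without touching the algorithm itself.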
Another example is the "scalability" of application development. Say you have an application that is basically O(1), but one day you find a bug in it. Even after you write and commit the one-line fix, if deploying it to production takes between four hours and two days, or the validation process takes a month, you still have a bug in production for hours, days, or weeks. "Scaling" the development process can significantly reduce the amount of time needed to solve problems, or complete new features. It can be more beneficial to be able to ship code faster and more reliably, even if it isn't the most efficient code.
And that's a good thing? Part of my point is that this could have been avoided, or at least made less likely to be necessary, if the proper basics had been learned.
Wouldn't you wish the author had just known how to properly sort a list[1] in the first place?
"Scaling" the development process can significantly reduce the amount of time needed to solve problems, or complete new features. It can be more beneficial to be able to ship code faster and more reliably, even if it isn't the most efficient code
Agreed, but as I said, computer science is not just about runtime complexity. Knowing computer science might help you avoid those situations, or resolve them more quickly.
I know that shitty stuff can work, even work "well enough", but going back to my original comment: maybe it could work much better, simpler, and more profitably (if that's your favorite metric) with just the application of some basics, if they are known.
[1] Noting that "sorting a list" is a stand-in for all sorts of tasks that benefit from CS-knowledge.