At least in my university, MS students were usually paying full tuition, with the understanding that they'd take classes and leave for a (presumably) high-paying job.
On the other hand, PhD students were usually paying no tuition at all -- instead, after getting their masters-equivalent, they were expected to do research for a few years and thereby "pay back" by advancing the science.
(Of course, the fun fact was that one could drop out halfway through the PhD program and still get an MS degree... but this did not happen very often.)
Example: in the Intro to OS course I'm about to finish, we started with 700+ students and are now down to about 340. The projects are typical schoolhouse stuff for systems programming: C programs that manage memory, use sockets, IPC constructs, pthreads, RPC libraries, etc. One of the guys on the class Slack just posted that when testing and debugging the projects, he just printed stuff to stdout... for the whole semester... because he doesn't know, and didn't bother to learn, how to use a debugger.
Lol that's like the best way to debug C
Obviously not every school has the same demands as others, and there are certainly a lot of degree farms that take foreign students' money and funnel them through, but I'm not sure the author's points ring very true... certainly not for everyone.
Correlation isn't causation and all that, but it does bias you a certain way.
Which is to say, you don't need a degree to study/learn things.
Though that credential is helpful in opening doors to employment when you're young, it again isn't needed later in your career, when your history of experience, abilities, and work can do the same...
The sad thing is that large parts of the curriculum aren't that valuable. Data structures used to be my favorite, but today it's not that important because we stick everything in hash tables or database tables. We rarely use LISTS!
The same goes for compilers. No one writes a compiler anymore. Apple just repurposed LLVM when they made Swift. But all the undergraduates have to pull their hair out making toy compilers, and to what end?
Most of the CS curriculum is pretty extraneous. This is why many companies are deliberately hiring technically competent people from fields like physics or chemistry. They learn practical skills to analyze their data -- the kind of practical skills corporations need, not theory heads.
I heard someone else say that 30+ years ago you were more likely to be implementing a sorting algorithm, or other things often studied, in your day-to-day job. Most people's jobs have changed to integrating predefined APIs -- a completely different skill set.
However, as a counterpoint, I wish I had pursued a degree. I've looked over friends' notes from their undergrad and graduate classes and wished I had the time to do what they did. Not just out of personal interest, but to gain a practical understanding of what those magic APIs are doing.
Similar to a "hello world" of a framework, in practice things get messier. You often end up having to compare frameworks to pick one, or play around with a few to judge which is best to continue with. Having a rough idea of how something was implemented, and knowing the pros and cons of those choices, is hugely beneficial. After choosing, you only write the code once, but spend the rest of the time rewriting, debugging, and optimizing. Having those CS fundamentals helps you identify not only when the built-in solution is insufficient, but also what better options might exist.
Ignoring all of this, I'm sure you could fill a career with writing one-and-done CRUD apps.
I think if you want to forge new roads in programming, a little theoretical study doesn't hurt, so why not explore a university degree?
I also do not enjoy the endless debates around self-taught versus schooling. I've seen real-life anecdotes from both viewpoints and find that it's too subjective for anyone to declare anything from either side. I personally don't recommend people jump out and grab a CS degree, but there are some good programs out there.
I'm not debating whether anyone should study this. I'm just saying that people who want to get jobs shouldn't need to study it. If you want to do the theoretical stuff, have at it. But it's failed my team as often as it has helped. If the theoretical model doesn't match the problem exactly, you can get the WRONG answer, as that NP-complete-obsessed dude did when he didn't look for a heuristic.
(Also, Apple didn’t exactly repurpose LLVM for Swift. LLVM is just the backend, and Swift has quite a sophisticated frontend with its own optimization passes and intermediate language. It is safe to say that a lot of compiler theory and programming language design went into writing it. As another example, the compiler for Go was basically written from scratch and does not use something like LLVM for code generation.)
I do agree that learning CS theory is no substitute for actually writing software, and doesn’t make you a good software developer by itself. And there are probably plenty of excellent software developers who couldn’t recall what a Turing machine is. That said, a good knowledge of CS fundamentals is indispensable for some types of software development.
However, CS exposed me to a lot of different concepts and algorithms, and a lot of math. I obtained an associate degree before getting my BS in CS, and I did not unlock the key to learning until the CS degree. I feel like I can quickly pick up different things now. I was able to build a 5 ft tall retaining wall, submit plans to the city, and then execute the project. I can literally perform almost all jobs on my car, except machining engine blocks, as I lack the equipment. I completely remodeled my house, learning different building codes.
I feel a BS in CS gives someone a huge advantage: they have proven they can accomplish something, and they should also have unlocked the approach to learning that works for them.
I am not sure an MS in CS is worth it, though. I thought about it myself but would probably go with an MBA instead.
Also, to varying degrees, every justification for doing a CS degree has smelled like an attempt at rationalizing a poor decision.
I think this is because our product is not related to the web in any way, and the only UIs we deal with are quick and ugly ones for internal use.
If your goal is "simply" to be a software developer (who usually uses existing APIs), then no, a CS education is not needed. No need to study fancy theoretical stuff.
Just learn something practical like Java/SQL/Python/etc. And thanks to the internet, you don't even have to wait to be enrolled in a university. Any high school kid can do that.
>> Most of the CS curriculum is pretty extraneous
It's science, after all. Can we say the same thing about physics or biology? OK, computer science may not be a science in the "natural science" sense; see it instead as a mix of math and engineering.
Still, the goal of a CS department is not to produce practical programmers. At least that's what the lecturers said on my first day on campus, many years ago -- although, no doubt, most CS alumni work as programmers. Yes, I'm aware that understanding theory is one thing and writing software is another. That's why we also had software engineering and project management classes.
CS certainly matters in software design and implementation today, especially with AI, distributed systems, complex/large data, cybersecurity, etc.
Personally, though, I never learned, or was capable of applying, the relevant CS until I faced these real-world issues later in my career, in more advanced work.
So I do agree that in college I wasn't ready for that level of detail, which I couldn't apply in any practical way to life or early career development.