I think the right balance needs to be struck between the academic and vocational views. Big-O, grammars, automata, etc. are the fodder of academic papers, not the Real World. But when you talk about higher ed, here's the rub: it's supposed to be the theoretical foundation, not purely vocational preparation.
The danger of the more vocational point of view is that you're supporting what I see as the slide from "skilled software engineering" to "coding". (Think the difference between having John Carmack on your team, versus some guy making $8/hr in a developing country.) This is hyperbole, but if you focus on the day-to-day elements of any job, then you're advocating for movement to a technical school curriculum (which, right or not, has a different level of career momentum, responsibility, etc.).
CS (using the term inclusively, for the whole genre) is maybe unique in that it has very vocational components but also very intellectual/academic ones. I think, like any career, there are going to be Things You Don't Know coming out of school. Interviewing, for example. Is that in ANY university-level curriculum, for ANY major? Are you expected to be 100% "operational" in a particular job immediately after your degree is awarded? Why should it be any different for CS? And since when are all CS jobs the same, anyway?
This is what bothers me about this argument: CS is not equal to programming, and not equal to a (particular) job.
And moreover (and I think most here would agree), the greatest hallmark of a great "technical thinker" (programmer, academic, problem-solver, tester, DB admin, whatever) is their willingness, nay INTEREST, in pursuing the details of their craft beyond the structure of a class or a job or (god forbid) an employee handbook.
I don't want to work with someone who goes through the motions. I don't want to work with someone who comes out of college thinking they're prepared for their capital-C Career. I want a lifelong learner, and someone who wants to get into the guts of operations and make an impact.
Educate and train for THAT.
I have personally needed all of those things in the Real World.
I suspect that if more people remembered the stuff they learned in college, it would get a lot more use. "I'm never going to use this in the real world!" can be a terribly self-fulfilling prophecy.
You're right, though. Which goes back to my question around what should be part of a formal education? I bet HN could come up with 10 years worth of courses that are "essential", plus another 10 that are "nice to haves".
A CS degree, or any formal education, has little to do with what kind of programmer you are going to be. You can take a completely vocational approach in school and move on to other work individually. Or the other way around. You will end up being the type of programmer you wish to be regardless of CS. Case in point, John Carmack. And in the other direction, the thousands of CS grads who every year simply churn out "whatever works" once they are in the field and forget about all they were supposed to be learning. I would encourage students to be less critical about what it is they are actually learning in school, because it's what you do after those 2-4 years that will really matter.
But we're talking about "formal education": how do we develop strong technical thinkers?
And to my mind, that's the correct order: I'd rather teach someone the practical components on the job than the theoretical ones. (Not to mention that practical components—even programming languages—go out of date quickly. Theory has a much longer shelf life.)
I hire CS majors (and more broadly, engineers) into business positions because of the core skillset: problem solving and a desire to figure out a solution.
I'm a little surprised schools didn't touch on complexity already; that's been around for a while. Yet it's the biggest category in the responses, so apparently many don't.
I'd also say that things like deployment, sysadmining -- browser incompatibilities? These are a lot more role-specific and better learned in situ. If the guy who knows browser incompatibilities needs to know how to do deployment, you are either (a) a pre-funding startup or (b) spreading your talent kinda thin.
I'm sure most of the responses here will say something like "I have never let my schooling interfere with my education" -- which is all well and good for you. But if we want more and better candidates, it couldn't hurt to adopt some of the best-of-the-best practices (version control!) into CS curricula.
Edit: I should note that SE and CS are separate departments at RIT; there is a real difference in curriculum, and that difference is not what this survey is getting at. A major in the IT department is what's closest to this survey, not SE. Somewhat anecdotal because it's one university, though.
But the reasons for testing weren't really emphasised or even discussed.
(And a first-year course where we were required to use jUnit in the assignments, although again it was incidental and not at all a focus.)
I took a "Software Testing Theory" class in the 2nd year of Software Engineering, then every course after that we were using jUnit (or whatever) for unit testing, and writing Software Test Plans and executing them on everything we did.
It was great.
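jUnit is Java-specific, but the xUnit pattern it established carries over to most languages. As a hedged sketch of what a "write tests for everything you do" habit looks like, here it is in Python's built-in unittest; the function under test, median, is a made-up example, not something from the course described above:

```python
import unittest

def median(values):
    """Return the median of a non-empty list of numbers."""
    if not values:
        raise ValueError("median of empty list")
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class MedianTest(unittest.TestCase):
    # Each test documents one behaviour -- the *reasons* for testing,
    # not just the mechanics of the framework.
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length_averages_middle_pair(self):
        self.assertEqual(median([1, 2, 3, 4]), 2.5)

    def test_empty_input_rejected(self):
        with self.assertRaises(ValueError):
            median([])

if __name__ == "__main__":
    unittest.main(exit=False)  # run the tests when executed directly
```

The structure maps one-to-one onto jUnit: a test class, one method per behaviour, and assertions that double as documentation of the contract.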
The vast majority of students in my program honestly don't give a flying fuck about the academic opinion that CS shouldn't behave like a vocational school - the intent is to go into software development, and CS is the only offering that touches on it. Yes, we're all capable of learning various things on our own (including the theoretical aspects of CS, as instructors in these fields are typically so incredibly incompetent that students are required to teach themselves, anyway), but that's not a justification for providing an education that is largely irrelevant.
If I really wanted to study theory and computational mathematics... I would have studied computational mathematics.
Were it not for the perceived value of a CS degree, I suspect a substantial number of students wouldn't even bother, and would probably flock to Coursera, etc., instead.
And you're forgetting companies like Google, Amazon, etc. who expect candidates to know CS theory pretty thoroughly. What kind of school would be proud of grads who couldn't get into one of these top companies? Software engineering is also easier to pick up on the job than in school, and vice versa for the theory.
It's condescending to describe CS theory as "largely irrelevant", but this also sounds like a case of sour grapes. There are plenty of tech curriculums that are light on CS theory--try looking for "Informatics", "IT", "Information systems" and so on.
The vast majority of students in your program are unqualified to have an opinion. There's no point in having a university teach you version control systems if you're just going to end up learning them anyway. It's basically impossible to make students good software engineers (which has nothing to do with version control systems or other trivial things like that) in an academic setting. That's only something which comes with experience.
But that is the key phrase, "on the job". I know I personally didn't go to school to fit all these check boxes. It seems like a poor choice to me that you and your classmates would (1) enter into a CS program without much interest in CS, then (2) blame the field of study for not being interesting to you. I for one am glad I got a CS degree (for the CS degree's sake) and thankful for what I was exposed to in those years. These other check boxes? Sure they are valuable. That's what my first job was for.
Yes, well maybe they should get on with it then.
The beauty of CS is that the theory is directly applicable to the practice of software engineering. And because of this close connection between code and theory, you can develop significant coding skills just from doing the problem sets. As per the usual academic refrain, you get out of it what you put in.
If you think they should just be teaching you industry skills then you're wasting your time and money, because college will never be as instructive as taking an internship in an actual company shipping code. Not only that, but you're paying for the privilege instead of being paid. Even if they rolled over and decided to go this route, academia does not have the knowledge or experience of what it takes to be successful in industry, much less the ability to impart that knowledge to you.
If you're going to go to college, take advantage of the academic strengths: the deep knowledge, the curiosity that goes beyond the immediate problem, the great minds you have at your daily disposal.
On the other hand, if your CS program sucks then it sucks and I'm sorry.
I'm currently trying to convince my organization to use version control, and because nobody has ever used it before, they just see it as a pain and more overhead than it's worth. To be perfectly honest, I felt exactly the same way before I was forced to use it for a 2nd year programming course.
Of course, since then, I refuse to not use it, even for my side projects where I'm the only contributor.
One of the big values of college is having a lot of other people around you learning the same things at the same time. This can be helpful when learning about subject matter and about tools like version control.
For instance, many programming exercises in various subjects taught at my university give the students a basic framework with missing functions, or a test harness, or some other piece of code that needs to be finished. These could have been shared via a version control system, and delivery could be made that way as well (using pull requests, branches, or some other feature).
That way, students would be encouraged to use version control while working on the exercises.
The way it actually is done here, though, is basically "learn version control yourself and justify your choice of version control system in your report" when doing group projects.
- Object-oriented programming
- Basic calculus
- Basic discrete mathematics
- Linear algebra
- Algorithms, data structures and asymptotic analysis
- Basic compiler design
- Theory of computation, Turing machines and formal languages
- Basic practical computing technology, from the gate level
- Basic software engineering: SE methodology, version control and developing software in a team
- Cryptography, ciphers and data security
- Overview of the role of operating systems
- Networking, encapsulation and the TCP/IP stack
- Basic web technology, server-side programming and client-side scripting
Anything outside of this, I've learned on the job or in my spare time. This includes AJAX and the use of heavier web technology. To my intuition, this describes an undergrad degree in computer science more than a software engineering degree. But it'd be nice to hear what others think.
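The "asymptotic analysis" entry in the list above is the one that pays off most directly in day-to-day work. A small illustration (mine, not from the curriculum above) of why the theory matters in practice: membership tests on a Python list are O(n), while on a set they are O(1) on average, and the difference shows up as wall-clock time at even modest sizes.

```python
import timeit

n = 10_000
haystack_list = list(range(n))
haystack_set = set(haystack_list)
needle = n - 1  # worst case for the list: the last element

# O(n) scan: Python walks the whole list looking for the element.
list_time = timeit.timeit(lambda: needle in haystack_list, number=1000)

# O(1) average: the set hashes the needle and jumps straight to its bucket.
set_time = timeit.timeit(lambda: needle in haystack_set, number=1000)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
assert set_time < list_time  # the asymptotics show up as real time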
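The "asymptotic analysis" entry in the list above is the one that pays off most directly in day-to-day work. A small illustration (mine, not from the curriculum above) of why the theory matters in practice: membership tests on a Python list are O(n), while on a set they are O(1) on average, and the difference shows up as wall-clock time at even modest sizes.

```python
import timeit

n = 10_000
haystack_list = list(range(n))
haystack_set = set(haystack_list)
needle = n - 1  # worst case for the list: the last element

# O(n) scan: Python walks the whole list looking for the element.
list_time = timeit.timeit(lambda: needle in haystack_list, number=1000)

# O(1) average: the set hashes the needle and jumps straight to its bucket.
set_time = timeit.timeit(lambda: needle in haystack_set, number=1000)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
assert set_time < list_time  # the asymptotics show up as real time
```

No amount of "vocational" training substitutes for knowing, before you write the code, which of these two shapes your data structure has.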
Instead of this:
Total voters: 613
Total votes: 3168
Designing data structures (543 votes, 17%)
Show this:
Total voters: 613
Total votes: 3168
Designing data structures (543 votes, 89%)
(and similarly for the other options, such as "Analyzing algorithmic complexity (Big O)")
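For concreteness, the arithmetic behind the suggested change: 543 of the 613 voters picked that option (~89%), whereas 543 of the 3168 total votes cast is only ~17%. A two-line sketch of the two denominators:

```python
voters, votes, item = 613, 3168, 543

pct_of_voters = round(100 * item / voters)  # share of people who picked it
pct_of_votes = round(100 * item / votes)    # share of all votes cast

print(pct_of_voters, pct_of_votes)  # 89 17
```

In a multi-select poll, percent-of-voters answers the question readers actually have: "how many people consider this a gap?"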
Version control was glossed over in my class with little importance given to it.
In this respect I've only got an undergrad diploma in computing, and we covered 6 of these things in any depth; it would have been around the same time you were studying, and we never covered version control at all.
You know what I want? A twelve-week, intensive course on scientific computing in something like Python, for people who already know some core scientific subject and statistics. If we took http://www.drewconway.com/zia/wp-content/uploads/2010/09/Dat... as a guide (and yes, yes, data science is just a name for something that has existed for twenty years), I want a program to move people from traditional research to data science.
My CS degree hardly taught me anything in comparison to everything I learned outside of my coursework, but I don't feel like my coursework was lacking. On the contrary, my education allowed me to learn new things on my own that would have been significantly harder otherwise.
E.g., a basic algorithms course led me to get involved in TopCoder. An introductory course on software development (source control, testing, etc.) gave me enough knowledge to find an internship where I learned about peer reviews and how to work effectively with a large codebase - something that is very hard to pick up in a semester of college.
Things that were not covered included:
- Basic business skills
- Basic sales skills (We all have to sell our ideas, be it to peers, departments, customers.)
- Storytelling skills
- People networking skills
- Understanding that very few businesses have technology problems; they have business problems that they are trying to solve with technology.
- You don't understand anything if you don't understand how the data in any system inputs, exists, interacts, and outputs. You don't understand the business if you don't understand the data. You should not be allowed anywhere near building software for an organization if you don't understand the data.
- Understanding that the data in many ways is the system. No software/system is useful without data.
- Learning an organization's competitive advantage and magnifying it with software. Too often programmers misinterpret things and end up killing the competitive advantage.
- It's not about you, it's about the user. Users rarely care what you code in, or the frameworks, or whatever you've done to selfishly make your own life easier while continuing to neglect actually getting to know their world, their data, and the problems they face with it.
- Marketing skills - eliminate feature babble and focus on benefits once you've been in users' shoes working on users' problems.
- Communicating with the world to first understand, and then address, their needs. You are no better than a technically clueless business guy if you make equally blind assumptions about the business process.
- How the real world does not start every project with a clean slate like CS projects. This is probably one of the biggest gaps.
- Learning to learn other people's code, refactor + more.
Instead, we learned the theoretical, using many languages to build the same kinds of things over and over for 4 years. It was awesome to the geek in me. Sprinkle in some UI or database skills, and yes, I did get very good at learning any technology I needed, but not at implementing it in a meaningful, sustainable way. The real world used very little of what I learnt in those 4 years.
When I took CS in the late '90s I had to learn to build web apps, databases, load balancing, and server administration all on my own, not to mention getting a business education largely by diving in and learning to swim.
The social skills are really big. Partially I think CS attracts folks that aren't extroverted, and it can be a problem.
We're at or nearing a crossroads in my opinion where the entire world has come to the internet and technology, and those building technologies and online need to be better bridges in interfacing with humanity.
The downside was that more time on business meant less time on engineering, and you ended up with people who dropped out of the engineering school, combined with people who may never have been able to hack it there in the first place, and a tiny minority who could have passed in the engineering school.
This means that as a signalling tool (which is a large part of the value of a college degree in the workplace) a technology degree was far less valuable than an engineering degree, which causes a feedback loop to keep the number of people with actual engineering talent low in the Tech school.
This is from my observation of EE vs EET as I knew people in both majors. CS was in the school of science, not engineering so it had no equivalent in the school of technology.