All I can say is this: jumping into industry and spending my time on my own engineering and research, work that would be too risky for academia and that wouldn't have a clear payoff for a big-company R&D lab, is the best, most exciting decision I've ever made.
If my current endeavors pan out, I will actually be able to say I've created software (and an associated business, WellPosed Ltd) that makes innovation on our wee planet happen faster. I couldn't attack the high-risk, high-impact area I'm working on right now as a grad student or junior faculty, but as my own wee company, I can!
:-) (Anyone who's intrigued by building better shovels for the data-mining gold rush, whether as a user or a maker, shoot me a line at first name at wellposed dot com, subject: awesome shovels.)
I don't get this. The whole point of university research is that it is too risky (or benefits will be too far in the future) to make it profitable for companies to pursue. I don't see how anything could be too risky for the university, but not too risky for a business.
Edit: I'm not saying universities are perfect. I'm just saying that they still manage to do research that is too risky for industry. For example: finding the Higgs boson. I don't see any examples of industry doing research that is "too risky for the university."
Everything you do as university research is aimed at getting publications that move either you or your adviser toward tenure. The problem is that things that don't work are not usually publishable. This leads to 'fluffing up' results (read enough academic papers and you'll find some hilarity there) and avoiding anything that might not pan out.
Just as important is funding: even a small lab with a few computers and one or two grad students needs funding, and in CS that largely means DARPA or a handful of other government agencies. If your particular research interest involves eventually killing someone, you'll do fantastically well. There are of course other sources of funding, but they typically have much smaller wallets, especially the further you go along the 'for the good of humanity' scale.
I've seen countless cases where grad students are doing something really interesting, but because it's not going to help anyone get tenure and not going to bring in any funding, these students are strongly encouraged to 'get back on track'.
I get it. I recall specifically posing a promising, if not clear-cut, research topic to my grad school advisor, and his response was "that's high risk".
Academia is more concerned with the volume of publications rather than impact and unique ideas. There are tremendous disincentives in the system for innovation.
He claims universities are increasingly judging professors by their ability to get funding. And, due to budget cuts, funding is only going to areas likely to pay off. "Too risky for academia" would be too far off the beaten path to get funding.
That said, I don't know if it's true or not.
In some fields, this is very easy. It's easy to buy a bunch of cluster time on Amazon EC2 or something and run lots of machine learning/other research applications. This is awesome and is one of the great things moving the field forwards.
It is much more difficult in the physical and life sciences, where start-up costs are extreme. Many great ideas do not become reality because of how expensive starting a company in those fields is and how difficult it is to raise money there. Lots of VCs are cutting funding in these areas because of the comparatively high risk, and pharmaceutical companies are slashing R&D budgets in order to pick up smaller companies and secure rights to compounds, etc. Funding is very difficult in these fields, and the framework of academia is sometimes the only reasonable way to carry out original research.
So, when I read posts like this it leaves me somewhat disheartened. The reason it is so disheartening is that I agree with the author on many of his points.
The commoditization of education in particular hit home for me. It seems obvious that online programs like Coursera and Udacity are the future. Traditionally, if you had a world-class professor you were left with the fact that his impact as a teacher wouldn't scale. But now he can teach a million students not just a few hundred a semester. Why settle for a B level professor at some non-top 10 CS program when you can learn from the people who wrote the seminal textbooks on the subject you are learning?
The feeling I'm left with is akin to the feeling I have with the digitization of books. Again, the benefits make the rational part of my brain see that it is the future. But I can't help feel like some part of the experience is lost in the process.
Is there going to even be a place for me in 20 years when I'm looking to teach Computer Science at a state school?
Some of the experience is lost in the process. We need not dance around or wonder about that; it is. But what will be gained will be greater than what is lost, or we'd go back to the old ways, which we won't.
A mass-produced medium-class bed isn't a magnificent hand-carved bed adorned with various and sundry Greek gods and goddesses. But the former scales, and the latter doesn't, and sitting and wringing hands about the experience lost when your bed isn't gloriously hand-carved is missing the point of the mass-produced bed quite badly.
"Is there going to even be a place for me in 20 years when I'm looking to teach Computer Science at a state school?"
A place? Quite likely. But probably not standing in front of a class giving a monologue, because in 20 years the whole "give a monologue to 50 people barely paying attention, hand out assignments to be done one week later, trickle out feedback about performance two weeks after that" model will be considered laughably quaint, and our grandchildren will ask us why on Earth we ever expected anyone to be educated with such a terrible model. It may not be a state school, either.
A moment of silence for some of the old nuances may be called for, but what is coming is a tidal wave, not a couple of incremental advances. It won't be 100% positive, but effectively nobody will be seriously advocating going back to the old ways in 20 years.
The first step is to characterize exactly what is lost moving from a traditional model to a Khan-academy-like model. Then to determine how to focus on those things. Then to experiment.
To me this is the future of teaching for most teachers: whereas currently they may spend ~10% of their time one-on-one with students, in a future model I can see that figure climbing to 80-90% as students struggle with their homework after watching their Coursera/Udacity lectures.
Not sure how you'll feel about that but personally I always found individual instruction to be the most challenging and rewarding part of teaching :)
More of that kind of teaching could be a great thing.
What do people gain from spending four years in college? Many of them learn something, but most of the value is in the degree, which serves as a signal of competence in the workforce and is a requirement for most jobs. Online programs probably won't provide the same kind of signaling in the foreseeable future. Moreover even students who are primarily interested in learning are unlikely to get as much out of online programs as they do out of college; the rewards structure of college drives many students to work much harder than they would on their own, and (without the signaling reward) online programs can't reproduce that motivation.
Lectures are a very small part of the value proposition that universities make, and of course their functionality is already mostly duplicated by textbooks. Indeed, I'm not entirely sure why American universities primarily teach their students with lectures right now (versus books, video, Socratic method, etc.). I'd guess that the answer is some combination of tradition, students having a fundamental psychological preference for live teaching, and society in general preferring to believe that colleges are selling "education" rather than signaling, motivation, and lifestyle. In particular, I don't at all believe that the dominance of live lecture is driven primarily by a lack of good video materials. And availability of good video materials is the only thing that I see Coursera and Udacity really changing.
So: I'm not sure if there will still be state school professorships in 20 years, but I wouldn't bet against it.
(Note: for the record, I'm a fan of Coursera and Udacity. I also enjoy universities as they are today, though I wouldn't be heartbroken if they changed dramatically.)
I would add to the list one other important factor: getting papers accepted at desirable journals feels less like success for me now than it did when I was younger. The 14th edit of a paper just feels like pointless tedium and is time not spent making or testing something. And once you get a faculty appointment all the hands on time vanishes, and you're more of a professional writer/editor. I still need to be working with my hands.
That said, I'm finding it somewhat more difficult than I expected to transition to industry. I find that a lot of companies don't understand that graduate school and postdoc level research are not like college. I am also frequently treated with skepticism (from the CEO level to HR) that I really want to leave academia after developing such a strong academic CV. There still seems to be a lot of misunderstanding about the differences in environment from both sides.
Good luck!
Some of my difficulty comes from the fact that I'm limiting my search to the greater NYC area. If finance was something I was willing to do, there would be more opportunities for my background.
In doing so I came up with a (super-scientific!) "professional satisfaction formula":
http://blog.arturadib.com/the-formula-for-professional-satis...
1. Online education is not mass-produced in the way he claims; the cost of serving each video lecture someone downloads is incredibly small. Popping up tutorials on YouTube with ads is profitable and sustainable, and that's really all you need for someone who can dedicate themselves to learning more about a topic.
2. I have no idea where he pulls his "winners win" suspicion.
> why wouldn’t every student choose a Stanford or MIT education over, say, UNM?
If we have an open model of education, people will have the freedom to spend a minimal amount of time looking through a plethora of lectures and figuring out what they like best. In the end, you'll hear the best lecturers echoed by the students who watched them. I'm making many assumptions here, and I won't ask you to forgive me for making them. But we can't overlook the assumptions that the author is making either, and there are a lot of them.

3. Supposedly, online education kills the personal connection: an argument that has been made many times. Maybe I should point to a popular counter-example: Khan Academy. A very small group of people who manage to supplement the education of an incredible number of people who want the extra help. They do it for free. They are happy with the results. And the people they do it for are very happy with the results. The argument he's making sounds like there's something to be yearned for, but I don't buy it.
Were it not for this section and his rant on the "Funding Climate" (I really don't want to argue about politics here), I would not have thought this to be the ramblings of an angry old man. He may have some valid points, but it's hard not to see this as a giant middle finger to his previous employer as well as venting now that he's at his new job.
The personal connection in education is more complicated. It is certainly important, and I don't think Khan Academy is a good counterexample (they apparently have plans for personal mentoring, but the focus seems to be the videos). I could imagine a model where an online course by a distinguished professor is combined with face time with a local mentor, but it would still be a narrower experience than going to a bricks-and-mortar university.
None of that is to say that online education is a bad idea - I think the advantages outweigh the downsides, and it looks like the author does too. But we shouldn't ignore the downsides, and the time to think about mitigating them is now.
First you have to convince some one (a funding agency, an investor) that your idea is interesting and will work.
Then you have to manage yourself, other people and other resources to work your plan.
Then you have to show off your results.
The real world is not strictly a meritocracy, not even a good approximation of one. What people (reviewers, customers, the market) like is often not very tightly coupled to the technical merits of the work.
Based on your past success funding agencies (angel investors) will be more or less willing to fund your next venture.
In the end, industry involves a whole lot more money and academia gives you a different kind of satisfaction, but in both I think it is what you make of it: how you manage your time, how much stress you take on, and what kind of work-life balance you strike.
The obvious answer is "more funding", but there are also cultural shifts, wherein academia is excoriated for not doing "Real World Work".
Is part of the solution to drive upward from K to PhD, focusing on critical thinking and humanities, educating more than training?
I don't know.
The idea, which was plausible, is that projects can then be judged on their merits, instead of funding coming out of one big slush fund that produces who knows what. But the unintended, if not unforeseeable, side effect is that it adds a huge amount of extra overhead, and incentives to target only "fundable", a.k.a. "sellable", research. With NSF funding rates currently running at 5-10% of proposals, and typical large research universities expecting you to have 1-3 of these 3-year grants going at any given time, you need to be submitting 10+ grant proposals a year! And ideally also working your networks to see if you can attract some corporate funding. That's a huge amount of overhead, and it also sucks a lot of the appeal from academia: rather than the university setting giving you freedom, you're in some sense closer to an independent firm that has to bring in its own funding, constantly chasing the next round of financing lest your lab implode and your students go hungry.
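(A rough back-of-envelope check on that "10+" figure, assuming two concurrent grants and a roughly 7% success rate, both within the ranges quoted above: 2 grants on 3-year terms means you need about 2/3 ≈ 0.67 new awards per year, and 0.67 / 0.07 ≈ 10 proposals submitted per year.)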
There are likely people who will thrive in that environment, but I think it's increasingly going to be people who are skilled at research management and sales. The #1 job is attracting external financing, and the #2 job is heading up a successful mini enterprise with that financing, ensuring the lab is operating well. This is becoming pretty close to explicit. One university I've been at now sends around an internal newsletter ranking faculty by number of dollars brought in so far this year! If that's what you're going to be judged on, why even be in academia?
They don't. As far as I can tell, all the problems, bar those that stem directly from funding, have always been there. That's the nature of the institution.
What changed, for CS at least, is that Google provides much of what is attractive about academia, but has all the funding required to make the other roadblocks go away. Google is not like Xerox PARC or MERL or Microsoft Research or other industry research labs. It's a company that's built upon being a research lab. I can believe the heydays of Sun and SGI were probably similar.
The easiest way I describe working at Google to my grad friends is "It's a giant grad lab, except grads are also the ones running it, and they're billionaires." Academia can't compete with that, and it never did. The open question is not how to fix academia (as it never will be), but whether the Google model is sustainable. For as long as the Google model does exist, you will always see a net loss of professors to industry, rather than the other way around.
Google does seem like a great place if you're a senior enough researcher, though. People like Peter Norvig, Ken Thompson, and likely the author of this linked article, seem to get basically 100% freedom to work on whatever they want, with minimal management or job requirements, which is pretty much the ideal position to be in as a researcher. I suspect not all Google employees get Ken-Thompson-level freedom from having a boss, though.
My feeling was that taking the money (even less money as you wouldn't have the manager/bean-counter overhead) and splitting it amongst a large number of small projects would have been far more productive in terms of actual research output.
I was just part of an FP7 EU project in which 11 organisations from 7 different countries participated... and gosh, the admin burden of that is horrible. I got to be a work package leader, and unfortunately that means you can't really do a lot of R&D. The number of forms you have to fill in, and all the reporting to Brussels, is amazing. Added to that, I think the whole "publish or perish" attitude of academia just is not for me. So what if I only publish one paper from the project? Yeah, everybody thinks I am a loser and that my participation was not successful (nobody says that, but...). It feels as if research is done just with the objective of publishing papers... not only that, but colleagues actually say it: "but you know, we have to see how we are going to translate this research into publications".
On the other hand, I also worked on a smaller 3-year EPSRC project with 3 universities and 3 industry partners. It "felt" better, although the 3 industry partners did not really care (each meeting they sent someone new because the previous person was either out or doing something else, so the new guy didn't know anything about the project and was just there to fill a seat). In this project, at least each academic partner was doing whatever they wanted and just presented their work, and at the end of the project something "practical" came out of it.
Loved this quote!
I haven't even heard of that school.