I'm trying to understand the industry, as it appears (at least to me) to be different from what I thought was true. I believe this is important if we're going to do better, and there are plenty of metrics showing we should do better (the percentage of projects failing, the percentage exceeding budget and time, the percentage becoming unmaintainable).
If you could prove that only a handful of people are capable of actually developing a software project past the stage of 'piggy-backing' on libraries, that would probably distinctly change the way we develop software. Maybe we could prevent death marches better. Maybe we could improve our working environments so nobody has to crunch or get stuck in a depressing spaghetti-code maintenance job.
It doesn't mean in any way that 'plumbers' should/would be treated worse. If anything, I would expect the opposite.
It reminds me of how MIT changed their intro-to-programming course from the Scheme-based one to a Python-based one, because "the SICP curriculum no longer prepared engineers for what engineering is like today. Sussman said that in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts. The goal of SICP was to provide the abstraction language for reasoning about such systems. [...] programming today is “More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?'. The analysis-by-synthesis view of SICP — where you build a larger system out of smaller, simple parts — became irrelevant." (http://www.posteriorscience.net/?p=206)
Also reminds me of Vernor Vinge's "Zones of Thought" novels, where in the far future the starships don't exactly have programmers, but rather a kind of software archaeologist who assembles systems from software components that may be a thousand years old.
Failures here are almost certainly related to a lack of adequate mentorship more than anything else. College doesn't go half the way toward preparing you to be a successful engineer.
There are people out there who are self-motivated enough to do better, but in almost all of those cases they're building skills that get the dirty work done without the best practices necessary in a collaborative engineering environment.
Your first employer/team, and their ability to mentor and develop new engineers, has a huge impact on your success as an engineer. Really capable engineering mentors are worth their weight in gold (diamonds? printer ink?) and their contribution has an exponential effect.
This is something I'm hearing a lot at the moment, and not just about engineering. What would you say college taught you?
In college, programming assignments:
- are well specified and known to be completable
- start from a blank slate
- produce relatively short programs
- once complete and accepted, will never be run or looked at again
- must be completed individually
Whereas in a real software engineering department:
- goals will be to some extent vague, fluid, and possibly contradictory => requiring negotiation skills with PMs, customers, etc.
- you will nearly always be adding to an existing project => requiring the ability to read code and perform archaeology
- programs end up huge => requiring schemes for better organisation, modularisation, etc.
- programs have a long life and a maintenance overhead => requiring design for adaptability
- projects require collaboration => requiring use of a VCS, giving up complete freedom to choose tools, and techniques like CI and feature branching for managing incomplete vs. complete work (see the sketch after this list)
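To make that last point concrete, here's a minimal sketch (in Python, with hypothetical names like new_checkout_flow) of the feature-flag pattern many teams pair with CI: incomplete work gets merged onto the main branch but stays unreachable behind a switch, instead of rotting on a long-lived branch.

    import os

    # Hypothetical flag registry: real teams usually back this with a
    # config service or database so flags flip without a redeploy.
    FLAGS = {
        "new_checkout_flow": os.environ.get("ENABLE_NEW_CHECKOUT", "0") == "1",
    }

    def is_enabled(flag: str) -> bool:
        """Return True if the named feature flag is switched on."""
        return FLAGS.get(flag, False)

    def legacy_checkout(cart: list) -> float:
        # Battle-tested path: stays the default until the flag flips.
        return sum(cart)

    def new_checkout(cart: list) -> float:
        # Incomplete work merged "dark": on main and exercised by CI,
        # but unreachable in production until the flag is enabled.
        return round(sum(cart) * 1.08, 2)  # e.g. new tax handling

    def checkout(cart: list) -> float:
        if is_enabled("new_checkout_flow"):
            return new_checkout(cart)
        return legacy_checkout(cart)

The point isn't the flag mechanics; it's that collaboration forces conventions like this so one person's half-finished work doesn't block everyone else's completed work.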
Others' comments about the difference between school work and real work are spot-on.
It's funny how much I hated group projects as an undergrad, but how in some ways they were the best preparation: How do you still get things done when everyone has different ideas, varying levels of competency, available time, and motivation?
The 'how to learn' bit is (and has been, for the last 20 years for me) massively helpful. It's rare that I think back to a particular thing I learned about (it still happens, though), but I cherish knowing how to move from one subject to another when trying to work out how I should solve something and where I should look next.
What learning those things does do is drastically increase your future flexibility as a developer - new databases, new languages, new jobs entirely, whatever. It's all built on the same primitives and if you have that fundamental understanding it makes it easy to ramp up on new technologies given you have the willpower and motivation. There's still a learning curve for specialized fields (of course) but that's fine.
Colleges may well have adapted since I left, but the main issue is that people aren't really holding you to the standards of software that exist at capable software firms. Correctness is about all that matters in university. Students don't know how to optimize for testability, maintainability, deployability, monitorability, etc. And learning and developing those skills makes you far better at the 'correctness' bit too.
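To make the testability point concrete, here's a minimal sketch in Python (the function names are mine, not from any particular codebase). The commented-out version is 'correct' but nearly untestable because it secretly depends on the real clock and a live database; the second version takes those dependencies as parameters, so a unit test can pin them down.

    from datetime import datetime, timezone

    # Hard to test: hidden dependencies on the real clock and a live DB.
    # def log_signup(user):
    #     db = connect_to_production_db()
    #     db.insert("signups", user, datetime.now())

    # Testable: the clock and the store are injected, so a test controls both.
    def log_signup(user: str, store: list, now=None) -> dict:
        """Record a signup event; `store` is any object with append()."""
        ts = now or datetime.now(timezone.utc)
        event = {"user": user, "at": ts.isoformat()}
        store.append(event)
        return event

    # No database and no flakiness: a plain list stands in for the store,
    # and a fixed timestamp makes the output deterministic.
    def test_log_signup():
        store = []
        fixed = datetime(2020, 1, 1, tzinfo=timezone.utc)
        event = log_signup("alice", store, now=fixed)
        assert store == [event]
        assert event["at"] == "2020-01-01T00:00:00+00:00"

And that's exactly the sense in which designing for testability feeds back into correctness: the testable version is the one whose behavior you can actually verify.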
There are some courses that are collaborative, but in industry the code you write can affect hundreds or thousands of other engineers, and there can be real economic consequences of issues in your work (see: plenty of interns/new hires who have had the opportunity to kill $100,000-$1,000,000 or more in revenue by taking down a site - not blaming them, it's just an issue that actually exists in the real world). The order of magnitude is just so different.
This isn't a problem per se; I don't think universities should be expected to perfectly prepare you for this (which is why internships are crucial, and are one of the strongest interview signals for new grads). But somebody does have to, and the onus is really on employers of new grads to raise functional engineers if they want to have top-notch engineering teams.
I'll be honest - I didn't really grok CS until my first internship had passed, but that one summer really changed both my existing knowledge and my desire to build those skills further. I'm really grateful to have worked with some people that sparked that interest in me. I was at a 2-fulltime-dev startup with a ton of opportunity to work on different pieces of the stack, and it was just tremendously fun.
Side note: an interesting thing about taking down applications is that as software enterprises get more mature and taking down a site becomes that much harder, it feels (to me) like new engineers in your organization actually have less opportunity to learn by doing on the foundational pieces. This is a very bizarre catch-22 that I suspect has real consequences for the growth of new engineers in software organizations. Very hard to calculate that effect, though.
We don't need a cult of brilliance. What we do need is an atmosphere of humility. Modern software/hardware systems are of breathtaking complexity. That turns out to be more than humans can hold in their minds as a whole, yet we still develop software as if we could.
For a while, we had a glimpse of what a simpler world could look like (the early days of the web, when everything was GET/PUT/POST). We promptly proceeded to layer complexity on top of that.
And that's OK, because it gave us a lot more power. But we pay a price for that power. And every time we attribute that price to lack of brilliant people, we mostly show that we haven't even come close to understanding what it even is that makes projects succeed.
The genius myth is just magical thinking in disguise.
Even as a business programmer, I agree that it's a good idea to occasionally step into some places that are a little more low-level. Yet from what I've seen (playing around with embedded systems and VSTs and the like), the actual coding process is more similar than different. Both in process ("the basics" of Javascript translate to some degree to "the basics" of C), and even at the "lower level", libraries are used quite frequently. For instance, JUCE is a package pretty frequently used for developing VSTs (VST itself is an SDK). These packages save time and allow developers to focus on the meat of the audio plugin: the DSP algorithms.
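For a sense of what 'the meat' looks like once the framework handles hosting, GUI, and audio I/O, here's a minimal one-pole low-pass filter sketched in Python for readability (a real VST would implement this in C++ inside the plugin framework's processing callback; this is illustrative, not JUCE code):

    import math

    def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100.0):
        """Smooth a signal by attenuating content above cutoff_hz."""
        # Smoothing coefficient derived from the cutoff frequency.
        a = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
        out, y = [], 0.0
        for x in samples:
            y = (1.0 - a) * x + a * y  # one-pole recurrence
            out.append(y)
        return out

The recurrence itself is a couple of lines; everything around it (buffers, parameter handling, plugin lifecycle) is what the library buys you.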
At a certain level, what happens is that you get into the realm of math/engineering problems. And there definitely are differences in mindset and focus between engineering and business work; someone who's good at one is not necessarily good at the other. I just don't think the coding aspect of that separation is as stark as you make it out to be.
Could you explain how such proof would lead to those changes?
> death marches better. Maybe we could improve our working environments so nobody has to crunch or have a depressing spaghetti-code maintenance job
These aren't software problems, they're business and social problems. No conceivable level of productivity improvement will eliminate the death march.
If we split it into "framework writers" and "application writers" then organizing teams along these lines might improve efficiency.
I train in machine learning and other areas, and I often draw the same "framework/application" skill-level distinction made here, where 'framework' just means the meta-development activity.
What the OP comment appears to be saying here aligns pretty exactly with my experience of working and teaching.
I don't think it would matter.
In one instance you're debugging an Apache server, in the other an in-house server implementation. You can hand-jam a CSS file or use a preprocessor to help. You can create your own SPA implementation (as I painstakingly and naively did) or use any of the hundreds of existing ones. So do you want to debug business logic, or all of your in-house implementations plus your business logic? External tools are not perfect, but the idea is that they've been battle-tested, so it's known where they shine and which edge cases were missed. On top of that, decisions about the tooling must still be made. Relational or NoSQL? You must still know the difference when choosing.
It has everything to do with management practices, organization, anxiety, fear, or a personal wish to be seen as a hero.