As a somewhat older developer, I find this a surprisingly difficult question to answer honestly. Comparing myself to myself from 10 years ago, I sincerely think I'm more effective, but self-delusion may play a part in that. I've probably lost some of my "step", in terms of raw capacity to memorize and compute mentally, and I have more commitments outside of the world of software, which dilutes my efforts further. Then again, the strategic ideas I have are more dependably correct, and I spend less time chasing down dead ends, either because I've been down them before or had the good luck of witnessing them second- or third-hand.
I've had the chance to watch a world-class developer up close between the ages of 36 and 45. He started this period as, very easily, the greatest engineer I'd ever even heard stories of, and I'm pretty sure he got better over that decade. It can be done.
I'm quite sure I am much more effective today than I was 25 years ago, but a lot of that has to do with cognitive prosthetics.
So, I empathize strongly, but I think the issue isn't that we need to use prosthetics so much as the conventional wisdom that "that's a devops problem."
At least us old fogies have had the last 15 years to learn web development. How the hell do young programmers learn such a big stack in a few years? I'm guessing half through youthful energy and the other half skipped over in blissful ignorance?! :)
At 39, I solve tasks that in my 20s I couldn't even dream of approaching. I attribute it to the much higher-level languages I use today (mostly Haskell) and, of course, to experience in various fields.
I like that term.
"When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."
When you are older you may really spend less time chasing dead ends, I agree with that, but to be fully honest with oneself, some of those ends might turn out not to be so dead anymore 10 or 20 years after you last bothered to visit them.
"Almost everything is possible."
More precisely: "If a distinguished but elderly scientist passes judgement on an idea, it is probably possible. Only ideas whose possibility is not judged might be impossible."
[disclaimer: 44 yo ;o] [ps. google has mitigated memory loss, i suspect.]
Now, this all assumes that developer activity on Stack Overflow is roughly equally representative across most ages. If it is, then the data plainly states that any random developer you interview is more likely to be more knowledgeable (according to the definition extracted from Stack Overflow activity) the older they are. The fact that there may be fewer developers at an older age is irrelevant.
Disclaimer: I'm 25
Measuring the latter in terms of the former is highly prone to survivorship bias. The OP's data is evidence in favour of progress over time, but it's weak. Now, progress over time doesn't sound like such a silly idea, so even weak evidence counts for me.
The text didn't seem to do much besides present the stats. But he says this:
"I knew that with age coders tend to switch careers, but I was surprised to see the size of the drop. After the peak age of 27, number of developers halves every 6 to 7 years."
I figured that "survivorship bias" was sort of the exact point he was trying to make. Beyond that, there's really no discussion of causation (getting older makes one more "mature" or whatever), so I think it's implied that weeding out less committed devs is exactly what's going on.

An alternate explanation is that for some reason older developers are more likely to be addicted to Stack Overflow.
A big problem here is the unproven assertion that high SO reputation means you are a “better developer.” Does it really? (After all, with few exceptions, the more active you are on SO, the higher your reputation, period, regardless of your answer quality, partly because downvoting is strongly disincentivized. And the article itself notes that older programmers don’t receive significantly more upvotes per post!) Until that’s shown, the article’s conclusion is highly suspect.
Frankly, I’m embarrassed so few people seem to be calling out the terrible reasoning behind this post. It may well be that older programmers are “better,” but what we have here is nothing more than a colossal failure to understand science, reasoning, and evidence.
The population of "older Stackoverflow users" is not randomly drawn from the population of "older developers", and nothing is put forward to claim that the former is representative of the latter, so you cannot make this assumption.
Older engineers tend to compare new problems to experiences from the past. The tools at our disposal have become much better but the fundamental mechanisms haven't changed that much so it's easier to identify whether there's a real benefit to using a new tool or if it's better to stick with what you have.
As an experienced developer it's a little easier to avoid sinking effort into novel but misguided technologies.
As a young developer it's a little easier to be open-minded about promising technologies.
But don't pay any attention to me - my SO rep is less than 30% of the average for my age bracket...
On a barely related subject, tomorrow is my 49th birthday.
Thanks for blogging and contributing on HN, you've really improved my understanding of software, and how to think about software.
Happy Birthday.
Happy birthday to raganwald!
And I agree with 6ren. You definitely look like a 20-something.
However, this isn't exactly proven by the data: what Stackoverflow shows is that older developers are better at talking intelligently about programming. That's extremely useful (and helps career wise), but it isn't the same thing as being a better developer. Sometimes it correlates (the best programmers I've known have also participated in organizations like IETF, written RFCs and have also thoroughly documented their work), but it isn't a total ordering (I know plenty of programmers who are better than I, but who don't participate in any public forums).
On the other hand, I've yet to find a successful programming language made by someone before their thirties. Contrast that with some of the most groundbreaking academic work in Computer Science and Mathematics being done by people in their twenties.
[1] There's nothing wrong with that: Google's APM program particularly is a great example of "engineers who don't want to code" being extremely useful. See also "The Russian Lit Major" by rands: http://www.randsinrepose.com/archives/2006/09/06/russian_his...
* Dennis Ritchie was 27 or 28 in 1969 when C got going.
* McCarthy was about 30 in 1958 for Lisp.
* Sussman was 28 and Steele was 21 in 1975 for Scheme.
* Alan Kay made Smalltalk between 28 and 31.
And while it used to be true that lots of game-changing mathematics was done early, I don't see much of that lately. A huge amount is done by junior faculty and postdocs, but that's usually late 20s and 30s.
http://en.wikipedia.org/wiki/Yukihiro_Matsumoto
http://en.wikipedia.org/wiki/Ruby_%28programming_language%29...
I bet that if you could separate out the younger developers who will still be developing in 10-20 years, that their rep on SO is similar to that of older developers. Those developers that'll wash out in the next 5 years are dragging down the participation numbers of their peers.
So it isn't the age that's important, it's the personality type which is correlated to those people that'll stick with development.
Honestly, though, I think programming is a "young man's game", partly because you are sharper, have more energy, etc., when young; but mostly because when everything is reinvented each decade, you are better off starting fresh, without being aware of other choices.
The exception is for higher-level tasks, such as marketing, managing people, strategic business decisions, and code architecture. Also, I would think, language/library/API/framework design. I hesitate a little, because many of these are based on the needs of current programmers, which the front-line troops know better because they are doing it (they are the users). However, for deeper insights, age has the benefit of seeing deeper patterns over decades, and over generations of usage. Most language designers seem to be older (but is that just because their languages are now old?)
If you've been doing something longer, you've had time to learn what works and what doesn't, and why.
Thus, you can do a better job at guiding people who are newer to the material.
As a 29-year-old I'm kind of freaked out by the fact that I'm on the older part of that distribution.
For me, that is about 4 programming languages ago :-)
"So, senior coders earn their higher reputation by providing more answers, not by having answers of (significantly) higher quality."
A lot of people here are focused on "being smarter" or "doing a better job" or "higher quality."

Let's exclude all of the self-taught developers and limit ourselves to people who follow the standard "get a 4-year college degree, then go out into the real world and work" path, as that crowd is pretty sizable. Make extrapolations as necessary.
Remember your first year (or two?) of development? Looking back, you were probably in way over your head, had mentors looking after you, were making tons of mistakes, etc.
Fast forward 5 years. Can you write code faster? Probably not. Can you write better code? Sometimes. It's all about experiences and learning from them. When you take a new job, or a new project, or a new anything, you call upon past experiences to guide your efforts. It might be something as vague as "I am going to write tests first because I found it helped me earlier", or (ignoring TDD), "I'm not going to write this function like this, because I know the code will be hard to test when I get around to writing a test for it later."
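For example, here's a minimal sketch of that testability instinct; the classes and names are invented for illustration, not taken from any real codebase:

    import java.time.Clock;
    import java.time.LocalDate;

    // Hard to test: reaches for the system clock itself, so a test
    // can never pin the date down.
    class ReportBuilderHardToTest {
        String title() {
            return "Report for " + LocalDate.now();
        }
    }

    // Easier to test: the clock is injected, so a test can pass
    // Clock.fixed(...) and get a deterministic result.
    class ReportBuilder {
        private final Clock clock;

        ReportBuilder(Clock clock) {
            this.clock = clock;
        }

        String title() {
            return "Report for " + LocalDate.now(clock);
        }
    }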
You learn this all from experience. Senior people who have been in the field longer have more experience. They aren't "better" in the sense that they are smarter or have more intelligence, they just know MORE because they've been exposed to more.
It's also why so many people (especially on Hacker News) have been successful without degrees. It's not the degree that matters; it's EXPERIENCE.
It might be a fine line differentiating between smartness gained from pure intelligence and "smartness" gained via experience, but I think it's an important distinction, and one that I think this post highlights well.
And obviously, the fact that this only takes data from SO means whatever the conclusions, they only hold true for the kind of people that post there.
Age doesn't necessarily have anything to do with knowing the right answers. (My first experience being a greybeard actually came within a few weeks of starting my first job.) Part of it is investing the effort to gain expertise, part of it is being smart and/or lucky about investing your effort in the right areas, and part of it is having the background and aptitude to absorb knowledge. Wherever it comes from, being able to answer questions that stump other developers provides a big gain in productivity, because you spend more time doing work and less time struggling with trivialities. If the ability to answer questions increases with age and quality of work doesn't decline otherwise, then the average value of developers increases with age.
So anyway, my first experience being the greybeard, even though you don't care. I was walking around in the office when someone flagged me down. There were three senior developers standing behind a junior developer, all huddled over staring at his screen. They were trying to figure out a snippet of Java code that looked something like this: set?foo:0
"It must be a valid identifier of some sort or it wouldn't compile." "If it is, where is it defined?" "Question marks and colons aren't allowed in variable names. We checked that three times in the book." "Maybe it invokes an implementation-specific feature in the compiler we're using. Whatever it is, it's some kind of deep magic." "Maybe there's a bug in the compiler and it's treating the question mark as a combination semicolon and comment character." Apparently they had been at this for quite a while and were repeating the same ideas they had had an hour ago. They were scared to just "fix" the code because it had been written by a "really smart guy" who had left the company, so it must be right.
"Sheesh, haven't any of you guys written any C?" I said. Two minutes later I had restored a junior developer and three senior developers back to productive work, saving them God knows how much more wasted time.
That's the value of knowing random things off the top of your head.
You're seriously telling me that they were stumped with "set?foo:0"??
And you stayed? :-)
Is it your assumption that they are just more invested in educating others, or is it, e.g., because they have a broader range of experience and can answer more (and more difficult) questions?
As it is, this data set is merely an interesting conversation starter. I hope somebody takes it and does some research on it, because it sure could be interesting.
See "Generativity vs. Stagnation" in this list of Erickon's Psychosocial Stages of Development:
http://www.psychpage.com/learning/library/person/erikson.htm...
Most people get into their career in their twenties. For computers it's often even younger. But it seems relatively rare for someone to pick up programming in their 30s or older.
If you're 40+ today, your twenties were ending around the time that the "tech boom" was beginning, but there just wasn't as much information and inspiration around for getting into software development. Even through the mid-90s, a college degree and programming skills were no sure ticket to cushy employment.
It's hard to say what this says about developers as a whole, including the ones not on SO, which I assume is a large number.
An alternative hypothesis might be this: Good developers get better over time, whatever that means.
I would suspect that people who form bad habits early on don't enjoy the same benefits of experience as those who built on solid foundations.
In addition, this only represents SO programmers, who, while a great bunch, are hardly representative of all programmers.
The vast majority of the highly-regarded teachers I know are ones who have been doing it for 25 years or more.
Stackoverflow is not representative of the overall developer population. As an example of an alternate view compare the relative number of items tagged on SO with C#, Java, and PHP. Then compare that to the number of listings of those tags on Dice.
The older someone is, the less chance they'll spend time on an internet community site, especially at work, which is when a lot of people access stackoverflow.
Old dudes work while they're at work, because they learned their work ethic in pre-internet times.
Is this an actual bell curve? It's not symmetrical. (http://en.wikipedia.org/wiki/Normal_distribution)
Edit: Shouldn't have said that. Everyone knows basic stats here and I was being pedantic.
Analogously, if I measured academic skill and availability by time spent in libraries and time spent teaching, I'm sure we'd see a similar peak in the 20s, because grad students spend so much time doing both of these things and productive professors need them less.
The older the developer, the less curious they are, hence the fewer questions they have.
Of course you could say it's because the older developers are more experienced. But would that quality per se quell the thirst for new knowledge?
Just thinking aloud...
That's what I've learned about SO: it's fine for popcorn questions, but for the in-depth knowledge and discussions... meh.
This probably isn't enough to explain the entire effect shown in the graph, but in 10 years (if people still use SO), it would be.
Of course, we only know they are older, not that they have more years of experience.
You can still do a fit to a bell curve and find a chi-squared value. If your chi-squared is horrible, obviously you should be considering a different probability density function, as your model is incorrect; but if it's decent enough, you can overlook those features and call it a "bell curve."
That being said, I could see this as a mixture of two Gaussian bell curves, with means corresponding to the ages of people in college and in their early/mid career (say 23 and 28, respectively).
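To make that concrete, here's a rough sketch of what computing such a chi-squared value could look like, with invented bin counts and eyeballed parameters rather than the actual Stack Overflow data:

    public class BellCurveFit {
        // Gaussian density with mean mu and standard deviation sigma.
        static double gaussian(double x, double mu, double sigma) {
            double z = (x - mu) / sigma;
            return Math.exp(-0.5 * z * z) / (sigma * Math.sqrt(2 * Math.PI));
        }

        public static void main(String[] args) {
            // Hypothetical histogram: users per 3-year age bin (made-up numbers).
            double[] ages =   {20, 23, 26, 29, 32, 35, 38};
            double[] counts = {40, 90, 100, 70, 40, 20, 10};
            double total = 370;   // sum of counts
            double binWidth = 3;

            // Single-Gaussian model; mu and sigma guessed by eye, not fitted.
            double mu = 27, sigma = 4;

            // Pearson chi-squared: sum over bins of (observed - expected)^2 / expected.
            double chiSq = 0;
            for (int i = 0; i < ages.length; i++) {
                double expected = total * gaussian(ages[i], mu, sigma) * binWidth;
                chiSq += Math.pow(counts[i] - expected, 2) / expected;
            }
            System.out.println("chi-squared = " + chiSq);

            // For the two-Gaussian mixture idea, expected would instead be
            // total * (w * gaussian(x, 23, s1) + (1 - w) * gaussian(x, 28, s2)) * binWidth.
        }
    }

If the single-Gaussian chi-squared came out horrible and a two-Gaussian mixture's didn't, that would support the two-population reading.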
I know when I was first starting out as a developer, I spent a lot of long days and weekends working without really getting a whole lot accomplished. As I've matured, I've gotten better at picking out what's really important and am now much more productive while working fewer hours. I expect this trend to continue as I gain more experience.
[1] http://stackoverflow.com/users/22656/jon-skeet

In my experience, older developers hold on to old coding habits that are today considered dangerous, and they are reluctant to change that.
If you're curious about the down-votes, it may be because the above quote is an example of the particular kind of bigotry commonly referred to as "ageism". For future reference, if you are ever in a management position, that kind of thing is actionable in a work environment. You could get your backside sued off, is what I mean, if you are found to have used that in a hiring decision, for example.
What you say about the HR liability potential agrees with what I've heard from HR people, but I've never seen it be a factor in reality. In practice, I can't imagine suing over a development gig. I can think of a million bad reasons people might not hire (or might let go) a good developer (e.g., questions about manhole covers), but the "he's a curmudgeon who's reluctant to slap stuff together quickly" argument seems halfway legitimate.
So I say, bring it on, let's have an open discussion about who's smart, who's fast, who's wise, and when it even matters.
My personal experience is that older developers just don't get it. They haven't kept up with the exponential increases in productivity that we have had in the last 5 years. Things that used to take 2 days 10 years ago can be done in 2 minutes now, but they are still used to thinking that they did it quickly if they finish it in 2 days, so that's how long it takes them to do it.
Also my personal experience is that older developers can't handle the asynchronous nature of modern communications very well. They always want to work on only one thing at a time and get confused/ much slower if they have to work on multiple things, whereas younger developers will happily be able to switch in between tasks while waiting for the previous task to finish compiling/running without problems.
This is the case with everyone. Multitasking negatively affects overall performance. There are many studies that confirm that. If some developers avoid it, maybe it means they are better at evaluating their own productivity?
Twitter, text messages and facebook have trained our minds to work differently. Have you been keeping up?
The younger developers get confused/much slower when they work on multiple things, too, they just don't realize it. I'm pretty sure there is experimental data showing that everyone performs worse when they switch tasks often.
I actually just watched The Social Network for the first time a couple of nights ago, and one of the things I appreciated most about the movie is how they would yell at people who tried to interrupt someone who was coding with a question ("Don't interrupt Chris, he's in the zone!"). I suspect that attitude had a lot to do with the productivity needed to get Facebook off the ground, and they were certainly all young people.
(Or maybe the writers of The Social Network just made it all up.)
I would love to know where you've seen exponential increases in productivity in the last five years. In my experience, developer technology has been largely chasing its own tail for quite some time. Productivity improves dramatically on individual, emerging platforms, but seems little changed in a wider, overall view.