Should I worry? Do you think that some form of AI will be able to do the job of an average programmer any time soon? If yes what is your estimate? And how would you try to AI-proof your career?
A very large part of what remains is the bit which cannot be automated: modelling real-world (business) processes in terms of the systems of automation which are available.
Programming is a modelling activity which is about phrasing sequences of available actions to represent a process. If AI systems generate code, then programming becomes the sequencing of AI prompts -- which is then just a more natural-language version of programming.
Even in that world a significant amount of technical skill is required to ensure commands are sequenced correctly, the code is correct, etc.
For "AI" to replace this process it would not only have to be AGI, but also AGI fully embedded in the human social world of the processes we are modelling.
My observation was that a lot of my colleagues had no appetite for reasoning about processes, much less thinking through various edge cases to make sure the work was done correctly and covered enough cases to be a useful workflow with low incidents.
Colour me skeptical, but I'm not convinced we will see an AGI that can solve business problems without killing the proverbial cat, at least not without lots of babysitting.
So is programming the business of copy pasting from stackoverflow or is it the business of solving problems?
Both, but what you’ve missed is that you’re still putting some devs out of work. And solving business problems is absolutely on the front burner for AI right now, so give it a few years and it will solve that too.
This is not programming. This is not what I had in mind when I signed up for Computer Science school.
> modelling real-world (business) processes in terms of the systems of automation which are available.
In other words, programming by analogy.
But if you mean algorithm design, that isn't programming. Algorithms aren't programs, and the "operations" that they "sequence" are abstract. CS algorithm design is more like geometry.
Programming is an empirical discipline; it uses the "geometry" of CS to build applications.
Senior devs will be able (they already are) to generate code, at first boilerplate and gradually more complex code, and effectively work as planners and passive reviewers, in a similar way to how some companies just hire legions of juniors with some architects/seniors guiding and reviewing their work.
The problem with that flow, I think, is that it completely disrupts the junior to senior pipeline. Senior roles might be valued even more than today, but reaching that stage or simply entering the market might become much more difficult.
I feel my career is pretty safe, but I’m not sure about someone joining the industry 5 years from now.
That could put junior developers out of a job just like glass terminals put many data entry and server room operators out of jobs.
My guess is that we’ll eventually see some kind of “higher degree bootcamps” that accept people with junior baseline skills and take them to the senior level, since learning on the job might become less feasible.
Ask someone who isn't a software engineer to guide you in developing a simple program. You do all the actual coding, they just tell you what they want to happen. You will find that most people are unable to describe even high-level actions in the form of a coherent, procedural sequence.
Even the best code-generating AI won't solve this problem, since it cannot generate useful code if the operator cannot articulate what they actually need.
You need to have highly developed critical thinking skills in order to be able to formulate appropriate solutions to problems. What a developer does isn't so different from what an entrepreneur does; they spend most of their time explaining their vision and goals; the difference is that the developer's vision and goals are much more granular and detailed.
For the same reason that companies pay $2000 per day for an experienced consultant when an employee could theoretically build the same stuff at minimum wage. Sometimes, mistakes are expensive. And then you need people who can reason about why they are doing what they are doing. AI can maybe churn out CRUD better than other generators, but when you have any significant amount of money depending on the software working, nobody is going to use ChatGPT without a human code review.
But ChatGPT code is typically overly lengthy and complicated, just like what a beginner would produce. And that makes for expensive and slow code reviews. That's why in the end it's cheaper overall to skip all that and just hire a professional.
But in my experience, a small and highly skilled team already outcompetes these armies right now.
Working as an "IT consultant" after a 2-week boot camp course has always been unsustainable. We will likely get rid of 90% of the current "software engineers" without any meaningful reduction in productivity.
Yes.
> If yes what is your estimate?
5 to 10 years will see a noticeable decrease in the number of programming jobs as we know them today.
> And how would you try to AI-proof your career
Historically, tech changes end up leaving a small rump of niche-based practitioners, e.g. blacksmiths servicing riding stables and racing yards, while the majority either exit the industry or take up the skillsets for the new technology. To future-proof against AI, it's either finding the niche in a shrinking market or changing skillsets.
In terms of those skillsets:
Without Artificial General Intelligence there is going to be a need for someone to translate human requirements into something that the "machine" - however sophisticated - understands _and_ verify the results afterwards. That sounds very much like some form of Behaviour Driven Development.
As to niches: there are a lot of complex, ill-defined but essential Cobol, Perl, PHP and Python systems floating around. Verifying new translations is going to be expensive. QED: specialists keeping those existing systems ticking along are likely to be a thing long enough to make a career from.
Just as vertical integration in the auto industry killed off the auto startup ecosystem, vertical integration in the tech industry will kill off tech startups. This isn't because there won't be demand for innovative new tech or that startups won't be able to innovate, but because control of core platforms will allow the bigger players more leeway to crush and swallow smaller companies as well as to siphon their profit margins.
Think of what AWS is doing to Elastic, on a large scale.
Once the tech startup ecosystem dies (which could be soon; high interest rates will suck capital away from startups), the behemoths will probably stop innovating and slash headcounts.
Once that happens, I'm pretty sure that the stewards of capital and captains of industry will scapegoat AI and the Economist and Time magazine and the like will dutifully believe them and so will most of the people who read them.
Not too different from interpreting lighthouse scores.
It can make anyone with the means to access it a bad coder but the value of a professional has always been in picking the best solution from the possible ones.
When you actually decide on the best solution, you list pros and cons, weigh risk and cost, etc. These can all easily be automated through least-cost optimization, without even using AI.
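To make that concrete, here is a minimal sketch of the kind of least-cost weighting the comment describes. The option names, criteria, and weights are entirely made up for illustration; the point is just that the "weigh pros, cons, risk, and cost" step reduces to a weighted sum with no AI involved.

```python
# Hypothetical example: picking among candidate solutions by weighted cost.
# All names and numbers below are invented for illustration.

options = {
    "rewrite_in_house": {"risk": 7, "cost": 9, "maintenance": 3},
    "buy_saas":         {"risk": 3, "cost": 5, "maintenance": 2},
    "patch_legacy":     {"risk": 5, "cost": 2, "maintenance": 8},
}

# Relative importance of each criterion (must sum to 1 here, lower score = better).
weights = {"risk": 0.5, "cost": 0.3, "maintenance": 0.2}

def total_cost(scores):
    # Plain weighted sum over the criteria.
    return sum(weights[k] * v for k, v in scores.items())

# The "decision" is just an argmin over the weighted costs.
best = min(options, key=lambda name: total_cost(options[name]))
print(best)
```

Of course, the hard part in practice is choosing the criteria and weights in the first place, which is exactly the judgment the comment says professionals are paid for.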
How good the results are, depends on how good you are at constructing the queries. It's a bit like using google - some people have only the most basic knowledge, while others can find pretty much anything, because they're really good at writing queries.
Now, imagine how it's going to be 10, 15, 20 years from now. The future models will probably cull a chunk of devs, while the good devs will be even more efficient.
But who knows, maybe this will actually help some of the mediocre programmers to focus their energy on other things? Like taking on other roles? Take me for example - I'm no rock-star programmer. I see programming as a means to an end, it's just a tool. I would much prefer to focus on the actual business logic and features of a product, maybe even long-term strategy. If most of the tedious coding was removed, that would make me happy.
> wake me up when it starts to delete bad code instead of writing more of it
I too don't follow the AI space closely, but I have a hard time imagining it doing anything of the sort. It probably can't even remove a bug in a trivial program.
Correct me if I'm wrong of course, if there's a video that demonstrates something important/impressive related to this then I would certainly watch it.
That said, the best way to anything-proof a career this lucrative is to be frugal and set yourself up to retire early.
But regarding your last question:
> how would you try to AI-proof your career?
Learn to program from first principles.
Data Oriented Design (2014):
https://www.youtube.com/watch?v=rX0ItVEVjHc
Solving the Right Problems (2017):
The AI-complete bit is stuff like reading a set of instructions and realising that one of the instructions doesn't make sense given the other ones, and pushing back. It's looking through an UPGRADING.md file to see how an API changed, realising that the thing you want isn't there, then looking at the actual source code of a library to understand an undocumented breaking change. It's understanding how to write a program such that it's easy to parallelise in the future.
>It is unlikely that AI will completely replace programmers. While AI and machine learning technology has advanced significantly in recent years, there are still many tasks that require human creativity and intuition, such as coming up with new ideas, solving complex problems, and making decisions that involve subjective judgement. Additionally, as AI and machine learning technology continues to advance, it is likely that new job opportunities will be created in fields related to these technologies, such as developing and managing AI systems.
>It is unlikely that AI will completely put programmers out of work. However, it is possible that certain tasks currently performed by programmers, such as routine and repetitive tasks, could be automated by AI in the future. This could potentially lead to job displacement for some programmers. It is also possible that the increasing use of AI could lead to a shift in the types of skills and expertise that are in demand in the job market, potentially making some programming skills and knowledge less valuable. However, as AI and machine learning technology continue to evolve, it is likely that new job opportunities will be created in fields related to these technologies, such as developing and managing AI systems.
ChatGPT has been featuring quite a lot on HN
I had literally just asked the same question to ChatGPT
and it's asking an interesting/relevant question to ChatGPT
I do get your point about spam though
We work on:
- The why: business/leadership, and to some extent, product determines key goals (KPIs, OKRs etc.) and markets we are addressing. Us engineers are only informed parties here.
- The what: product, and to some extent, engineering determines which features to build or improve to attain these goals in a user-friendly and sustainable way. We create projects and design sketches, assign time constraints to them.
- The how: engineering, and to some extent, product ships these features in a maintainable, scalable, performant and supportable way without disrupting existing user experience.
A very large part of this work involves creative processes and logical reasoning about business, UX, software engineering problems. (Of course, that's why I love it :))
Only a tiny amount of this work is "writing boilerplate code" or copying code from Stackoverflow - which ChatGPT is presumed to automate.
Of course, more senior engineers are faster at writing boilerplate, but their speed mostly comes from 1) knowing the existing codebase 2) using the right tools & abstractions.
Moreover, most of the risk involved in the process is not the time taken to write boilerplate i.e. working on something for too long - but rather working on the wrong thing, or doing an implementation that's too slow or hard to maintain (change, test, fix, extend, reason about).
All in all, when I think about software engineering from this perspective, I don't see AI automating it away anytime soon.
I could, though, imagine AI being your TDD coding companion. You write some unit or acceptance tests for a service module and the AI generates the code for it. You'd still need to thoroughly review and test the code though. This would work well for basic CRUD/boilerplate modules, but not for anything involving business logic.
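The test-first workflow described above might look something like this sketch. The `UserStore` class and its method names are invented for illustration: the human writes the assertions at the bottom first, an AI companion proposes an implementation like the class, and the human reviews it.

```python
# Hypothetical sketch of an AI-assisted TDD loop. UserStore is a made-up
# CRUD module; in the imagined workflow the tests come first and the
# class body is what the AI would generate for review.

class UserStore:
    """Minimal in-memory CRUD store - the kind of boilerplate an AI might write."""

    def __init__(self):
        self._users = {}
        self._next_id = 1

    def create(self, name):
        # Assign a fresh id and store the record.
        user_id = self._next_id
        self._next_id += 1
        self._users[user_id] = name
        return user_id

    def read(self, user_id):
        return self._users.get(user_id)

    def delete(self, user_id):
        self._users.pop(user_id, None)

# The human-authored acceptance tests that would drive the generation:
store = UserStore()
uid = store.create("ada")
assert store.read(uid) == "ada"
store.delete(uid)
assert store.read(uid) is None
```

For plain CRUD like this the tests pin down the behaviour almost completely, which is why the approach plausibly works there and not for subtle business logic.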
Nevertheless, this would still remain just a small part of a software engineer's work, in my opinion. What do you think?
- AI will hypercharge us, but we'll still need to know what we're doing.
- Polyglots will be more common (since nailing syntax won't take nearly as long)
- Increased importance of understanding framework conventions, architecture and 'the system'
- More time debugging crappy AI-written code!
- Demand for software development will increase (not decrease) since time to market and cost will go down, making many more projects viable.
Nope. Experience has shown that tasks that require peak intellectual abilities in humans are actually very easy for computers to do. Computers were outperforming humans at calculation 80 years ago, and have been crushing grandmasters at chess since the 90s.
Meanwhile, controlling a robot to move efficiently, or reliably distinguishing everyday objects like cats and dogs, is still extremely challenging for current AIs, which require more data than any human could see in a thousand lifetimes to perform at a remotely adequate level.
It's "menial" jobs that will be the last to go. Because those rely on innate abilities grown over hundreds of millions of years of evolution. A task like programming that was only invented three generations ago is trivial by comparison.
Perhaps I’m misunderstanding your statement.
Understand all that and create an app for me that does the work X.
Yes. The questions are: "How soon?" and "Which programmers?"
> Should I worry?
Are you satisfied doing a job that could be done by a simple script?
> Do you think that some form of AI will be able to do the job of an average programmer any time soon?
This is already the case.
Additionally, consider that so many programmers are so bad at their jobs that the average is dismal, and also that computer programming isn't that hard.
> what is your estimate?
I have a comment here on HN predicting it would occur by approximately last year, so we are slightly behind schedule from my POV.
> How would you try to AI-proof your career?
I wouldn't. The very idea is counterproductive. I want to maximize my effectiveness, not lock-in a dead-end "career" of make-work that could be done by a machine.
I think what you're really asking is, "How can I earn a living once I can no longer do anything that is more economically valuable than what machines can do?"
Personally, my answer to that is to change the entire economy to a Star Trek mode or something like that. Or go live in a cave in the woods. Not actually great options, eh?
But yeah, broadly speaking, if you as a working programmer could write a script to replace yourself I feel you're obliged to do it, eh?
Not only is that your job: to write software to solve problems, to economically benefit the company you're working for, but it's also the way to avoid becoming a "zombie", working a pointless job just for an excuse to take home a paycheck.
Stay home and go on welfare (along with everybody else), we can call it Universal Basic Income so nobody's ashamed that machines are better at everything than they are.
No. AI is not a greater threat than stack overflow.
The skills of a programmer are not in generating 20 lines of code to solve a well-known problem. And even a 99%-accurate AI is next to useless, since finding the errors is exceedingly hard.
For a while I only had Copilot configured for VSCode and PyCharm, but I mostly use Emacs. The day that I took a little time to configure Emacs to use Copilot, it really hit me how useful Copilot is once I always had it available. Also, the ability to tab through multiple code completions lets me choose what I think is a good completion in a few seconds, or hit ESCAPE to discard them. I have been programming since 1964 (my Dad gave me access to a timeshared BASIC when I was a kid) so I can read code very quickly from almost 60 years of work and play.
I also find Copilot works well with my own bottom up, and REPL based development style.
I understand that many developers don’t like Copilot, but, we are all free to choose our own tools.
Anyway, I appreciate your comment even though my experience is different.
No developer is competing with Stack Overflow. These tools are enabling developers to quickly generate code for certain problems, which works especially well for boilerplate. But this isn't actually the main skill of a programmer, it is just a mechanical necessity of writing software.
Much of what developers do is modifying existing code, fixing bugs, designing architecture and solving novel problems. If an AI could reliably do any of these tasks, jobs would be endangered, but certainly that is not the case yet, and AI would have to come quite a long way before it is.
Why? If you set up service calls correctly, you have a client you instantiate and call some method on. Your service call should be one line plus error handling (1-3 more lines). Databases are initialized once, again 2-3 lines of code.
If you think any of this is a time saver, or is difficult, your job is at risk from copilot. You are training your successor.
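The shape described above - a client initialized once, a one-line call, and a couple of lines of error handling - can be sketched as follows. `PaymentsClient`, `PaymentsError`, and the endpoint URL are hypothetical stand-ins for a real SDK, included only so the snippet is self-contained.

```python
# Sketch of the "one line plus error handling" service-call shape.
# PaymentsClient and PaymentsError are made-up stand-ins for a real SDK.

class PaymentsError(Exception):
    pass

class PaymentsClient:
    def __init__(self, base_url):
        self.base_url = base_url

    def charge(self, amount_cents):
        # A real client would issue an HTTP request here.
        if amount_cents <= 0:
            raise PaymentsError("amount must be positive")
        return {"status": "ok", "charged": amount_cents}

client = PaymentsClient("https://api.example.test")  # initialized once

def charge_order(amount_cents):
    try:
        return client.charge(amount_cents)   # the one-line service call
    except PaymentsError as exc:             # 1-3 lines of error handling
        return {"status": "error", "reason": str(exc)}
```

If the wrapper around a call is much fatter than this, that extra ceremony is exactly the boilerplate a tool like Copilot absorbs.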
One aspect that's being slightly neglected is that programming isn't that special - tools like ChatGPT can potentially impact any kind of knowledge work. So it's not like there's a safe white-collar career track that you can easily move to.
The problems are mostly sociocultural, and I am reminded of "Agent Smith" commenting on humans rejecting a Matrix simulation that didn't suck for them.
However, most software being developed and maintained out there is not part of greenfield projects. It is new pieces being built upon the pieces that came before it. Years and decades of different styles, formats and convention.
Sure, in a few years these tools will speed up how quickly some parts of the UI or some internal logic will be created.
But I am not seeing these tools take over the biggest part of my job: connecting vastly different systems together.
AI is a dangerous and deceptive tool that requires wise and subtle nudging to make it work.
Results I have gotten from OpenAI are all terrible, for instance. I have to learn “prompt engineering” now. It's just high-level programming.
Who will review and maintain the code produced apart from a developer anyways?
If you think about it it's much less revolutionary in terms of reducing coding jobs than WordPress.
I think the true question is, whether we are going to treat human beings like trash, useless, now that their job has been automated away, or we as a society find a good way to deal with this and steer into a happier future.
It doesn't always work, and the reasons are interesting.
- nuclear conflict erasing 95% of humanity
- deadly pandemic erasing 67% of humanity
- devastating climate crisis erasing 80% of humanity
- impact event erasing 100% of humanity
IMHO any of these events has a higher probability than some AI taking over our jobs.
That's a limited analogy.
If you replace shovel with robotic excavator, it gets closer to what we have with current AI. It's not replacing jobs _yet_, but as soon as those excavators become fully automated, a single worker will be able to do the job of dozens, at a fraction of the time and cost.
And, yes, AI-powered excavators are a thing[1].
A closer analogy would be the trucking industry. Truckers are losing jobs _today_ as self-driving technology improves.
The same will eventually happen with software. Programmers will still be needed to drive the AI, but the productivity of one will be greatly increased, and human teams will be much smaller. Programmers won't be needed for simple tasks at first, until eventually only "prompt engineers" are left.
So I wouldn't say this is an existential risk yet, but our field will radically change within the next decade.
There's no shortage that is not followed by a glut.
There is a shortage of smart humans and has been for a long time. I've never heard of a glut of smart humans in history.
Same with self driving. We get some nice tools to assist drivers but replacement won't come in my lifetime.
One, it gives small blocks of code, that too, for the most common use cases. Two, the code often contains a few errors (doesn't compile) or has a few security vulnerabilities.
That leaves algorithm research, and so we can then spend 100% of our time on hard CS.
- Replacing plumbing work isn't trivial for AI at all.
- AI can also do algorithms research.
- Most developers aren't qualified or interested in doing algorithm research.
- There isn't that much demand for algorithms research, not enough to match the pool of developers anyway.
The last two points, I agree with.
But they can help with around 20% of the coding part.
(ii) already you can automate low-hanging fruit with Python + Excel, but it's not done
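The kind of low-hanging fruit meant above is usually something like summing a column out of a spreadsheet export. Here is a minimal sketch using the stdlib `csv` module as a stand-in for a real Excel reader (openpyxl would be the usual choice for .xlsx files); the data is inlined so the snippet is self-contained, and the column names are invented.

```python
# Toy version of "automate it with Python + Excel": total a column from a
# spreadsheet export. Real code would open("report.csv") or read an .xlsx
# via openpyxl; the inline data and column names here are made up.

import csv
import io

report = io.StringIO("item,amount\nwidgets,120\ngadgets,80\n")

# Sum the "amount" column across all rows.
total = sum(int(row["amount"]) for row in csv.DictReader(report))
print(total)
```

Tasks of this shape are a few lines each, which is why it is striking how often they are still done by hand.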
It will increase their demand, if anything.
In a capitalist economy, people are expected to work hard in order to produce goods and services, which drives economic growth. But if AI can do all of the work, there may be no need for people to work at all.
This could lead to a new economic model, one where prosperity is not tied to the labor of individuals. Instead, the focus could shift to ensuring that everyone has access to the resources and opportunities they need to thrive, regardless of whether they are working or not.