But I actually can’t imagine how you can teach someone to code if they have access to an LLM from day one. It’s too easy to take the easy route, and you lose the critical thinking and problem-solving skills required to code in the first place, and to actually make an LLM useful in the second. Best of luck to you… it’s a weird time for a lot of things.
Same here. Combing through discussion forums and KB pages for an hour or two to figure out how to solve a certain problem with a specific tool has been replaced by a 50-100 word prompt in Gemini, which gives very helpful replies, likely derived from many of those same forums and support docs.
Of course I am concerned about accuracy, but for most low-level problems it's easy enough to test. And you know what, many of those forum posts or obsolete KB articles had their own flaws, too.
Stackoverflow has its flaws for sure, but I've learned a hell of a lot watching smart people argue it out in a thread.
Actual learning: the pros and cons of different approaches. Even the downvoted answers often tell you something.
Asking an LLM gets you a single response from the median Stackoverflow commenter. Sure, it's infinitely patient and responsive, but it can never beat a few grizzled smart arses trying to one-up each other.
My main annoyance? If I'm in that same function, it still remembers the debugging / temporary hack I tried 3 months ago and haven't used since, and will suggest it. And heck, even if I then move to a different part of the file, or even a different file, it will still suggest that same hack at times, even though I used it exactly once.
Once you accept something, there needs to be some kind of temporal feedback mechanism that times out even accepted solutions, so it doesn't keep repeating stuff you gave up on 3 months ago.
Our codebase is very different from 98% of the coding stuff you'll find online, so anything beyond a couple of obvious suggestions is complete lunacy, even though they've trained it on our codebase.
TBF, trial and error has usually been my path as well, it's just that I was generating the errors so I would know where to find them.
It's hard to remember what it was like to be in that phase. Once simple things like using variables are second nature, it's difficult to put yourself back into the shoes of someone who doesn't understand the use of a variable yet.
There really shouldn't be. You don't need to know all the turtles by name, but "trust me" doesn't cut it most of the time. You need a minimal understanding to progress smoothly. Knowledge debt is a b*tch.
Should people really understand all the syntax there before learning simpler things like printing, ifs, and loops? Yes, I think it would be a nicer learning experience, but I'm not sure it's actually the best idea.
When it's time to learn Java you're supposed to be past the basics. Old-school intros to programming start with flowcharts for a reason.
You can learn either way, of course, but with one, people get tied to a particular language-specific model and then have all kinds of discomfort when it's time to switch.
I don't see how, barring some kind of transcendental change in the human condition. Simple lies [0] and ignore-this-until-later are basically how humans learn; you see it in every field and topic.
The real problem is not if, but when certain kinds of "incantations" should be introduced or dispelled, and in what order.
Consider how it's been done traditionally for imperative programming: you explain the notion of programming (encoding algorithms with a specific set of commands), explain basic control flow, explain flowcharts, introduce variables and a simplified computation model. Then you drop the student into a simplified environment where they can test the basics in practice, without needing any "incantations".
By the time you need to introduce `#include <stdio.h>` they already know about types, functions, compilation, etc. At this point you're ready to cover C idioms (or any other language) and explain why they are necessary.
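To make that concrete, here's roughly what the classic first C program asks a beginner to take on faith; every comment below marks an ignore-this-until-later:

```c
#include <stdio.h>       /* what's a header? what's the preprocessor?   */

int main(void)           /* why int? why void? who calls main?          */
{
    printf("Hello\n");   /* the one line the beginner actually means    */
    return 0;            /* returned to whom, and why 0?                */
}
```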
But, as a sibling poster pointed out: for now.
But unless you're teaching a kid who's never done any math where `x` was a thing, what's so hard about understanding the concept of a variable in programming?
Many are conditioned to see `x` as a fixed value for an equation (as in "find x such that 4x=6") rather than something that takes different values over time.
Similarly, `y = 2 * x` can be interpreted as saying that from now on `y` will equal `2 * x`, as if it were a lambda expression.
Then later you have to explain that you can actually make `y` be a reference to `x` so that when `x` changes, you also see the change through `y`.
It's also easy to imagine the variable as the literal symbol `x`, rather than being tied to a scope, with different scopes having different values of `x`.
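A small C sketch puts all three misconceptions in one place: assignment is a one-shot copy (not a live `y = 2x` relation), pointers are how you get actual reference behaviour, and `x` is tied to a scope rather than being one global symbol:

```c
#include <stdio.h>

int main(void)
{
    int x = 3;
    int y = 2 * x;          /* assignment copies the value once...       */
    x = 5;
    printf("%d\n", y);      /* ...so this prints 6, not 10               */

    int *p = &x;            /* a pointer, by contrast, really tracks x   */
    x = 7;
    printf("%d\n", *p);     /* prints 7: changes to x are visible via p  */

    {
        int x = 42;         /* a new x that shadows the outer one here   */
        printf("%d\n", x);  /* prints 42                                 */
    }
    printf("%d\n", x);      /* prints 7: the outer x was never touched   */

    return 0;
}
```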
At first it's all mystical nonsense that does something; then you start to poke at it and the response changes; then you start adding in extra steps and they do things. You could probably describe it as more of a Eureka! moment.
At some point you "learn variables" and it's hard to imagine being in the shoes of someone who doesn't understand how their code does what it does.
(I've repeated a bit of what you said as well, I'm just trying to clarify by repeating)
I think the mental rewiring that goes on as you move past those primitive first steps is so comprehensive that it makes it hard to relate across that knowledge boundary. Some of the hardest things to explain are the ones that have become second nature to us.
I didn't have any programming books or even the internet back then. It was a poke and prod at the magical incantations type of thing.