Heck, even computing education (and the profession itself!) has been propped up by GUIs. After my first year in CS, only three to five of us in a section of forty to fifty could compile Java from the command line or would dare edit PATH variables. I'm pretty sure that number didn't improve much by the time we graduated. A lot of professionals won't touch a CLI either. I'm not saying they're bad programmers, but the fact of the matter is there are competent professional programmers who pretty much expect a working machine handed to them by IT, and then expect DevOps to fix Jenkins when it's borked.
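To be concrete about what that CLI hurdle actually is, here's a minimal sketch of the workflow, assuming a hypothetical JDK install location (`/opt/jdk-21` is made up; substitute your own):

```shell
# Hypothetical JDK location -- adjust for your system.
JDK_HOME=/opt/jdk-21

# "Editing PATH variables": prepend the JDK's bin directory
# so the shell can find javac/java in this session.
export PATH="$JDK_HOME/bin:$PATH"

# "Compiling Java from the command line" is then just:
#   javac Main.java   # produces Main.class in the current directory
#   java Main         # runs the compiled class
```

Two lines of shell, but you have to know they exist and that PATH changes made this way vanish when the terminal closes.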
Remember: HN isn't all programmers. There are plenty of other kinds of users out there.
> But, if we assume the user has never seen a graphical application before, then likely all GUI tools will be useless too.
We don't even need to assume, we just need to look at history. GUIs came with a huge amount of educational campaigning behind them, be it corporate (e.g., ads and training programs teaching users how to use a vendor's products) or even governmental (e.g., computer literacy classes, computer curricula integrated into schools). That was, of course, followed by man-years upon man-years of usability studies, with the bigger vendors keeping consistent GUI metaphors across their products.
Before all of this, users did ask the questions that you enumerated and certain demographics still do to this day.
> Of all of that, CLI tools rely on some of the least amount of assumptions by their nature - they're low fidelity, forced to be simple.
"Everything should be made as simple as possible, but not simpler." Has it occurred to you that maybe CLI tools assume too little?