Most people are not creative. It's true. There isn't some horde of people who want to program but don't know it yet because they own a tablet instead of a computer. There isn't some horde of musicians who will never know it because their music comes from an MP3 instead of their own instrument. There isn't some horde of artists who will never know it because their images come from a camera instead of a paintbrush. Heck, even on a web forum where contributing content is as low-friction as sharing links, only ~10% of users do it. And only 10% of that 10% actually make the content that's posted. 90% of people are happy to passively consume content.
I wish the author were right. I wish there were this huge hidden demand for a real computer revolution. I still think that when I buy a device I should actually own it (which entails the freedom to modify it). But let's face it, idealistic nerd types: we lost. Most people are consumers, not creators. Get over it, go to work, program for them, and wipe away your tears with a stack of money.
(Also note that some popular activities are creative even though we don't usually think of them this way; cooking is no less creative than web development or graphics design, and lots of people do creative side projects when e.g. baking gifts for friends, or throwing a party.)
I see having to work for a living as the biggest obstacle to creativity. For most people, their job takes most of their time and energy. Between that, the commute, making dinner, and doing maintenance tasks, there's so little time and energy left that it's no surprise people are not very creative and opt to watch TV or go to a bar instead. We're being forced out of creativity and into consumerism. I believe that things like Universal Basic Income are worth it because they could reverse this situation.
The amount of media in all forms that is being created today is astounding. That includes digital drawing.
If coding were as easy as writing, we would see at least as much growth there. I know dozens of people with ideas who would try to code if it were as conceptually easy for them as writing a memoir.
Such a revolution could definitely happen.
What many people lack is not creativity, but the confidence and skills to creatively express themselves. The majority of people do not have the spark they were born with nurtured; they have it squashed.
When people argue that not everyone needs to learn to program, I contrast it with the idea that not everyone needs to learn to read. If you imagine the demand for competent readers before the printing press, it would be a similar situation to the demand for programming skill now. The ability for us all to read and share information is what has led to rapid progress. Imagine doing science without published text, where all of our scientific ideas had to be expressed in words and gestures at conferences. Such a thing just can't scale to the level it needs to for there to be any progress. You simply can't do science without the ability to read and write.
It's becoming clear now that you can't do science without the ability to program either. Computers are a far more powerful medium for sharing ideas than text. When we use only text, then to find anything of relevance in a big body of it, the author needs to write an "index" mapping alphabetically sorted words to page numbers, and manually update that index when the text is revised. (Fortunately, most papers are now written with software that does this automatically, but papers without hyperlinked contents/indexes are still abundant.) In schools, children are taught the basic skills they need to do this, but little more that could accelerate their ability to share ideas.
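To make the point concrete, the whole chore can be automated in a few lines of Python (the sample pages here are made up for illustration):

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the sorted list of page numbers it appears on.

    `pages` is a list of strings, one per page (1-indexed in the output).
    This is the job a back-of-book index used to do by hand.
    """
    index = defaultdict(set)
    for page_number, text in enumerate(pages, start=1):
        for word in text.lower().split():
            index[word.strip(".,;:!?")].add(page_number)
    # Alphabetically sorted words, each with its sorted page numbers.
    return {word: sorted(nums) for word, nums in sorted(index.items())}

pages = ["Science needs text.", "Text needs an index.", "An index needs code."]
print(build_index(pages)["needs"])  # every page mentions "needs": [1, 2, 3]
```

Rerunning the function after a revision regenerates the whole index, which is exactly the update step the author of a paper book has to redo by hand.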
We need to start thinking of programming ability as a core basic skill, alongside language, math, science, and the humanities. Most of it can be taught via math, language, and science, and it would give children, who are naturally creative, a skill they can use for absolutely anything. The idea that you learn to program in order to get a job as a programmer is complete nonsense; it's like suggesting everyone learns to write so they can become authors.
Programming is active/productive. Reading is semi-active/consumptive. The proper analogy would be the idea that everyone needs to learn to write well.
When I was in high school, the advanced math class had roughly two types of people: people good at arithmetic and functions, and people good at geometry and functions. People who were good at arithmetic but weren't good at functions dropped out quickly.
Go to engineering school and you will find software people who think of graphics as an afterthought. They remember numbers and words easily and enjoy squashing bugs. And they are good with function-like ideas that turn into programs: search engines, text editors, finding primes, brute-forcing passwords. (I don't really know, as I don't belong to their tribe.)
Then there are construction and mechanical engineering students. They won't code voluntarily, because they are crap at it. "What the fuck did I name that thing? How many steps should this while loop do?" It just doesn't compute. Yet these people can manage pretty massive concept designs and 3D models and whatnot. CAD work is fun to them, unlike for most CS majors. I'm in that group. While I can write small programs in Python, I don't enjoy what I'm capable of doing. Learning more seems futile, as I will never be actually good at it.
This divide into two groups shows up in later life too. Just check the prices of decent CAD tools and compare them to decent IDEs. For some reason, people who are able to enjoy both CAD and coding are incredibly rare. And that makes them very valuable.
If you could make a graphical, grown-up, and powerful language, it would tap a pretty huge reserve of brain power. Currently it might happen as some sort of hydraulic simulation tool or an AutoLISP plugin for AutoCAD.
I think the main problem with the computer revolution (if defined by everyone programming) is that many people don't like to program. Even those who could learn are really trying to accomplish something else as their primary task. Commercial and bespoke software is a faster way to get to work on their primary task.
I would tend to agree that there is a dearth of painters, musicians, etc. in the world. Of my close friends, only two of us can play a musical instrument. I think I have to agree with you that easily accessible programming is never going to result in lots of people programming.
In a lot of ways universal programming has gone about as far as it will go. Learning how to program (let me be clear, I do not mean learning how to be a good software engineer) now is actually easier, IMO, than learning a musical instrument. Of my closest friends, two of us play a musical instrument, but three of us program: there's me, the SE, my friend who is an Urban Planner and knows GIS software and R really well, and my other friend who is a patho-biologist and knows Python and R.
When has there ever been a huge hidden demand for revolutionary technological developments?
In my humble opinion, to which I am entitled, current Apple hardware is still well designed like the Apple hardware of the past, but none of it resembles a "bicycle for the mind".
These phones and tablets are "computers" but are programmable only by permission; they are consumption instruments that are meant to support some plan to dominate the communications, media, entertainment industries. Not my idea of a programmable, pocket-sized, networked computer.
All due respect to Apple and their wild commercial success, but looking to the future, I get more excited about my RPi or Teensy than I do about my Apple devices.
I have little interest in paying for a license to a bloated, complex, proprietary IDE (Xcode) and seeking approval from an "app store" when I can write ARM assembly from a netbook or laptop using a free and open source assembler and run it instantly on the RPi.
The revolution is yet to come. I hope. kparc.com/o.htm
What you are referring to is deployment. You want to be able to deploy or distribute your programs freely through the official channels. And because that is locked down, you consider it "programmable by permission." I would argue that this is not the case.
All I care about is the logic and elegance of programming. I don't care how my program runs: whether I write ARM code that is simulated through some App Store app, Lua code that is run through an iOS game engine, or code that I deploy directly to the hardware. That is immaterial, because I still get to enjoy the art of programming.
Apple's phones and tablets are programmable computers and they can be programmed through officially and unofficially distributed apps. Both free and paid, open and closed source. Just because the official distribution model doesn't suit your personal preference does not make these devices any less programmable computers.
Also note that Xcode is free and you can freely deploy apps from Xcode to your devices. So I am not sure I understand your criticism here. Nor do you need to even use the closed-source IDE (Xcode) when the compiler and language are open source.
The real issue is how easily you can code. 8-bit micros hit the sweet spot. No system available today comes close.
You powered up the machine, and the first thing you saw was a BASIC line editor. There was nothing else to distract you. It was instant-on with no setup.
You had to write code to use the machine at all. You even had to write code - albeit one line - to load a game from a tape.
For the gifted, BASIC led naturally to machine code and to graphics made by writing bytes into memory.
No modern environment has anything like the same simplicity, directness, or sense of natural progression.
Xcode, gcc, anything with a build system (never mind a package manager) are insanely complicated in comparison. They're so complicated professionals have to write books explaining them to other professionals.
Even Python - possibly the best candidate for a successor to BASIC - has a quirky IDE and two and a half different popular versions, and a lot of other complications that a BASIC cursor doesn't.
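To be fair to Python, the closest thing today to that instant-on prompt is probably its REPL, where a whole "program" can still be a single typed line - here a little star triangle, chosen purely for illustration, in the spirit of a one-line BASIC listing:

```python
# One line at the REPL: build and print a triangle of stars,
# roughly what 10 PRINT "*" experiments felt like on an 8-bit micro.
print("\n".join("*" * i for i in range(1, 6)))
```

The complications only show up once you step past that prompt into IDEs, version splits, and packaging.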
JavaScript? You really have to learn CSS and HTML and jQuery and $(infinitely long list of frameworks goes here) and - oh look, is that the time?
There is a huge difference between encouraging programming by making access to it friction-free and trivially easy - with a learning space that's comprehensibly small but not dumbed-down and toy-like - and merely making programming possible for users who don't mind hurdling a lot of obstacles.
That first category is completely empty today. It shouldn't be, but it is.
You are really wasting time hating on Apple and spinning against them (if you really are an Apple device owner), since Apple is the one who broke open the mobile device so you could run software without permission (before, you had to get AT&T's or Verizon's permission to put code on your phone) and who made high-quality development tools and platforms available for free.
Apple is the one who shipped the Apple 1 with BASIC... and they haven't stopped.
Posted from a mac
Where on earth are you getting this nonsense from? You could install unsigned applications without anyone's permission on old systems like S60 ages before the iPhone.
There were J2ME devices as well.
I was writing apps for my Nokia smartphone before the iPhone was even a project.
The article sort of gets hung up on form factor. Tablets are yet another window into the universe of computing, and there's a lot of creation happening.
I suspect that the real promised land will need VR for Lego-like construction of components and/or AI-assisted compilers enabling a sort of DWIM programming. As it is, programming is just too fragile to be of interest to the "laity."
The ComputerCraft mod is interesting, especially the version where you can use a simple GUI inside Minecraft itself to create Lua scripts that control robot "turtles" and computers in the game world: http://computercraftedu.com/
A seven-year-old can make programs (maybe from smaller functions that they've already written) that will hopefully find the diamonds, build the houses, and manage the farms of their Minecraft life.
This seems possible because of the limited number of objects and actions in a Minecraft world, while Minecraft is still rich enough for plenty of experimentation, trial and error, and aha moments...
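The real scripts are Lua, but to give a flavor of what a kid composes from smaller steps, here's a toy Python simulation - the Turtle class and the world model are invented for this sketch, not ComputerCraft's actual API (which has Lua functions like turtle.dig() and turtle.forward() that inspired the names):

```python
class Turtle:
    """A toy stand-in for a ComputerCraft-style mining turtle."""

    def __init__(self, world):
        self.world = world      # blocks ahead of the turtle, nearest first
        self.pos = 0
        self.inventory = []

    def inspect(self):
        """Name of the block at the current position, or 'air' past the end."""
        return self.world[self.pos] if self.pos < len(self.world) else "air"

    def dig(self):
        """Mine the block in front, if any, and pocket it."""
        block = self.inspect()
        if block != "air":
            self.inventory.append(block)
            self.world[self.pos] = "air"

    def forward(self):
        self.pos += 1

def mine_tunnel(turtle, length):
    """The kind of loop a seven-year-old composes from smaller steps."""
    for _ in range(length):
        turtle.dig()
        turtle.forward()
    return turtle.inventory

t = Turtle(["stone", "dirt", "diamond", "stone"])
print(mine_tunnel(t, 4))  # ['stone', 'dirt', 'diamond', 'stone']
```

The small, fixed vocabulary of actions (dig, forward, inspect) is exactly why the limited Minecraft world works so well as a first programming environment.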
Here's the reality, though: HyperCard was a failure. People didn't use BASIC on the Apple II. Some did, sure, but most didn't.
Most people are not programmers and not programming-inclined. Apple gives away the tools you need to build software for your iPad or iPhone, in the form of Xcode, which is FREE FOR EVERYONE to use. It's one of the best tools out there - when Microsoft was still charging thousands a year, Apple put theirs out for free. (And of course, these days Linux and GCC and Ruby on Rails and the whole open-source programming tradition put even more tools in people's hands.)
But here's the thing: most people don't care. It's never been a better time to be a programmer.
But the vast majority of people don't want to be.
That just means the tools aren't done yet.
My experience is anecdotal, from providing computer support and education over ~10 years, so obviously this should be taken with a grain of salt. But across all age groups and diverse backgrounds, those who aren't into computers seem to be intimidated by them. In trying to teach even the basics of the command line to the technicians I was training, often the biggest factor in how fast they learned was "are they intimidated by computers?", and even those who fit the stereotype of PC gamer nerds would still freeze up a bit when going through command-line basics like cd and ls.
It's true that quite a few people probably aren't inclined towards programming - the skills and mindset a good programmer has are fairly unique, much like a good composer's or a good artist's. It's a very delicate mixture of creativity, technical skill, logical reasoning, knowledge, and of course determination. (This list describes all of the above professions, imo.) The issue is that whereas it's perfectly acceptable for kids and adults to pick up an instrument or dabble in painting from time to time as a hobby, hobbyist programming just isn't as readily taken up.
It's why I scoff at articles that call the current generations of kids "tech savvy", since they're anything but. Most are still appliance users, and many can't even do everything the appliance advertises. I don't fault anyone but education systems that don't have strong support for and exposure to programming and computers at young ages. It's the same reason that second-language acquisition is so poor in the US: people just aren't being exposed to it at an age before social inhibitions kick in. The focus of the programs doesn't have to be to produce an army of super programmers, just a class of children who are familiar with the subject matter, much as they might be able to recall some fact about ancient Egypt or about agriculture.
I don't think we'll have a completely code-savvy populace for a long time, but it wouldn't be hard to make hobbyist programmers much more common.
All of the software, apps, and services out there should tell you that it's more than this.
I've seen multiple articles talking about people as young as 10 years old creating apps. This wasn't possible in the 80s and 90s.
I'm still not sure why adblock was thrown in there. It has only made it more difficult for indie sites and the average person to make money, and it is helping to create an environment where only large corporations can survive.
The same revolution happened with the music industry: unless you are signed to a major label, it will not pay the bills.
Yeah it was. Some of us started programming at 5 or 6 years old in the 1980s and 1990s. That doesn't mean we're better at it than people who start at later ages, but it does show that it was possible for kids to code (and I'm sure some even made money at it) back then.
PS: any source for the Alan Kay quote about ads?
I think it's important not to only see things as a 'computer specialist', especially if that perspective (perhaps rightfully) can lead to pessimism these days.
Throughout my childhood, the main reason why computers excited me was the promise of realizing all the sci-fi stuff I read about and saw on television: tricorders, virtual reality, video communication, voice- and touch-interfaces, zoom-in-and-enhance high-resolution maps, instant access to the knowledge of the world through some kind of AI (all voice-enabled, obviously).
And now, all these things actually exist (to a large degree), and in a device that I carry in my pocket!
The child that I was did not, for the most part, care about building these tools, or about being able to modify and inspect them. He cared about using them. And he's excited about the immense progress in what feels like a very short time.
This adult that I am, meanwhile, has a tendency to instead mostly complain about wifi issues, Siri not picking up my commands, the inability to install f.lux on my phone, app crashes, the new Google Maps interface, dropped Skype calls, the state of front-end development, and so on.
However justified that may be, I've found that focusing on what that kid wants and overcoming the issues that stand in the way has been a much better motivator than focusing on that adult.
To name a specific example: based on the articles and discussions here, I sometimes feel a bit... sad that I've mostly been working in web development since coming of age. Apparently we're reinventing the wheel badly, JavaScript is a pretty bad or at best mediocre language, HTML/CSS are terrible because they were not intended for app development, npm is a shitty package manager, and so on. Sometimes I even start feeling nostalgic for the good old days by proxy.
But then, when I finish a little journalling/project logging tool that scratches a personal itch, and I can instantly release that to the web and let my brother play around with it, or when I write a little bookmarklet that allows me to fold/unfold/upvote HN comments using the letters on my keyboard, well, then I feel good again.
Because then I remember that not that long ago I wrote a game in Delphi. It required trying to figure out how to do something based on random computer magazines and a single Delphi for Dummies book; it required waiting days for help from some dude in Florida who thankfully was happy to assist me. It required putting the game on a floppy disk and hoping that, as it was passed along to my dad and his colleagues, it would somehow get into the hands of others.
That's when I get excited again about working with computers, and the progress we've made. And that's the mindset that makes it easier for me to try and think about ways to get my younger siblings and others as excited about building and tinkering as I am.
I am particularly excited about Intel/Micron's 3D XPoint technology, which will be on sale next year. Extrapolating the exponential rate of improvement in storage technologies, I wouldn't be surprised if we had persistent TB storage at SRAM speeds by 2040.
There is also no reason to believe that Xcode or an equivalent will not come to iOS now that the devices are nearly performant enough.
As far as a completely open system like Smalltalk goes - that was a wonderful world which I wish we lived in, but we don't - not because of Apple, but because of malware, black-hats, and cyberwarfare.
However, Apple also very much limits the ability to run e.g. Squeak as an app on iOS: you may run it, but not download source code over the internet. And if you read the AnandTech iPad Pro review, you can see how much software is held back by the restrictions on exchanging data between apps.
I am writing this as an Apple user and owner of several iOS devices. iOS was a huge accomplishment in touch UI usability; now it is time to develop the computing part to similar levels.