Long variable names may take more characters, but there is no way they increase bugs.
Indenting(!) doesn't even take many more characters if you use tabs...
No, it's not supposed to be a joke.
Steve McConnell (1993) observed that defect density is proportional to source program length, so this should be obvious to every programmer: a small program (as measured in source code bytes) is more likely to be a correct program.
A major issue with discussing programming is the sheer number of people who believe they know how to program when any non-programmer could see quite obviously that they don't: a professional bridge-builder doesn't often fail to build a bridge, but a professional CMS programmer seems unable to get much past hello world without a bug or two. Less code will therefore produce fewer bugs.
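A minimal sketch of the arithmetic behind that claim. The numbers and names here are illustrative, not McConnell's; for simplicity it assumes a constant defect density per line, under which expected bug count scales linearly with program size:

```python
# Illustrative only: assume defects are roughly proportional to length.
DEFECTS_PER_KLOC = 15  # hypothetical density: defects per 1000 lines

def expected_defects(lines_of_code: int) -> float:
    """Expected defect count under a constant-density model."""
    return lines_of_code * DEFECTS_PER_KLOC / 1000

print(expected_defects(100))    # small program  -> 1.5
print(expected_defects(10000))  # 100x larger    -> 150.0
```

Under any model where bugs grow with length, shrinking the source shrinks the expected bug count.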
There is a summary of his advice here [1], but just to highlight:
> Describe everything the routine does. And we mean literally everything. If that makes the name ridiculously long, the name isn't the problem. Your routine is.
And
> Make names as long as necessary. According to McConnell, the optimum name length for a variable is 9 to 15 characters; routines tend to be more complex and therefore deserve longer names. Make your names as long as they need to be in order to make them understandable.
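As a concrete illustration of that guideline (the function and names below are invented, not McConnell's examples), a variable name in the 9-to-15-character range can carry meaning that a terse name would push into a comment:

```python
# Invented example: descriptive names in roughly the 9-15 character range,
# versus the single-letter style they replace.

def days_overdue(due_timestamp: float, now_timestamp: float) -> int:
    """Whole days elapsed past the due date (0 if not yet due)."""
    seconds_late = now_timestamp - due_timestamp
    return max(0, int(seconds_late // 86400))  # 86400 seconds per day

# Compare with the cryptic equivalent: def f(a, b): return max(0, int((b - a) // 86400))
print(days_overdue(0.0, 3 * 86400 + 100))  # -> 3
```

Whether names like these reduce bugs overall is exactly what the thread disputes; the point here is only what the 9-15 character guideline looks like in practice.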
I've read McConnell, and your claims are so completely the opposite of what he recommends that I'm still unconvinced you aren't trolling.
[1] https://blog.codinghorror.com/i-shall-call-it-somethingmanag...
"A study from Basili and Perricone found that routine size was inversely correlated with errors: as the size of routines increased (up to 200 lines of code), the number of errors per line decreased (Basili and Perricone 1984).
And the conclusion (on page 174): "None of the studies that reported decreased cost, decreased error rates, or both with larger routines distinguished among sizes larger than 200 lines, and you're bound to run into an upper limit of understandability as you pass 200 lines of code."
With that in mind, consider that KDB's SQL92 interface is 35 lines (the parser is 14 lines). It may be an extreme example of the effect McConnell was observing, yet one he himself was unable to learn from.
> I'm still unconvinced you aren't trolling.
Look at it this way: here is software that is faster than what you can write (or maybe you want to write your own taxi-problem implementation), and if you don't try, you will miss out on finding out how to do something that you can't do.
That downvote button is so easy: that internet person is just a troll; I've been programming for ages, so of course I know what I'm talking about. But if you look through my comment history, you might find it easier to convince yourself that I at least believe program length matters, and permit yourself a discussion about it. You might learn something.
* https://news.ycombinator.com/item?id=8476294
* https://news.ycombinator.com/item?id=8477064
* https://news.ycombinator.com/item?id=10872209
and so on.
Edit: see http://mathoverflow.net/questions/8295/origins-of-mathematic... for the confusion this causes.
- you need to write the same variable name at each step of a formula's derivation
- paper and blackboards don't have autocomplete
- juxtaposition is used for multiplication, so SwapBlue could mean Swap times Blue or a single variable called SwapBlue; using spacing to denote multiplication is not a good solution when writing on a board or paper
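The juxtaposition ambiguity can be made concrete. A hypothetical LaTeX sketch (SwapBlue is an invented identifier, not from any real paper):

```latex
% Naive math mode: typesets as the product S·w·a·p·B·l·u·e of eight variables
$SwapBlue$

% Explicit product of two multi-letter variables
$\mathit{Swap} \cdot \mathit{Blue}$

% A single multi-letter variable, marked as one identifier
$\mathit{SwapBlue}$   % or \operatorname{SwapBlue} for function-like names
```

On a blackboard there is no \mathit to disambiguate, which is one reason mathematicians default to single-letter names.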
Mathematics notation is so notoriously awful that mathematicians often can't even read equations from other disciplines without significant extra context.