I think Lisp still has an edge for larger projects and for applications where the speed of the compiled code is important. But Python has the edge (with a large number of students) when the main goal is communication, not programming per se.
In terms of programming-in-the-large, at Google and elsewhere, I think that language choice is not as important as all the other choices: if you have the right overall architecture, the right team of programmers, the right development process that allows for rapid development with continuous improvement, then many languages will work for you; if you don't have those things you're in trouble regardless of your language choice.
Thank you, Peter. This is how I have felt for years, but could never find words that describe it as well as you just did.
Someone should write a program that automatically posts this paragraph at the top of every language war thread. I think they should write that program in php :-)
But I've also been astounded at how slow CPython is compared to SBCL (the Common Lisp implementation I use) when I have to do long runs to gather data. (For the things I've been playing around with, my Common Lisp implementations have been something like 5 to 20 times faster.)
For example:

    a = 0
    b = 100
    s = 0
    for (i from a to b)
        s = s + i
    write(s)

If such pseudocode could be embedded into SBCL it would generate fast machine code, and it would also be possible to easily modify the pseudocode syntax.
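As a sanity check on what the pseudocode computes, it translates almost line for line into Python (reading "from a to b" as inclusive, which is an assumption on my part):

```python
# Direct Python translation of the pseudocode above.
a = 0
b = 100
s = 0
for i in range(a, b + 1):  # "from a to b", inclusive
    s = s + i
print(s)  # 5050
```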
And thank you PG/YC for giving me the opportunity to do so.
Communication is increasingly the more important part of programming. Engineering is only really useful if it's well communicated, or in a binary you trust.
oops, it seems it already happens today...
Well from what he writes, he seems to say Python looks much closer to their actual pseudo-code, and it's therefore easier for students to translate pseudocode to Python than to Lisp.
> In my opinion, it is better for most people because they were taught syntaxes
Such as English. One of the original goals of Python (inherited from ABC[0]) was to be a good teaching language. I'd expect that when Peter talks about his students, he's mostly talking about students with low-to-no knowledge of programming. Those who are already knowledgeable probably don't have a hard time adapting.
Redesigning a data format to match how I notate and think about it, to minimize my cognitive load, has been very helpful to me.
The long-term trend in computers seems to be to trade performance for helping the developer.
> when the main goal is communication, not programming per se
This reminds me of debates about the scientific method. Some say that you can test hypotheses etc alone without publishing and you are doing science; but others define science as a community activity, and so without publishing, it is not 'science'. While I love the idea of the lone researcher, and clever insights definitely come from individuals, without a community there is no SOTSOG.
1. Python doesn't suck. I was able to mix FP & OOP approaches to get to my goal fairly quickly.
2. IPython was fun to use and helped out a lot, but it's not SLIME.
3. Guido has an excellent goal with making code readable, and significant white space is not a bad choice. However, I find being able to analyze active data structures in a Clojure namespace to be a superior way to learn about a system.
4. Python's libraries are pretty good, and whatever you need has usually already been written. As a first impression, Python libs are much better to use than Clojure-wrapped Java libs. I'm going to look into porting SQLAlchemy to Clojure; it rocks.
5. Paster has a ton of functionality. I'd like to see a similar Clojure tool; maybe Lein can evolve into that.
6. I would like to see more FP constructs natively available in Python.
7. __method__ is an interesting convention. You can have an object implement the right method, and your object works with Python syntax. However, I find it to be a poor man's version of Clojure's protocols (Full Disclojure, I have a conflict of interests here).
8. Decorators are an interesting way to do functional composition, but I prefer comp and monads. Way more versatile.
9. INSERT MACRO RANT HERE
That's all I've got for now. I'm sure I forgot something.
SFD
edit: grammar & spelling
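Point 7's __method__ convention can be shown with a toy sketch (the Interval class here is made up for illustration): implement the right dunder methods and your object works with built-in syntax and functions.

```python
class Interval:
    """Toy example of the __method__ convention."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __len__(self):            # enables len(iv)
        return self.hi - self.lo

    def __add__(self, other):     # enables iv1 + iv2
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __contains__(self, x):    # enables `x in iv`
        return self.lo <= x < self.hi

iv = Interval(0, 10) + Interval(5, 10)   # Interval(5, 20)
print(len(iv))   # 15
print(7 in iv)   # True
```

The protocol is duck-typed: any object defining these methods participates, which is roughly the "poor man's protocols" point above.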
    def comp(f, g):
        def h(*args, **kwargs):
            return g(f(*args, **kwargs))
        # fix up h.__doc__ and friends
        return h
or simply lambda *args, **kwargs: g(f(*args, **kwargs)).

(I don't remember comp's semantics; is (comp f g) = f o g or g o f?)

too long; don't read:
Decorators are certainly cool, but semantically they represent something more like a pattern than a FP construct. A decorator represents something you might want to do to lots of functions, a property you want all instances of a function to have without writing it explicitly into each function. Function composition is more along the lines of having two functions which are interesting on their own, but which sometimes you want to compose.
With decorators, it would also be awkward to compose multiple functions. Observe:
    def compose_with(g):
        def decorator(f):
            def decorated_function(*args, **kwargs):
                return g(f(*args, **kwargs))
            return decorated_function
        return decorator

    def h(x): return math.sqrt(x)

    @compose_with(h)
    def g(x): return 2 * x

    @compose_with(g)
    def f(x): return x + 1
versus (for some reasonable definition of apply...):

    def compose(*fns):
        def composition(*args, **kwargs):
            return reduce((lambda computed, next_fn: next_fn.apply(computed)),
                          fns,
                          (args, kwargs))
        return composition

    # define fns as above without decorator
    hogof = compose(f, g, h)

(def my-decor (partial comp decor-behavior))
Done.
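A runnable version of the variadic compose sketched above (plain positional arguments only, applying functions left to right, so compose(f, g, h) is "h of g of f"):

```python
from functools import reduce
import math

def compose(*fns):
    """Apply fns left to right: compose(f, g, h)(x) == h(g(f(x)))."""
    def composition(*args):
        # Seed the fold with the first function's result, then thread
        # the accumulated value through the remaining functions.
        return reduce(lambda acc, fn: fn(acc), fns[1:], fns[0](*args))
    return composition

f = lambda x: x + 1
g = lambda x: 2 * x
h = math.sqrt

hogof = compose(f, g, h)
print(hogof(7))  # sqrt(2 * (7 + 1)) = 4.0
```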
Macros are the main reason I decided to create Adder, a Lisp-on-Python with minimal impedance mismatch. Unfortunately, the first macro-heavy program I wrote turned out to be really slow, because macros engage the compiler, which, of course, is in Python.
When I first tried it, it took something like 50s at 2.4GHz, virtually all of which was the compiler. (The compiler runs at load time; obviously, saving the compiled code for the next run would help.) I got it down to...let me try it now...7s at 3GHz, but that's still too slow for a 200-line program.
If anybody's interested, the code's on Github [1]. To see the macro-heavy example, look at samples/html.+, which is an HTML generator. The framework takes 169 lines; the sample page starts at line 171. To see the output, run:
./adder.py samples/html.+
[1] http://github.com/metageek/adder
(Edit: it requires Python 3.x.)
The Python byte-compiler, specifically, appears to be fairly slow. I hadn't really thought much about this, but while contributing to a benchmark yesterday (http://news.ycombinator.com/item?id=1800396), it wound up staring me in the face. There's actually surprisingly little difference performance-wise in running Lua from source vs. precompiled (both pretty fast), whereas the difference between Python and pyc in my benchmark was wider than every other possible pair in the chart except python interpreted vs. "echo Hello World". (I didn't have any JVM languages, though.)
I don't really do eval-based metaprogramming in Python, but do so on occasion in Lua. I thought I felt better about doing so because Lua is syntactically much simpler (and has scoping rules that make avoiding unexpected variable capture easy), but the Lua compiler itself also appears to be substantially faster than Python's. (It doesn't do much analysis, but still usually runs faster than Python.)
And yes, it's not as good as straight-up Lisp macros, but Lua's reflection also covers a lot of low-hanging fruit that macros would otherwise handle. The biggest thing lacking in Lua compared to Lisp is an explicit compile-time phase for static metaprogramming. (Code generation is an inferior alternative.) Lisp macros win big in part because they can avoid the overhead of parsing, but parsing Lua is fairly cheap thanks to its small, LL(1) grammar (http://www.lua.org/manual/5.1/manual.html#8).
I still think something like this would be worthwhile, and I wish you the best of luck! I particularly like your (.bar.baz foo) syntax. And good on you for moving away from bytecode generation; I also found that to be a dead end.
So what are your plans for macros+namespaces? Will macros from imports live in the same namespace?
And what did you mean by "Python supports only two levels of lexical scope" at http://www.thibault.org/adder/ ? I don't recall any constraints regarding lexical scope levels in the VM.
I'm tempted to create a LOOP clone for Clojure, then laugh villainously as I unleash it upon the world.
There's tons of really wild ideas in Interlisp that seemed to be the half-baked acid trip ideas of West Coast hippies at the time, that are just starting to become rediscovered in the past couple of years (pervasive undo -> reversible debugging, DWIM-like autosuggestions in more places), and even the implementation techniques used are still innovative (for example the error-trapping implementation of Conversational Lisp (http://docs.google.com/viewer?a=v&q=cache:4GnEnGS2XXkJ:c...) is quite similar to how Geoff Wozniak approached auto-defining functions (http://exploring-lisp.blogspot.com/2008/01/auto-defining-fun...)
Could you port CL's LOOP to Clojure easily? Or would its internal hyper-imperativeness make that tricky?
http://common-lisp.net/project/iterate/doc/Don_0027t-Loop-It...
DOOOO IT
Nit: significant indentation. Mostlanguageshavesignificantwhitespace, somemorethanothers (for instance, at least in 1.8, Ruby seems to have more whitespace issues than Python)
6. I would like to see more FP constructs natively available in Python.
FP constructs such as? Map is there. Reduce is one import away. There is nice syntax for list comprehensions. What do you miss? What exactly were you thinking about?
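For reference, the constructs mentioned really are each about a line away:

```python
from functools import reduce  # the "one import away" in Python 3

xs = [1, 2, 3, 4, 5]

doubled = list(map(lambda x: 2 * x, xs))     # map is a builtin
total = reduce(lambda a, b: a + b, xs, 0)    # fold from functools
evens = [x for x in xs if x % 2 == 0]        # list comprehension

print(doubled, total, evens)  # [2, 4, 6, 8, 10] 15 [2, 4]
```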
I'm sure Python has very good libraries, but I would find it constraining to program in a language without proper macros.
I do Common Lisp and C++ at my day job (ITA), and I do much of my personal hacking in Python. In Python and C++ I miss macros; in Lisp and Python I miss RAII and strong typing; in Lisp and C++ I miss dictionary literals.
And, in all of them, I miss algebraic datatypes.
Common Lisp does have strong typing. What it does not have is static typing.
I am at the SPLASH conference, and the Dynamic Language Symposium is happening right now. There is controversy over whether we can find a way to have the benefits of both static and dynamic typing in the same language. The great advances in type inference make me hopeful. The keynote speaker, Allan Wirfs-Brock, replied to my question about this with more pessimism. It is not a simple question; for one thing, not everybody even agrees about which factors are "pro" or "con" for either static or dynamic. I am not doing programming languages these days (I'm doing databases) but I continue to be hopeful.
So roll your own. Or use mine:
Python is strongly typed.
Also, just to jab at C/C++, pointers to void... really? It all but makes C/C++ a weakly typed language.
http://wiki.python.org/moin/Why%20is%20Python%20a%20dynamic%... http://www.artima.com/weblogs/viewpost.jsp?thread=7590 http://en.wikipedia.org/wiki/Duck_typing http://articles.sitepoint.com/article/typing-versus-dynamic-...
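A quick illustration of the strong/dynamic distinction (a sketch; "strong" here meaning no implicit cross-type coercion):

```python
# Strong typing: mixing str and int raises rather than silently coercing.
try:
    result = "1" + 1
except TypeError as e:
    result = "TypeError: " + str(e)
print(result)

# Dynamic typing: the same name may hold different types over time;
# the checks happen at run time, not compile time.
x = 1        # int
x = "one"    # now a str
```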
(obj a 1 b 2 c 3)

Here's a function that makes code harder to read:
def sumAList(aList): return 7
This doesn't mean that functions are bad.
Now there is an argument that macros make code harder to read in that I've yet to see a really good macro system that isn't dependent on the code having very little syntax (e.g. S expressions), since the more different the code is from the AST, the harder it is to manipulate the code successfully.
Combined with the fact that more syntax can make code much easier to read, there is a conflict here.
However, I don't think that's the argument you are making.
    do this
    do this and that
    and that too
    do this
    amen
No branching nor decision trees as not to confuse common folk. Now programming is a socially acceptable activity!
There are a few libraries that help make it easier (so you do not need to manipulate the AST yourself). For example:

    @macro
    def macroname(arg1, arg2):
        ... macro contents ...

There's some current information for you old-time Lispers, so next time you don't sound so dated in your Battles with Trolls in the great never-ending language war flames ;)

- reader-macros
- symbol-macros
- compiler-macros
- macrolet
- symbol-macrolet
EDIT: Fixed layout
"(1) It just turned out that when Google was started, the core programmers were C++ programmers and they were very effective. Part of it is a little bit of culture. (2) Early Lisp programmers (Erann Gat) at Google actually noticed that other programmers were equally or more productive. It has more to do with the programmer; we're getting to the point where language choice is less important (as opposed to 20 years ago). (3) Lisp is optimized for a single programmer or a small group of programmers doing exploratory work... If I want to make a change in a weekend I'd rather do it in Lisp than anything else, but by the time you get up to hundreds of programmers, making changes is not a language problem but a social one. (4) Libraries."
Paraphrased from: http://www.youtube.com/watch?v=hE7k0_9k0VA#t=03m20s.
When he finished Peter [Norvig] took questions and to my surprise called first on the rumpled old guy who had wandered in just before the talk began and eased himself into a chair just across the aisle from me and a few rows up.
This guy had wild white hair and a scraggly white beard and looked hopelessly lost as if he had gotten separated from the tour group and wandered in mostly to rest his feet and just a little to see what we were all up to. My first thought was that he would be terribly disappointed by our bizarre topic and my second thought was that he would be about the right age, Stanford is just down the road, I think he is still at Stanford -- could it be?
"Yes, John?" Peter said.
I won't pretend to remember Lisp inventor John McCarthy's exact words which is odd because there were only about ten but he simply asked if Python could gracefully manipulate Python code as data.
"No, John, it can't," said Peter and nothing more, graciously assenting to the professor's critique, and McCarthy said no more though Peter waited a moment to see if he would and in the silence a thousand words were said.
http://smuglispweeny.blogspot.com/2008/02/ooh-ooh-my-turn-wh...
Though, may I add that Python (or any other modern programming language) can manipulate its own code as data - only not as gracefully as Lisp. In other words, a Lisp program is its own AST - but in other languages the AST is only a "parse" away (and Python specifically makes computing it very easy).
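For instance, with the standard ast module a Python program can parse, inspect, and then execute its own source as data, just not as directly as quoting in Lisp:

```python
import ast

source = "print(1 + 2)"
tree = ast.parse(source)             # source -> AST: the "parse away" step
print(ast.dump(tree.body[0].value))  # the code, inspectable as a data structure

exec(compile(tree, "<ast>", "exec")) # AST -> bytecode -> run (prints 3)
```

The asymmetry with Lisp is that the tree is a separate object model rather than the program's own literal notation, so rewriting it takes explicit node surgery instead of list manipulation.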
I've often wondered why McCarthy has never been asked to Startup school to talk about developing and using Lisp and the advantages?
(Of the two though, Python seems to have more problems with people refusing to speak the native idioms and insisting on writing $LANGUAGE in Python instead. Python Is Not A Functional Language. It is a multiparadigm language where the functional is definitely the foreign and borrowed paradigm on top of an imperative/OO core. Ignoring that will bring you grief, but it won't be Python's fault.)
Later edit: In fact, refusing to speak Python's native idioms has been getting noticeably worse in the last six months. If you want a solid OO language with some decent functional borrowing, learn Python. If you want a pure functional language for whatever reason, do us all a favor and don't learn Python. Or at least don't add to the chorus of people complaining Python isn't Haskell, just go learn Haskell. Or Clojure, or whatever.
Actually there have been three significant developments in the last five years that IMO tilt the scales back over to the Lisp side:
1. Clojure
2. Clozure Common Lisp a.k.a. CCL (a very unfortunate confluence of names -- the fact that Clojure and Clozure differ by only one letter but otherwise bear almost no resemblance to each other causes no end of confusion).
3. The state of Common Lisp libraries has gotten a LOT better in the last five years.
And a lot (perhaps equally LOT) better still in the last couple weeks, with Quicklisp. Not that I've tried it yet :)
I think Python is a surprisingly nice language in many ways. It falls into the category of being an "acceptable Lisp". But I think most Lispers still prefer the real thing and only use Python when they need to be pragmatic.
Even though Ruby offers a bit more for Lispers, I just don't like its overall design as much as Python's. If I don't code Python for a year and have to delve into it again, it takes only a few seconds to get back up to speed. In Ruby, on the other hand, I always forget how the blocks and variables behave and am thrown by the strange predicate syntax, making the return far more unpleasant.
At least these languages (that I know of) have macros:
http://boo.codehaus.org
http://nemerle.org
http://www.perl6.org

Python is a great language, sure, but it's not a Lisp, and it's best appreciated on its own terms. Both because Lisp and Python really aren't very similar, and because people coming to Python with transliterated Lisp idioms are going to be disappointed.
Clojure's data structures seem a lot like Python's but are a bit more elegant and avoid many of the little Python headaches that come up often, like d['k'] vs d.k vs d('k'). In Clojure it would be (d :k) or (:k d), and both work. If you need memoization there are functions to help you. In Clojure there definitely seems to be an attempt to make all the core data structures as compatible as possible (there's even a formal abstraction, ISeq, for all of them).
Culturally, Python seems to care more about a minimal core language. Clojure.core has probably 3-4 times as many built-ins as Python. Many of the clojure functions are supporting features Python doesn't have or handles with syntax, like macros and conditional expressions, but there are also clojure functions like even?, that probably won't ever be a Python built-in.
Especially for predicates, functions like even?, every?, and ffirst come in handy.
When does that come up? d['k'] is a key access (to a collection), d.k is an attribute access (to an object) and d('k') is a method call. I'm not sure where the headache/confusion would be here, unless you're trying to do very weird things with Python (which you should not)
    class months(object):
        def __init__(self):
            m = 'jan feb mar apr may jun jul aug sep oct nov dec'.split(' ')
            n = range(1, 13)
            self.__dict__.update(dict(zip(m, n)))
        def __getitem__(self, key):
            return self.__dict__[key]
        def __call__(self, key):
            return self.__dict__[key]

    d = months()
    print d['jan']  # 1
    print d('jan')  # 1
    print d.jan     # 1

(For the record, this line was supposed to be edited out, but apparently I forgot to actually delete it)
I haven't seen that with Python, Ruby, C++, Java, even Haskell feels 'modern' in that way. Why must it be so hard to get simple stuff going ?
(Note: I like function programming concepts, but I would expect a programming language to be easier to bind into a context of reality.)
Glad you noticed! We've been working hard on this: http://haskell.org/platform
* Why Lisp macros are cool, a Perl perspective (2005) http://news.ycombinator.com/item?id=795344
* Can a language have Lisp's powerful macros without the parentheses? http://news.ycombinator.com/item?id=9172
1. Mathematica: you can use both infix and prefix form; FullForm[a+b] = Plus[a,b]. Mathematica internally uses prefix notation. Evaluation is more flexible than Lisp's: you can define a function and decide whether it evaluates some, all, or none of its arguments.
2. Maxima: a layer over Lisp defining an infix language, in which you can define operators to resemble math notation; for example f(x) := x^2 is similar to (defun f (x) (* x x)).
3. Dylan: a Lisp with infix notation.
4. Willem Broekema's cl-python: Python in Lisp.
5. Clojure: it brings some nice syntax for getters and setters, function arguments, and much more.
6. comp.lang.lisp versus Clojure: Clojure has a great community; comp.lang.lisp has some problems with lords.
7. ABCL: Lisp in Java. ABCL can run Maxima without errors, and that is great.
8. Ruby, JRuby, Ioke, Duby: efforts to achieve a very expressive language.
9. JavaScript, the good parts: JavaScript with some annotations could be the next Lisp.
10. Quicklisp: a better installer than ASDF.
I used to think macros matter more than syntax, because you can freely define your own micro-languages. I didn't really practice it, because of the practical limitations of the available lisps [1]. Now I think the opposite, that syntax matters more. Syntax helps you parse the code visually and you use lower level parts of your cortex to understand the code [2]. You can build arbitrary DSLs in lisps, but they all have no syntax, so they are of limited cognitive help. I think the real win are modern languages with syntax, that is malleable enough to facilitate the cognitive apparatus of the programmer in most cases, or at least most cases that matter. For example, an obvious DSL is the mathematical notation - Python / Ruby handle it well enough with operator overloading, Lisp actually does worse because of the prefix notation.
It is important to understand that you can approximate the bottom-up style of building abstractions with libraries (instead of DSLs), parameterizing the proper things, with minimal syntax noise. The remaining difference between macros and using higher level functions is mostly in run time optimization.
I guess seasoned Lispers learn to "see through" all the brackets and engage the lower-level part of the brain in parsing Lisp code. Ironically, something similar happens to Java developers: after enough hours looking at the code they start to ignore the ugly try/catch clauses that can't be properly abstracted away because of language limitations.
[1] with the exception of one big project in Common Lisp, but I did only a little programming in it, and this was before I fully appreciated macros - but the guy before me used them extensively to build two layers of domain specific languages
[2] L Peter Deutsch talks about this in Coders at Work and this is probably more valuable than what I have to say:
Deutsch: I can tell you why I don’t want to work with Lisp syntax anymore. There are two reasons. Number one, and I alluded to this earlier, is that the older I’ve gotten, the more important it is to me that the density of information per square inch in front of my face is high. The density of information per square inch in infix languages is higher than in Lisp.
Seibel: But almost all languages are, in fact, prefix, except for a small handful of arithmetic operators.
Deutsch: That’s not actually true. In Python, for example, it’s not true for list, tuple, and dictionary construction. That’s done with bracketing. String formatting is done infix.
Seibel: As it is in Common Lisp with FORMAT.
Deutsch: OK, right. But the things that aren’t done infix; the common ones, being loops and conditionals, are not prefix. They’re done by alternating keywords and what it is they apply to. In that respect they are actually more verbose than Lisp. But that brings me to the other half, the other reason why I like Python syntax better, which is that Lisp is lexically pretty monotonous.
Seibel: I think Larry Wall described it as a bowl of oatmeal with fingernail clippings in it.
Deutsch: Well, my description of Perl is something that looks like it came out of the wrong end of a dog. I think Larry Wall has a lot of nerve talking about language design—Perl is an abomination as a language. But let’s not go there. If you look at a piece of Lisp code, in order to extract its meaning there are two things that you have to do that you don’t have to do in a language like Python.
First you have to filter out all those damn parentheses. It’s not intellectual work but your brain does understanding at multiple levels and I think the first thing it does is symbol recognition. So it’s going to recognize all those parenthesis symbols and then you have to filter them out at a higher level.
So you’re making the brain symbol-recognition mechanism do extra work. These days it may be that the arithmetic functions in Lisp are actually spelled with their common names, I mean, you write plus sign and multiply sign and so forth.
Seibel: Yes.
Deutsch: Alright, so the second thing I was going to say you have to do, you don’t actually have to do anymore, which is understanding those things using token recognition rather than symbol recognition, which also happens at a higher level in your brain. Then there’s a third thing, which may seem like a small thing but I don’t think it is. Which is that in an infix world, every operator is next to both of its operands. In a prefix world it isn’t. You have to do more work to see the other operand. You know, these all sound like small things. But to me the biggest one is the density of information per square inch.
Seibel: But the fact that Lisp’s basic syntax, the lexical syntax, is pretty close to the abstract syntax tree of the program does permit the language to support macros. And macros allow you to create syntactic abstraction, which is the best way to compress what you’re looking at.
Deutsch: Yes, it is.
Seibel: In my Lisp book I wrote a chapter about parsing binary files, using ID3 tags in MP3 files as an example. And the nice thing about that is you can use this style of programming where you take the specification—in this case the ID3 spec—put parentheses around it, and then make that be the code you want.
Deutsch: Right.
Seibel: So my description of how to parse an ID3 header is essentially exactly as many tokens as the specification for an ID3 header.
Deutsch: Well, the interesting thing is I did almost exactly the same thing in Python. I had a situation where I had to parse really quite a complex file format. It was one of the more complex music file formats. So in Python I wrote a set of classes that provided both parsing and pretty printing. The correspondence between the class construction and the method name is all done in a common superclass. So this is all done object-oriented; you don’t need a macro facility. It doesn’t look quite as nice as some other way you might do it, but what you get is something that is approximately as readable as the corresponding Lisp macros. There are some things that you can do in a cleaner and more general way in Lisp. I don’t disagree with that. If you look at the code for Ghostscript, Ghostscript is all written in C. But it’s C augmented with hundreds of preprocessor macros. So in effect, in order to write code that’s going to become part of Ghostscript, you have to learn not only C, but you have to learn what amounts to an extended language. So you can do things like that in C; you do them when you have to. It happens in every language. In Python I have my own what amount to little extensions to Python. They’re not syntactic extensions; they’re classes, they’re mixins—many of them are mixins that augment what most people think of as the semantics of the language. You get one set of facilities for doing that in Python, you get a different set in Lisp. Some people like one better, some people like the other better.
Lisp has a 2-stage syntax.
The first stage is the syntax of s-expressions, which is surprisingly complex. S-Expressions provide a textual syntax for data: symbols, lists, pairs, strings, various number formats, arrays, characters, pathnames, ...
The first stage is implemented by the 'reader' and can be reprogrammed by an ancient API to the reader via read tables.
The second stage is the syntax of the Lisp programming language. This is defined on top of s-expressions and is really a syntax over data structures (not text). This Lisp syntax deals with: data items, function calls, special forms (thirty something) and macro forms.
This syntax stage is implemented as part of the interpreter/compiler (EVAL, COMPILE, COMPILE-FILE) and can be extended by writing macros, symbol macros and compiler macros. In earlier dialects it could also be extended by writing so-called FEXPRs, functions which get called with unevaluated source code (-> data in Lisp).
So, we get a lot of complex syntax due to special forms and macros. It just looks a bit different, since the data syntax is always underneath it (unless one uses a different reader).
For example a function definition would be:
(defun foo (a b) (+ (sin a) (sin b)))
The syntax for that is: defun function-name lambda-list [[declaration* | documentation]] form*
With more complex syntax for 'function-name', 'lambda-list' and 'declaration'.

Lambda-list has this syntax:
lambda-list::= (var*
[&optional {var | (var [init-form [supplied-p-parameter]])}*]
[&rest var]
[&key {var | ({var | (keyword-name var)}
[init-form [supplied-p-parameter]])}*
[&allow-other-keys]]
[&aux {var | (var [init-form])}*])
Not every valid Lisp program has an external representation as an s-expression, because it can be constructed internally and can contain objects which can't be read back.

Not every s-expression is a valid Lisp program. Actually most s-expressions are not valid Lisp programs.
For example
(defun foo bar)
is not a valid Lisp program. It violates the syntax above.

Readtables aren't any more ancient than the rest of ANSI INCITS 226-1994 (R2004) (the language standard formerly known as X3.226-1994 (R1999)), but the interface to them is very low-level and non-modular.
The current readtable is specified by a dynamic variable, so the readtable facility can be made modular with a nicer interface, in a portable manner. This is exactly what the library Named-Readtables does: http://common-lisp.net/project/named-readtables/
Now realize the significance of this: Common Lisp is the only language allowing total unrestricted syntactic extension and *modification* in a modular and (somewhat) composable way.
I've been using named-readtables for the past month, and between it and Quicklisp, I haven't been this excited about programming in CL since I started (which is 8 years ago, not that long, but I'm not a total noob either).
I am not particularly familiar with Python either; I should educate myself properly and delve more into it. I only get by with what I need to do in Maya.
I really, really like Lisp. The syntax, as a result of it having been introduced to me fairly early, never was a hangup of any kind. SLIME is quite wonderful. I always yearn for quasiquotation.
However, I cannot justify using it for a small-scale project that has to ship relatively soon with limited resources. The reason: Python has useful projects with good release culture and documentation that are not seen in the Lisp world, if only for the lack of contributors. (Considering the number of authors, I found common lisp documentation to be quite good, in fact, but I still find them pretty hard pressed to compete with, say, the Django manual, or most of the Python standard library itself)
Software — especially at the small, early stage scale — is still loaded with problems that are not matters of extensive, inspired, and unique programming. Or, they could be, but did you really want to write your own whiz-bang form validation instead of building all the other features of your product?
Clojure I think represents a very interesting escape from this trap, and if I had a lot of Java dependencies (whether bequeathed or because of unique library advantages Java may have. WEKA comes to mind for some things) I would most certainly want to use it. But in a project I control from the beginning, Python wins, hands down, in a number of domains.
I will also expose one heretical opinion of mine about Lisp Macros: they're great, but easy to overuse, and seldom So Necessary if one makes good use of higher order functions. Some code, admittedly, comes out nicer using macros, but I can almost always get the expressive power I need with little contortion using higher order functions.
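As one concrete case of that trade-off: the classic with-open-file macro needs no macro when functions are first-class. A sketch (call_with_open_file is a made-up name for illustration; in real Python you'd reach for the with statement):

```python
import os, tempfile

# A with-open-file-style "macro" expressed as a higher-order function.
def call_with_open_file(path, mode, body):
    f = open(path, mode)
    try:
        return body(f)      # the "macro body" is just a function
    finally:
        f.close()           # cleanup runs even if the body raises

# Usage: the lambda plays the role of the macro's body forms.
path = os.path.join(tempfile.gettempdir(), "demo.txt")
call_with_open_file(path, "w", lambda f: f.write("hello"))
text = call_with_open_file(path, "r", lambda f: f.read())
print(text)  # hello
```

What the higher-order function can't do is introduce new binding forms or run at compile time; that's where macros keep their edge.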
If I had the time to follow my heart, I'd want to do something to continue to vitalize Lisp. But not while I need to ship.
To round out this post, here are some minor peeves on the Python side though:
* No symbol data type. I don't like using strings for these. I use SymbolType for this, but it's still comparatively clunky compared to CL.
* Hyphens are the best intra-token-delimiter (see what I mean?)
* No macros
* No quasiquotation
* Slow
* Have to avoid stepping on myself because of the mutable environment frames you see a bit more frequently as opposed to say, Scheme or Erlang where tail recursion is the name of the game. I also make use of "labels" in cl.
* Did I mention tail recursion elimination? No? Well...sometimes this would be nice.