On the other hand, the load performance is quite poor. On the 12x dw2.large hardware, a good clustered analytical database engine should be able to easily load 1.2TB in less than 15 minutes while the database tables are online and being queried. That it took well over an hour, and with a very simple data model at that, would argue against it being good for "real-time" even with SSDs. (This is not a surprising result though; Redshift is just a clustered PostgreSQL variant, which does not have the best internals for real-time.)
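To put numbers on that claim, here is a back-of-the-envelope sketch of the per-node ingest rate implied by the figures above. The 1.2TB size and 12-node count come from the comment; the ~75-minute figure is just my reading of "well over an hour", not a measured value.

```python
# Rough per-node throughput implied by the load-time claims above
# (1.2 TB across a 12-node cluster; figures from the comment).
data_tb = 1.2
nodes = 12

def per_node_mb_per_s(total_tb, node_count, minutes):
    """MB/s each node must sustain to load total_tb in the given time."""
    total_mb = total_tb * 1_000_000  # decimal TB -> MB
    return total_mb / node_count / (minutes * 60)

print(f"15-minute target:   {per_node_mb_per_s(data_tb, nodes, 15):.0f} MB/s per node")
print(f"Observed (~75 min): {per_node_mb_per_s(data_tb, nodes, 75):.0f} MB/s per node")
```

Roughly 111 MB/s per node for the 15-minute target versus about 22 MB/s as observed; the former is well within what a single SSD can sustain sequentially, which is why the slow load points at the engine rather than the storage.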
Source: I helped build a very high-speed network data analytical tool on top of ParAccel (before it was bought by Amazon and rolled into Redshift).
ParAccel, like a large percentage of parallel analytical databases, is forked off the excellent PostgreSQL code base, because those internals were designed to be easy to extend and modify. Netezza, Vertica, EMC/Greenplum, Teradata/Aster, et al. are also PostgreSQL derivatives with varying degrees of divergence. I've designed and built custom parallel derivatives of PostgreSQL for companies too; it is surprisingly straightforward.
There are only a handful of original, high-quality database kernels out there because it is enormously difficult to design one from scratch. Most good databases copy an existing design, or even more conveniently, fork the mature, easily modifiable, BSD-licensed, Stonebraker-designed PostgreSQL kernel. Every basic kernel design has distinctive characteristics that tend to stick with everything derived from it, which leaves an identifiable "fingerprint" on a new database if you know what to look for. You inherit both the strengths and the weaknesses of the underlying kernel design.
(Source: I've designed analytical database engines for a long time.)
I was amazed by how much improvement I saw just by getting an SSD, and by how cheap it was compared to all the other solutions.
http://www.wolframalpha.com/input/?i=%240.25+per+hour+for+a+...
$183 a month.
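That figure works out from the $0.25/hour rate; a quick sketch of the arithmetic, assuming an average month of ~730.5 hours (8766 hours per year / 12):

```python
# Monthly cost for a single node at the quoted on-demand rate.
hourly_rate = 0.25          # $/hour, from the WolframAlpha link above
hours_per_month = 8766 / 12 # average month, ~730.5 hours
monthly_cost = hourly_rate * hours_per_month
print(f"${monthly_cost:.2f} per month")  # ≈ $182.63, i.e. roughly $183
```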
"Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools."
That to me screams "enterprise" and "big data" and all sorts of other silly buzzwords. Your average startup probably doesn't need this, but the target audience may view that $183/month base price tag favorably.