I'd recommend estimating how much you think it will cost, then multiplying that estimate by 10. If that figure is going to be hard to cover, stick with something else. If you're working for a Fortune 500, however, Dynamo is very nice.
Unfortunately you _still_ need to know how you plan to query your data in order to get good performance.
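To make that concrete, here's a minimal sketch of what "query-aware" key design means, using the boto3 low-level `query` parameter shape. The table name, attribute names, and values are all hypothetical; the point is that an efficient read targets the hash key directly instead of scanning the table.

```python
# Hypothetical "events" table: hash key = user_id, range key = timestamp.
# A Query that names the hash key reads only one partition; a Scan
# touches every item in the table and burns read capacity accordingly.
def build_query_params(table_name, user_id, since_iso):
    """Build low-level DynamoDB Query parameters targeting one hash key."""
    return {
        "TableName": table_name,
        "KeyConditionExpression": "user_id = :uid AND #ts >= :since",
        # "timestamp" is a DynamoDB reserved word, so alias it.
        "ExpressionAttributeNames": {"#ts": "timestamp"},
        "ExpressionAttributeValues": {
            ":uid": {"S": user_id},
            ":since": {"S": since_iso},
        },
    }

params = build_query_params("events", "user-123", "2013-01-01T00:00:00Z")
# client.query(**params)  # run with a real boto3 client
```

If a query can't be expressed against the hash key (alone or with a range-key condition), that's the signal to rethink the key schema before the bill arrives.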
This is one of the most common questions/comments I see in the forums.
More info: http://docs.aws.amazon.com/amazondynamodb/latest/developergu...
Also be aware that using Local Secondary Indexes _does_ enforce a 10GB limit on the total size of items sharing the same hash key; see the last section here: http://docs.aws.amazon.com/amazondynamodb/latest/developergu...
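For reference, an LSI is declared at table creation time. Here's a hedged sketch of a `CreateTable` spec with one LSI, using a hypothetical "orders" table; note the LSI must reuse the table's hash key, which is exactly why the 10GB-per-hash-key cap kicks in.

```python
# Hypothetical table with a Local Secondary Index. Once a table has an
# LSI, all items sharing one hash key (base table + every LSI) must fit
# within the 10GB item-collection limit.
create_table_spec = {
    "TableName": "orders",
    "AttributeDefinitions": [
        {"AttributeName": "customer_id", "AttributeType": "S"},
        {"AttributeName": "order_date", "AttributeType": "S"},
        {"AttributeName": "status", "AttributeType": "S"},
    ],
    "KeySchema": [
        {"AttributeName": "customer_id", "KeyType": "HASH"},
        {"AttributeName": "order_date", "KeyType": "RANGE"},
    ],
    "LocalSecondaryIndexes": [
        {
            "IndexName": "status-index",
            # Same hash key as the table; only the range key differs.
            "KeySchema": [
                {"AttributeName": "customer_id", "KeyType": "HASH"},
                {"AttributeName": "status", "KeyType": "RANGE"},
            ],
            "Projection": {"ProjectionType": "KEYS_ONLY"},
        }
    ],
    "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
}
# client.create_table(**create_table_spec)  # with a real boto3 client
```

If you expect any single hash key to accumulate more than 10GB of items, a Global Secondary Index (which has no such limit) is the safer choice.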
Our stack runs on EC2, and DynamoDB read latency is extremely low (1-10ms per read) -- using it as a data store means you may not even need a caching layer (memcached, etc.), since it's nearly as fast, all data is persistent, and you have no practical storage limits.
I'd highly recommend it to anyone.
Also: if you're using Python, the dynamodb-mapper library is a great (stable) tool for working with DynamoDB.