I actually really doubt that Google, Amazon et al. keep proper backups of every client's storage - I've never come across details of, or even a hint at, such a system. What they have instead is enough redundancy and, more importantly, a "never-delete" architecture: data is merely tagged for deletion and sits in that state for a significant amount of time before it's ever physically removed, while various systems check consistency on an ongoing basis.
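To make the "tagged for deletion" idea concrete, here's a minimal sketch of how such a soft-delete scheme could work. This is my own toy illustration, not how any Google or Amazon system is actually built: deletes only write a tombstone, reads treat tombstoned keys as gone, and only a separate purge pass, run after a grace period, frees anything.

```python
import time

class SoftDeleteStore:
    """Toy key-value store illustrating a "never-delete" design:
    delete() only tags a key; purge() removes it after a grace period."""

    def __init__(self, grace_seconds=30 * 24 * 3600):
        self._data = {}        # key -> value
        self._tombstones = {}  # key -> time the delete was requested
        self.grace = grace_seconds

    def put(self, key, value):
        self._data[key] = value
        self._tombstones.pop(key, None)  # a new write cancels a pending delete

    def get(self, key):
        if key in self._tombstones:
            raise KeyError(key)  # hidden from readers, but still recoverable
        return self._data[key]

    def delete(self, key):
        # nothing is removed here; the key is merely tagged
        self._tombstones[key] = time.time()

    def undelete(self, key):
        # the whole point of the grace period: fat-fingered deletes are reversible
        self._tombstones.pop(key, None)

    def purge(self, now=None):
        # only this pass, run long after the delete, actually frees storage
        now = time.time() if now is None else now
        for key, tagged_at in list(self._tombstones.items()):
            if now - tagged_at >= self.grace:
                del self._data[key]
                del self._tombstones[key]
```

Until purge() runs, a "deleted" object can be brought back with undelete() - which is exactly the window that saves you when the delete turns out to have been a mistake.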
Of course, even that doesn't prevent you from fucking up - your datastore will do exactly what you tell it to. Nobody can stop you from doing the equivalent of rm -rf on your S3 store, or from accidentally deleting the only copy of that movie your client's been working on for the last four years, and nothing can protect you from that except a decent backup.