There's some clever wordplay/marketing here... "designed to provide 99.99..99%" means that the theoretical model of the system says you lose 1 in X files per year when everything works as modeled (e.g. "disks fail at the expected rate as independent random variables"). If something outside the model goes wrong (e.g. the power goes out, or there's a bug in S3's code), data can be lost above and beyond this "designed" rate. The actual probability of data loss is therefore much, much higher than the theoretical figure suggests.
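To make the "designed" number concrete, here's a rough back-of-the-envelope sketch. The eleven-nines durability is the figure AWS publicly quotes for S3; the object count is made up for illustration:

```python
# What the "designed durability" number means *under the model's own assumptions*.
durability = 0.99999999999           # 11 nines, per object per year (AWS's quoted S3 figure)
annual_loss_probability = 1 - durability

objects_stored = 10_000_000_000      # hypothetical: 10 billion objects

# Expected objects lost per year IF the model holds (independent disk failures
# at the expected rate). Correlated failures -- a power outage, a software bug --
# are outside the model and not counted here at all.
expected_losses_per_year = objects_stored * annual_loss_probability
print(f"Expected losses per year (model only): {expected_losses_per_year:.2f}")
# -> roughly 0.1 objects per year across 10 billion stored
```

That tiny number only covers the failure modes the model knows about; everything else is on top of it.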
A more comical way to look at it: The percentage is effectively AWS saying "to keep costs low, we plan to lose this many files per year; and when we screw up and things don't go quite to plan, we lose a _lot_ more."