Arbitrary precision is not a valid answer: the exact product of two n-bit mantissas needs up to 2n bits, so every multiplication doubles the mantissa size. Starting from 64 bits of precision, after just 40 multiplications you will have consumed about 8 TB of RAM (64 × 2^40 bits) just to store a single number, at which point you'll ask yourself: how many decimal digits do I really need? Which was the initial question...
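To see the blow-up concretely, here is a minimal sketch using Python's built-in big integers as a stand-in for an exact arbitrary-precision mantissa; the setup is illustrative, not tied to any particular bignum library:

```python
# Minimal sketch: Python's big integers stand in for an exact mantissa.
x = (1 << 64) - 1                 # a 64-bit all-ones mantissa
for step in range(1, 13):         # a few real squarings show the doubling
    x *= x                        # an exact n-bit * n-bit product needs ~2n bits
    print(f"step {step:2d}: {x.bit_length():>8,} bits")

# Extrapolate to 40 multiplications instead of actually allocating the memory:
bits = 64 * 2**40                 # 64 bits doubled 40 times
print(f"after 40 multiplications: ~{bits / 8 / 1e12:.1f} TB for one number")
```

Note that the loop stops well short of 40 on purpose: actually carrying out the later squarings would itself require allocating the multi-terabyte operands, which is exactly the point.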