I've rarely seen non-giants with PBs of data that was properly compressed. For example, converting many small JSON files into larger, compressed Parquet files can use 10-100x less space. I'm not familiar with images, but I see no reason why encoding batches of similar images together shouldn't achieve similar or even better compression ratios.
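For concreteness, a minimal sketch of that JSON-to-Parquet consolidation in Python, assuming pandas with the pyarrow engine; the paths, glob pattern, and zstd codec are illustrative choices, not anything prescribed above:

```python
import glob
import pandas as pd

# Read many small JSON-lines files into one DataFrame.
frames = [pd.read_json(path, lines=True) for path in glob.glob("logs/*.jsonl")]
df = pd.concat(frames, ignore_index=True)

# Write a single columnar, compressed Parquet file. Columnar layout groups
# similar values together, which is where most of the space saving comes from.
df.to_parquet("logs.parquet", compression="zstd", index=False)
```

The win comes from two places: dropping the repeated JSON keys (Parquet stores the schema once) and letting the codec compress long runs of similar column values instead of interleaved records.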
Also, if you decide to move off the platform later, your transfer costs will be lower if you can move the data out in compressed form first.