Good point, it's just an average. And to be fair, I checked the numbers in the article: it's closer to 3.2 TB/day, which works out to roughly 300 Mbit/s. But what I meant is that a home fiber connection can deal with that, although consumer ISPs don't have good bandwidth over all routes (it's good to YouTube/Amazon but ridiculously slow to some other consumer ISPs). If you don't want to serve from home, I'm sure many entities would be happy to donate disk space and bandwidth to help a project like this; you could set up a mirror list like we have for distro repositories.
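The back-of-the-envelope conversion, for anyone who wants to check it (the 3.2 TB/day figure is the one from the article, taken as decimal terabytes):

```python
# Sanity check: convert 3.2 TB/day of egress into an average bitrate.
TB = 1e12  # decimal terabytes
bytes_per_day = 3.2 * TB
bits_per_second = bytes_per_day * 8 / 86_400  # 86,400 seconds in a day
print(f"{bits_per_second / 1e6:.0f} Mbit/s")  # ~296 Mbit/s, i.e. ~300 Mbit/s
```

So yes, a symmetric gigabit fiber line has headroom for the average, though not for peaks.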
Also, we may be taking the problem the wrong way around: do these multi-gig files need to be accessed by everyone from a web browser? No, it's a dump file used by specific people in specific circumstances. So why are we using HTTP for this in the first place? In this case, publishing only over BitTorrent/IPFS makes sense, and many people will happily seed, pushing costs toward 0 for the publisher (and very close to 0 if you only push to a first circle of mirrors who can then push to more mirrors, some of which can be declared webseeds in your torrent).
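For the webseed part: that's just the BEP 19 "url-list" key in the torrent metadata, so any plain HTTP mirror doubles as a permanent seed. A minimal sketch of what that file looks like (all URLs and sizes here are placeholders, and the piece hashes are left empty; real tools like mktorrent fill those in):

```python
# Sketch of a single-file .torrent carrying webseed URLs (BEP 19 "url-list").
# Everything below is illustrative: tracker, mirror URLs, file size.

def bencode(x):
    """Tiny bencoder covering the types a .torrent file needs."""
    if isinstance(x, int):
        return b"i%de" % x
    if isinstance(x, bytes):
        return b"%d:%s" % (len(x), x)
    if isinstance(x, str):
        return bencode(x.encode())
    if isinstance(x, list):
        return b"l" + b"".join(bencode(i) for i in x) + b"e"
    if isinstance(x, dict):
        # Bencoded dict keys are byte strings in sorted order.
        items = sorted((k.encode() if isinstance(k, str) else k, v)
                       for k, v in x.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(f"cannot bencode {type(x)}")

torrent = {
    "announce": "http://tracker.example.org/announce",  # placeholder tracker
    "url-list": [                                       # webseeds: plain HTTP mirrors
        "https://mirror1.example.org/dump.tar",
        "https://mirror2.example.org/dump.tar",
    ],
    "info": {
        "name": "dump.tar",
        "length": 3_200_000_000_000,        # ~3.2 TB, illustrative
        "piece length": 16 * 1024 * 1024,   # 16 MiB pieces
        "pieces": b"",                      # SHA-1 piece hashes go here
    },
}

data = bencode(torrent)  # bytes you would write to dump.tar.torrent
```

Clients that support BEP 19 (most do) fall back to the mirrors over plain HTTP range requests when few peers are seeding, so the swarm never dies even if nobody stays online.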