I'm just investigating why the nightly backup of the work server is taking so long. Turns out Python (via conda, Anaconda, Miniconda, etc.) has dumped 22 million files across the home directories, and it takes a while just to list them all, let alone work out which files have changed and need archiving. Most of these are duplicates of one another, or files that should really belong to the OS, like bin/curl.
I myself have installed one single package, and it installed 196,171 files in my home directory.
If that isn't gratuitous bloat, then I don't know what is.
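For anyone curious where numbers like these come from, a minimal sketch of the kind of recursive walk involved (the demo tree and filenames below are made up for illustration; a real target would be something like a ~/miniconda3 directory):

```python
import os
import tempfile

def count_files(root: str) -> int:
    """Recursively count regular files under root -- the same walk a
    backup tool has to do before it can even compare timestamps."""
    total = 0
    for _dirpath, _dirnames, filenames in os.walk(root):
        total += len(filenames)
    return total

# Throwaway demo tree standing in for a conda env directory.
demo = tempfile.mkdtemp()
pkg_dir = os.path.join(demo, "pkgs", "curl")
os.makedirs(pkg_dir)
for name in ("curl", "libcurl.so", "AUTHORS"):
    open(os.path.join(pkg_dir, name), "w").close()

print(count_files(demo))  # 3
```

Run that over a home directory full of conda environments and you see why the backup never finishes: the walk alone touches every one of those 22 million inodes.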