This is a very big assumption.
Dynamic linkers are also clever enough to only mmap the required parts of a dynamic library.
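On Linux you can see this from /proc: each shared object shows up as a handful of file-backed mappings with different permissions (code r-xp, read-only data r--p, writable data rw-p), and the kernel demand-pages them in as they're touched. A quick Linux-specific illustration, inspecting the grep process's own address space:

```shell
# Each mapped file (the binary itself, libc, the loader) appears as
# several segments; only pages that are actually accessed get faulted in.
# -m8 caps the output at the first eight matching lines.
grep -m8 -E 'r-xp|rw-p' /proc/self/maps
```

The exact paths and addresses will differ per system, but a dynamically linked binary will typically show separate mappings for its own code, libc, and the dynamic loader.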
Don't get me wrong, there are places where I think static linking is ideal. I wish more distributors of binary-only software would statically link, or at least bundle the dynamic libraries they require, rather than rely on system dynamic libraries.
I wish them luck with their experiment and hope they can improve static linking, but I suspect that the more they come into contact with real-world libraries, the more they'll learn about why dynamic loading "wastes" so many resources.
It brings in whole object files from the archive, all at once. E.g. if you used sincos() from math.a, and sincos() lived in the same object file as 47 other math functions, then you'd get all 48 math functions in your static binary just from using sincos().
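To make the granularity concrete: a classic Unix linker extracts archive members one object file at a time, pulling in every symbol an extracted member contains. A minimal sketch (assumes gcc and binutils; the file and symbol names are invented for illustration):

```shell
# Two archive members: a.o holds both used() and an unrelated sibling,
# b.o holds a function nothing references.
cat > a.c <<'EOF'
int used(void)           { return 1; }
int unused_sibling(void) { return 2; }   /* same .o as used() */
EOF
cat > b.c <<'EOF'
int never_referenced(void) { return 3; } /* its own .o */
EOF
cat > main.c <<'EOF'
int used(void);
int main(void) { return used(); }
EOF
gcc -c a.c b.c                  # two object files
ar rcs libdemo.a a.o b.o        # archive with two members
gcc -o demo main.c libdemo.a    # only a.o is extracted to resolve used()
nm demo | grep unused_sibling                                   # came along anyway
nm demo | grep never_referenced || echo "never_referenced not linked"
```

unused_sibling() ends up in the binary purely because it shares an object file with used(), while never_referenced() stays out because nothing forced b.o to be extracted.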
Someone correct me if I'm wrong, but I believe it's only with good whole-program optimization at link time that it's possible to truly prove a function is unneeded and exclude it (and then re-link if needed to re-resolve symbols to their new addresses in virtual memory).
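Short of full LTO, the common workaround is to give the linker section-level granularity: compile with -ffunction-sections so each function gets its own ELF section, then link with -Wl,--gc-sections so unreferenced sections can be discarded. A hedged sketch (assumes gcc and binutils; names are invented):

```shell
# One archive member containing both a live and a dead function.
cat > lib.c <<'EOF'
int used(void) { return 1; }
int dead(void) { return 2; }
EOF
cat > main.c <<'EOF'
int used(void);
int main(void) { return used(); }
EOF
gcc -ffunction-sections -c lib.c      # one ELF section per function
ar rcs libdemo.a lib.o
gcc -Wl,--gc-sections -o demo main.c libdemo.a
nm demo | grep ' used$'                               # referenced: kept
nm demo | grep ' dead$' || echo "dead() was discarded"
```

This doesn't prove anything the way whole-program optimization can (it only follows the reference graph from the entry point), but it does drop functions that are demonstrably unreachable at link time.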
You are probably talking about demand paging (which happens with statically linked binaries too).