Unfortunately the benchmarks I ran to get that knowledge aren't public. They were run on a ~1TB database of production data at a mid-size web company, and spanned thousands of different queries. The question we were trying to answer was "is our business better off with out-of-date statistics or with fresh statistics?", and our conclusion was that fresh statistics were too risky: they can cause a query plan to change unexpectedly in the middle of the night, and the whole database can fall over from resource exhaustion when some common query suddenly takes hours to complete. The company's most costly downtime had already been caused by exactly that once.
We instead generated statistics on a backup copy, load-tested to confirm the new statistics performed acceptably, and then wrote those statistics to the production database. We did that every 3 months, because we found that outdated statistics didn't really have an appreciable performance impact, and the degradation that did occur was gradual rather than a cliff.
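
For readers who want a concrete picture of that workflow, here is a minimal sketch. The comment above doesn't name the DBMS; this example assumes Oracle, whose DBMS_STATS package has documented support for exporting optimizer statistics into a staging table and importing them on another database, which maps onto "gather on a copy, validate, then push to production". The connection details, the APP schema, the STAT_STAGE table name, and the two helper functions are illustrative placeholders, not the setup actually used.

```python
# Illustrative sketch only: Oracle assumed; schema, table, and DSN names are made up.
import oracledb


def refresh_stats_on_staging(dsn: str, user: str, password: str) -> None:
    """Gather fresh optimizer stats on the backup copy and export them to a staging table."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            # Gather fresh statistics on the staging copy, never directly on production.
            cur.execute(
                "BEGIN DBMS_STATS.GATHER_SCHEMA_STATS(ownname => :own); END;",
                own="APP",
            )
            # Export the freshly gathered stats into a user table (STAT_STAGE) so they
            # can be load-tested and then copied across to production.
            # (In a real run, drop or empty STAT_STAGE first if it already exists.)
            cur.execute(
                """BEGIN
                     DBMS_STATS.CREATE_STAT_TABLE(ownname => :own, stattab => 'STAT_STAGE');
                     DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => :own, stattab => 'STAT_STAGE');
                   END;""",
                own="APP",
            )
        conn.commit()


def apply_stats_to_production(dsn: str, user: str, password: str) -> None:
    """Import the validated statistics on production.

    Assumes STAT_STAGE has already been copied over (e.g. via Data Pump or a DB link)
    and that the load tests against the staging copy passed.
    """
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "BEGIN DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => :own, stattab => 'STAT_STAGE'); END;",
                own="APP",
            )
        conn.commit()
```

The point of importing precomputed statistics instead of letting an automatic statistics job run on production is the one made above: the plans production will pick have already been exercised under load before they can take effect.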