Why does it matter? It doesn't idle that high; it only goes that high if you're using it flat out, in which case the extra power usage is justified because it's giving that much more performance over a 100 W TDP CPU. Now I totally get it if you don't want to go Threadripper just for ECC because it's more expensive, but max power draw, which you don't even have to use? I've never seen anyone shop a desktop CPU by TDP rather than by performance and price.
Oh oh, me! Back in the day I bought a 65W CPU for a system that could handle a 90W one. I wanted quiet and figured that would keep fan noise down at a modest performance penalty. It should also last longer, being the same design but running cooler. I ran that from 2005 until a few years ago (it still runs fine but is in storage).
Planning to continue this strategy. I suspect it's common among SFF enthusiasts.
IMO, shopping by performance/watt makes sense. Shopping by TDP doesn't. (Especially since there's no comparing AMD and Intel TDP numbers, as they're defined differently; neither is the maximum the processor can draw, and Intel significantly exceeds its specified TDP on normal workloads.)
In a sandwich style case you're usually limited to low profile coolers like Noctua L9i/L9a since vertical height is pretty limited.
If you want a 45W TDP from the 3700X, you can just pop into Ryzen Master and ask for a 45W TDP. Boom, you're running in that envelope.
I think shopping based on TDP is not the best, because it's not comparable between manufacturers and because it's something you can effectively "choose".
As a petty "take that", I dropped the max frequency from 2.0 GHz to 1.0 GHz. I ran a couple of benchmarks to prove the cap was working, then just kept it at 1.0 GHz for a few months to prove my point.
It made a bigger difference on my ARM SBC, where I tried capping the 1,000 MHz chip to 200 or 400 MHz. That chip was already CPU-bound for many tasks and could barely even run Firefox. Amdahl's Law kicked in: halving the frequency made _everything_ twice as slow, because almost everything was waiting on the CPU.
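To put rough numbers on that intuition, here's a toy Amdahl-style model. The fractions are illustrative assumptions, not measurements from the actual SBC or desktop:

```python
# Toy model of Amdahl's Law applied to frequency capping: total runtime
# is the CPU-bound share stretched by the frequency cut, plus the share
# that waits on I/O or memory and doesn't care about CPU speed.

def runtime(cpu_fraction, freq_scale, base=1.0):
    """Relative runtime after capping frequency; cpu_fraction is the
    share of time spent waiting on the CPU at full speed."""
    cpu_part = base * cpu_fraction / freq_scale
    other_part = base * (1.0 - cpu_fraction)
    return cpu_part + other_part

# Desktop-ish workload, say 40% CPU-bound (assumed number): halving the
# frequency costs 40% more time, not 100%.
print(runtime(cpu_fraction=0.4, freq_scale=0.5))   # 1.4

# The SBC case, ~95% CPU-bound: halving frequency nearly doubles runtime.
print(runtime(cpu_fraction=0.95, freq_scale=0.5))  # 1.95
```

Which is the whole point: how much a frequency cap hurts depends almost entirely on how CPU-bound your workload already is.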
And the relationship between power and performance isn't linear as processor voltages climb trying to squeeze out the last bit of performance.
So if you want to take a 105W CPU and ask it to operate in a 65W envelope, you're not giving up even a third of peak performance, and far less of typical performance.
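A quick sketch of why. If you assume dynamic power goes roughly as f*V^2 and voltage scales roughly linearly with frequency near the top of the curve, power goes roughly as f^3 there, so a big power cut costs a much smaller frequency cut. The cubic model and the numbers are illustrative assumptions, not measured data for any specific chip:

```python
# Rough model: near peak, dynamic power P ~ f * V^2 with V ~ f,
# so P ~ f^3. Invert that to find the sustainable frequency at a
# given power budget. Illustrative only.

def freq_at_power(p_budget, p_max=105.0, f_max=1.0):
    # Cube-root relationship: frequency fraction at a power fraction.
    return f_max * (p_budget / p_max) ** (1.0 / 3.0)

f_65 = freq_at_power(65.0)
print(f"65 W budget: {f_65:.0%} of peak frequency")  # ~85%
print(f"power saved: {1 - 65 / 105:.0%}")            # ~38%
```

So under this (admittedly crude) model, a ~38% power cut costs only ~15% of peak frequency, which matches the "not even a third" claim above.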
My passively cooled desktop is also running a slightly throttled-down 65W CPU.
So yes, there are people who choose their hardware by TDP.
These days, a quiet PWM fan with good thermal paste (and maybe some Linux CPU throttling) more than meets my needs for a "silent" PC 99% of the time.
I would love to be told my above assumptions are wrong if they are.
The worst bit is, AMD and Intel define TDP differently-- neither is the maximum power the processor can draw-- though Intel is far more optimistic.
Get a huge cooler like the Noctua D14, and your PC becomes silent. It lasts forever and requires no maintenance; a good investment.
If you are adventurous, watercooling is even better, but it's a can of worms I decided I'd rather live without: the possibility of leaks and the cost make it harder to justify.
That's me. When I start to plan for a new system, I select the processor first and read its thermal design guidelines (Intel used to have nice load vs. max temp graphs in their docs) and select every component around it for sustained max load.
This results in a quieter system at idle and peace of mind when loading it for extended durations.
You can passively cool Threadrippers if you underclock them enough and have good ventilation in the case.
In my case, loading means maxing out all cores, and an extended period of time can be anything from five minutes to hours.
Both are optimistic lies, but if you look at the documents, it looks like AMD currently needs more cooling, while actually dissipating less power in most cases and definitely having higher performance/watt.
Performance/watt metrics and idle consumption would have been a far better way to make this choice.
If you have a choice between A) something that can dissipate 65W peak for 100 units of performance, but would dissipate 4W average under your workload, and B) something that can dissipate 45W peak for 60 units of performance, but would dissipate 4.5W under your workload... I'm not sure why you'd ever pick B.
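The A-vs-B comparison above in numbers, using the figures from the comment (peak TDP turns out to be the wrong axis on both counts):

```python
# Compare the two hypothetical parts from the comment: performance per
# peak watt, and average draw under the actual workload.

options = {
    "A (65 W peak)": {"peak_w": 65, "perf": 100, "avg_w": 4.0},
    "B (45 W peak)": {"peak_w": 45, "perf": 60,  "avg_w": 4.5},
}

for name, o in options.items():
    print(f"{name}: {o['perf'] / o['peak_w']:.2f} perf/peak-W, "
          f"{o['avg_w']:.1f} W average under the workload")
# A: 1.54 perf/peak-W at 4.0 W average
# B: 1.33 perf/peak-W at 4.5 W average
```

A wins on performance per peak watt *and* on average draw, despite the bigger TDP number on the box.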
Also, even if the CPU draws less, can the power supply still waste more just because it's beefy? Compare with a sports car: great performance, but also more gas in ordinary traffic. Does that comparison hold for a computer?