The C# async runtime is mixed-mode: the thread pool tries to optimize the thread count so that all tasks can advance fairly-ish. That means spawning more worker threads than there are physical cores and relying on the operating system's thread preemption to shuffle work between them.
That's why synchronously blocking a thread is not a complete loss of throughput. It used to be worse, but starting with .NET 6 the thread pool was rewritten in C# and can now actively detect blocked threads and inject additional workers to compensate.
Additionally, another commenter above mistakenly called Rust "bare metal", which it is not: for async work it is usually paired with tokio or async-std, which by default (this is configurable) spawn one worker thread per CPU hardware thread and actively manage those threads too.
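To make the 1:1 model concrete, here's a std-only sketch of what "one worker thread per hardware thread pulling small tasks off a shared queue" looks like. This is not tokio's actual implementation (tokio uses per-worker queues with work stealing, and its tasks are futures, not closures); `run_on_worker_pool` is a made-up name for illustration.

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Hypothetical sketch of the 1:1 worker model: one OS thread per
// hardware thread, all draining a shared queue of small tasks.
fn run_on_worker_pool(tasks: Vec<Box<dyn FnOnce() -> i32 + Send>>) -> i32 {
    // One worker per hardware thread, like tokio's default.
    let n_workers = thread::available_parallelism().map(|n| n.get()).unwrap_or(1);

    let (task_tx, task_rx) = mpsc::channel::<Box<dyn FnOnce() -> i32 + Send>>();
    let task_rx = Arc::new(Mutex::new(task_rx)); // shared task queue
    let (res_tx, res_rx) = mpsc::channel::<i32>();

    for t in tasks {
        task_tx.send(t).unwrap();
    }
    drop(task_tx); // close the queue so workers stop once it drains

    let handles: Vec<_> = (0..n_workers)
        .map(|_| {
            let rx = Arc::clone(&task_rx);
            let tx = res_tx.clone();
            thread::spawn(move || loop {
                let next = {
                    let guard = rx.lock().unwrap();
                    guard.recv()
                }; // lock released here, before the task runs
                match next {
                    Ok(task) => tx.send(task()).unwrap(),
                    Err(_) => break, // queue empty and closed
                }
            })
        })
        .collect();
    drop(res_tx); // so the results iterator below terminates

    let total: i32 = res_rx.iter().sum();
    for h in handles {
        h.join().unwrap();
    }
    total
}

fn main() {
    // Ten tiny "tasks"; in a real async runtime these would be futures.
    let tasks: Vec<Box<dyn FnOnce() -> i32 + Send>> = (1..=10)
        .map(|i| Box::new(move || i) as Box<dyn FnOnce() -> i32 + Send>)
        .collect();
    println!("sum = {}", run_on_worker_pool(tasks)); // prints "sum = 55"
}
```

The point of the fixed 1:1 count is that the scheduler never oversubscribes the CPU, which is exactly why a task that blocks its worker thread is so costly there compared to the .NET model described above.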
p.s.: the goal of cooperative multitasking is precisely to alleviate the issues that come with the preemptive kind. I think Java's Project Loom approach is a mistake: it would have made sense 10 years ago, but not today, with every modern language adopting async/await semantics.