It's going to keep getting worse until a) developers and project managers realize that doing inherently unsafe things is bad and b) they have the resources to apply the additional, ongoing scrutiny those things require. I'm not hopeful this will happen industry-wide, though I know it _is_ happening within individual companies and projects.
I'm sure we'll mitigate the damage to some extent by making package managers smarter and implementing finer-grained permissions. That will improve the situation over time, but it also takes us in the wrong direction by allowing us to forget that when we're shipping dependencies, we ultimately own their behavior.
And I'm not really arguing against vetting your dependencies or improving dependency management. I'm just saying that in the real world, if I made this particular imperfection in software development practices my hill to die on at work, there's a 99% chance it would be bad for me and my career. So my options are: swim with the tide, knowing we're doing things imperfectly, or fight an uphill battle for a more perfect world, knowing that unless we dodge some major vulnerability every other JavaScript developer falls victim to, there will be many eyes in my office wondering whether my extra caution is really worth the company's investment. If I keep my job at all.
I want to write great software, but to do that, I need to actually have a job writing software. And until I get a job at Google or Facebook or Amazon (none of those being places I've ever actually applied to) I am generally working in conditions without the resources to do the kind of dependency vetting we're talking about in this thread.
You could also treat supply chain attacks on software dependencies as just another IT security risk your company is exposed to (like virus infection, ransomware attacks, phishing, etc.) and run them through the same thought processes (and, where appropriate, the same formal processes) used to manage those risks. The company can then make a conscious decision about whether it's worth investing in mitigating, eliminating, or simply accepting the risk.
There's lots of information out there on dealing with cyber security risks, e.g. https://www.ncsc.gov.uk/collection/risk-management-collectio....
(Apologies if this is all obvious, I'm just trying to highlight an alternative approach which might help you deal with the dilemma and not have to "solve" it all by yourself)
Do you actually get to do this where you work? Honestly, it would be great to have the luxury of that kind of patience and time to invest in my work, but in my experience it's unrealistic almost everywhere.
This is not at all a question of what the ideal or perfect scenario would be. It's a question of what's pragmatic and politically achievable in most work environments.