That said, this is an important question. We, particularly those of us who work on critical infrastructure or software, should be asking ourselves this regularly to help prevent this type of thing.
Note that it's also easy (and similarly catastrophic) to swing too far the other way and approach all unknowns with automatic paranoia. We live in a world where we have to trust strangers every day, and if we lose that option completely then our civilization grinds to a halt.
But vigilance is warranted. I applaud these engineers who followed their instincts and dug into this. They all did us a huge service!
EDIT: wording, spelling
The current situation is ridiculous - if I pull in a compression library from npm, cargo or Python, why can that package interact with my network, make syscalls (as me) and read and write files on my computer? Leftpad shouldn’t be able to install crypto ransomware on my computer.
To solve that, package managers should include capability based security. I want to say “use this package from cargo, but refuse to compile or link into my binary any function which makes any syscall except for read and write. No open - if I want to compress or decompress a file, I’ll open the file myself and pass it in.” No messing with my filesystem. No network access. No raw asm, no trusted build scripts and no exec. What I allow is all you get.
The capability should be transitive. All dependencies of the package should be brought in under the same restriction.
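No such feature exists in Cargo today, but the manifest syntax for such a transitive, deny-by-default grant might look something like this (purely hypothetical keys; `flate2` is just a stand-in example crate):

```toml
# Hypothetical Cargo.toml syntax -- Cargo has no capability system today.
[dependencies.flate2]
version = "1"
# Only these syscalls may appear in compiled/linked code:
capabilities = ["syscall:read", "syscall:write"]
# The same restriction would apply to all of flate2's own dependencies:
transitive = true
```

The build would then fail if the crate, or anything it pulls in, tried to open files, touch the network, embed raw asm, or exec.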
In dynamic languages like (server side) JavaScript, I think this would have to be handled at runtime. We could add a capability parameter to all functions which issue syscalls (or do anything else that’s security sensitive). When the program starts, it gets an “everything” capability. That capability can be cloned and reduced to just the capabilities needed. (Think of OpenBSD’s pledge.) If I want to talk to redis using a 3rd party library, I pass the redis package a capability which only allows it to open network connections. And only to this specific host on this specific port.
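A minimal sketch of that pledge-style narrowing in plain JavaScript (all names here are hypothetical, not any real API):

```javascript
// Hypothetical capability object. A capability can be restricted into a
// narrower clone, but never widened -- like pledge, privileges only drop.
class Capability {
  constructor(allowed) {
    this.allowed = new Set(allowed); // e.g. 'net', 'fs-read', 'fs-write', 'exec'
    this.hosts = null;               // optional network allow-list: 'host:port'
  }

  restrict({ allow, hosts = null }) {
    // Keep only privileges the parent capability already holds.
    const kept = allow.filter((a) => this.allowed.has(a));
    const cap = new Capability(kept);
    cap.hosts = hosts;
    return cap;
  }

  check(action, host) {
    if (!this.allowed.has(action))
      throw new Error(`capability denied: ${action}`);
    if (action === 'net' && this.hosts && !this.hosts.includes(host))
      throw new Error(`capability denied: connect to ${host}`);
  }
}

// Program start: the "everything" capability.
const root = new Capability(['net', 'fs-read', 'fs-write', 'exec']);

// Hand a (hypothetical) redis client only what it needs:
// network access, and only to one specific host and port.
const redisCap = root.restrict({ allow: ['net'], hosts: ['redis.internal:6379'] });

redisCap.check('net', 'redis.internal:6379'); // allowed
// redisCap.check('fs-write');                // would throw
// redisCap.check('net', 'evil.example.com'); // would throw
```

Every syscall-issuing function in the runtime would then demand such a capability argument, so a library can never do more than its caller explicitly granted.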
It wouldn’t stop all security problems. It might not even stop this one. But it would dramatically reduce the attack surface of badly behaving libraries.
It hijacks a process that has network access at runtime, not at build time.
The build-time hack grabs files from the repo and inspects build parameters (which looks benign on its own; everyone checks whether you are running on platform X, etc.)
So the problem is the IFUNC mechanism, which has valid uses but can EASILY be misused for all sorts of attacks.
We should also be asking ourselves if we are working on critical infrastructure. Lasse Collin probably did not consider liblzma being loaded by sshd when vetting the new maintainer. Did the xz project ever agree to this responsibility?
We should also be asking ourselves whether each dependency of critical infrastructure is worth the risk. sshd linking libsystemd just to write a few bytes into an open fd is absurd. libsystemd pulling in liblzma because hey, it also does compressed logging, is absurd. Yet this kind of absurd dependency bloat is everywhere.
Enough to be cautious, enough to think about how to catch bad actors, not so much as to close yourself off and become a paranoid hermit.
North America is only about 5% of the world's population. [1] (We can assume that malicious actors are in North America, too, but this helps to adjust our perspective.)
The percentage of maliciousness on the Internet is much higher.
[1] See continental subregions: https://en.wikipedia.org/wiki/List_of_continents_and_contine...
I've been evil, wonderful, and indifferent at different stages of my life.
I have known those who have done similar for money, fame, and boredom.
I think, given a backstory, incentive, opportunity, and resources, it would be possible for most people to flip from "wouldn't" to "enlisted".
Leverage has been shown to be the biggest lever when it comes to compliance.
I guess every three-letter agency has at least one. You can do the math. They haven't learned anything since SolarWinds.