Robust systems are designed to avoid single points of failure. Humans are fallible, so the design assumes any one person will eventually make a mistake. For example, both the pilot and the air traffic controller are expected to be paying attention, so that if one of them makes a mistake the other can catch it. If the pilot is drifting into an error, the air traffic controller gets on the radio to warn them they're getting too close to another aircraft, in time for them to correct course.
If air traffic control is understaffed, the warning the pilot gets might now come a minute later than it otherwise would, and arrive too late. At that point you no longer have a robust system, and it's only a matter of time before one of the pilot errors the system was designed to catch in time results in a collision instead.
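The redundancy argument above can be sketched numerically. This is a minimal illustration with made-up probabilities, not real aviation statistics: if two independent checks each miss a hazard at some rate, the system fails only when both miss, but if one check is effectively disabled (e.g. a warning that always arrives too late), the failure rate reverts to that of the remaining check alone.

```python
def combined_miss_rate(p_pilot: float, p_controller: float) -> float:
    """Probability that BOTH independent checks miss the same hazard.

    Assumes the two misses are independent events, which is the whole
    point of designing out single points of failure.
    """
    return p_pilot * p_controller

# Illustrative values only, not real aviation statistics:
p_pilot = 0.01        # pilot misses 1 in 100 hazards
p_controller = 0.01   # controller misses 1 in 100 hazards

# With both checks working: 1 in 10,000.
print(f"{combined_miss_rate(p_pilot, p_controller):.4f}")  # 0.0001

# If the controller's warning always arrives too late, that check
# contributes nothing (misses 100% of hazards), and the system's
# failure rate is back to the pilot's alone: 1 in 100.
print(f"{combined_miss_rate(p_pilot, 1.0):.4f}")  # 0.0100
```

The multiplication only holds while the checks are genuinely independent and timely; degrade one of them and the benefit of redundancy quietly disappears.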