I would argue that it's more noticeable in those older games that didn't use lag compensation, where you had to lead your shots to hit other players. If you're testing on a game with rollback netcode, lag matters less because the game is literally hiding it from you.
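To make "leading your shots" concrete: without lag compensation, the shooter sees the target where it was one ping ago, and the projectile still needs time to travel, so you aim ahead by the distance the target covers in that combined time. A minimal sketch with made-up numbers (all values here are hypothetical, just for illustration):

```python
def lead_offset(target_speed, projectile_speed, distance, latency_s):
    """How far ahead of the target (in meters) to aim on a server
    with no lag compensation. The target's rendered position is
    latency_s old, and the projectile takes distance/projectile_speed
    seconds to arrive; the target keeps moving during both."""
    travel_time = distance / projectile_speed
    return target_speed * (latency_s + travel_time)

# A target strafing at 5 m/s, 30 m away, projectile at 300 m/s, 100 ms ping:
print(lead_offset(5.0, 300.0, 30.0, 0.100))  # ~1.0 m of lead
```

With rollback or lag compensation, the server rewinds to what the shooter saw, so the latency term effectively drops out of the player's aiming problem.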
What task is actually being measured matters, too. While it's true that humans generally can't react faster than 100 ms or so, most skills tested by competitive gameplay are not pure reaction tests. They are usually a telegraphed stimulus (an approaching player, an oncoming platform, etc.) followed by an anticipated response. Humans are extremely sensitive to latency precisely because they need to time responses to those stimuli - not because they score really well in snap reaction tests.
Concrete example: the window to L-cancel in Melee is really small - far smaller than any player could hit if this were purely a matter of reaction time. And indeed, no player hits that window by reacting, because reacting that fast is humanly impossible. They don't see their character hit the ground and then press L. Instead, they press L several frames in advance, so that by the time their finger actuates the trigger, their character has just hit the ground and the input lands inside the window. Now, if I add two frames of total lag to the display chain, all of their anticipated inputs land too late and they have to retrain for that particular display.
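The frame arithmetic can be sketched directly. The numbers below are hypothetical (a 2-frame window ending on the landing frame, a player trained to press 1 frame early); the point is only the mechanism: display lag shifts every anticipated press later in game time by the same number of frames.

```python
FPS = 60  # Melee runs at 60 fps, so one frame is ~16.7 ms

# Hypothetical values for illustration: a 2-frame input window
# ending on the landing frame.
landing_frame = 100
window = range(landing_frame - 1, landing_frame + 1)  # frames 99 and 100

def actual_press_frame(trained_frame, display_lag_frames):
    # Each extra frame of display lag delays everything the player
    # sees, so a press timed off the displayed image lands that many
    # frames later in game time.
    return trained_frame + display_lag_frames

trained = landing_frame - 1  # player trained to press 1 frame before landing

print(actual_press_frame(trained, 0) in window)  # True: trained timing hits
print(actual_press_frame(trained, 2) in window)  # False: +2 frames of lag, too late
```

Nothing about the player's reaction speed changed between the two cases - only the display chain did, which is exactly why the timing has to be retrained per setup.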