Would it really be 0 though? Assuming 60Hz, the bottom of each frame is scanned out ~16.7ms after the top. Assuming that Rock Band 3 renders an entire frame before displaying it (definitely true) and that it actually renders at 60FPS as opposed to rendering at a higher frame rate and tearing (definitely true on console, might not be true on an emulator?), the video latency will range from 0ms at the top of the frame to ~16.7ms at the bottom, for an average of ~8.3ms.
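A rough sketch of that math (my own numbers; `scanout_latency_ms` is a hypothetical helper, assuming a plain top-to-bottom raster scan and no extra buffering in the display):

```python
REFRESH_HZ = 60

def scanout_latency_ms(row, total_rows, refresh_hz=REFRESH_HZ):
    """Extra latency for a given scanline, assuming the full frame is
    rendered before scanout starts and lines scan top to bottom."""
    return (row / total_rows) * (1000 / refresh_hz)

rows = 1080
top = scanout_latency_ms(0, rows)        # 0.0 ms at the top of the frame
bottom = scanout_latency_ms(rows, rows)  # ~16.7 ms at the bottom
print(f"top={top:.1f}ms bottom={bottom:.1f}ms avg={(top + bottom) / 2:.1f}ms")
# top=0.0ms bottom=16.7ms avg=8.3ms
```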
Admittedly, I don't know what Rock Band 3's calibration numbers actually measure, e.g. whether they already account for factors like this.
If you can manage to render at a high frame rate, you could reduce latency by tearing, but at that point I feel like you're leaving a lot on the table by not using a 240Hz OLED or something, which can show the entirety of every frame (scanning it out in ~4.2ms).
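To put rough numbers on that trade-off (my own back-of-the-envelope model, assuming tears land evenly across the frame):

```python
def avg_scanout_latency_ms(render_fps, refresh_hz):
    """Average pixel age at scanout, assuming rendering tears evenly into
    (render_fps / refresh_hz) slices per refresh."""
    assert render_fps % refresh_hz == 0, "model assumes an integer slice count"
    # Each slice shows content at most 1000/render_fps ms old, uniformly
    # distributed top to bottom within the slice, so the average is half that.
    return 1000 / render_fps / 2

print(avg_scanout_latency_ms(60, 60))    # 60 FPS, whole frames on 60Hz: ~8.3 ms
print(avg_scanout_latency_ms(240, 60))   # 240 FPS torn onto 60Hz: ~2.1 ms
print(avg_scanout_latency_ms(240, 240))  # 240 FPS on a 240Hz panel: ~2.1 ms
```

Under this model, 240 FPS torn onto a 60Hz panel and 240 FPS on a 240Hz panel average out the same; the 240Hz panel just gets there while showing complete frames, with no tear lines.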
Supposedly OLED has response times comparable to CRT anyway. The article says that OLED gaming monitors are unattainable, but it's five years old and things have changed.