It's easy to think, "Why don't they use [New Hotness Software]?" Which on the surface seems like a good idea. Until you absolutely need sub-millisecond-precision I/O, then you kinda start to cry when you realize how hard precise timing in computers is.
If you use lab equipment under, say, Linux, BSD, OS X, or Windows, you're using a time-shared OS, not a real-time one. So your I/O events aren't reported when they happen, but when the scheduler is damn well ready to let you know they happened.
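You can see this for yourself on any general-purpose OS. A rough sketch (the function name and parameters are just illustrative): ask for a fixed 1 ms sleep in a loop and record how late each wake-up actually is. The overshoot is the scheduling latency I'm talking about.

```python
import time

def measure_jitter(period_s=0.001, samples=200):
    """Request a fixed sleep and record how late each wake-up is.

    On a time-shared OS the scheduler wakes us whenever it's ready,
    so the overshoot approximates the scheduling latency.
    """
    overshoots = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(period_s)
        elapsed = time.perf_counter() - start
        overshoots.append(elapsed - period_s)  # how late we woke up
    return max(overshoots), sum(overshoots) / len(overshoots)

worst, mean = measure_jitter()
print(f"worst overshoot: {worst * 1e6:.0f} us, mean: {mean * 1e6:.0f} us")
```

On a desktop you'll typically see overshoots in the tens-to-hundreds of microseconds range, and occasionally far worse under load, which is exactly why it swamps sub-millisecond measurements.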
The easiest example is some timing equipment I was using to count digital pulses from a quartz crystal. On a 'modern' secure OS I couldn't really get below a 0.1% margin of error, which wasn't low enough for our uses. I fell back to an older, insecure real-time platform and got down to 0.005%.
Security is great. I attend security conferences in my spare time and try to stay up to date on the topic. The main problem is that once you get into most of this computing, not everything is running glibc and Win32. Hacking isn't very easy unless you know the system to start with.