I worked on similar software that ran on Windows, though the decision was made 18 years ago. It was driven by:
* GUI dev tools on Windows were an order of magnitude better than anything on Linux.
* NI-DAQ cards (very common DAQ and DIO hardware) either had no Linux drivers or had flaky ones. USB may be a better choice today, but it wasn't available then. Plus, it's convenient and less expensive to push functionality to the computer side and minimize embedded development.
* In the beginning (though we fixed this mistake), we shipped our software, our machine, and an NI-DAQ card; users provided their own computer. This was a nightmare. We ended up shipping whole computers custom-built by Dell to our specs, with the NI-DAQ cards installed (and later glued in to prevent them from slipping out during shipping), plus our custom system images. At the time, I don't think Dell would do this for Linux; you could find other vendors that would, but they were either harder to work with or much more expensive.
* Nobody is eager to switch OSes, because every software update to the machine has to be FDA-certified (i.e., kiss at least $100k goodbye), so the company is very risk-averse. And that's before you even count the cost of rewriting what has evolved into a fairly complicated GUI app.
* As mentioned elsewhere, once you leave Windows you have to train people to use Linux GUIs, which historically had window managers and GUI toolkits that didn't behave like Windows.
All of the above is complicated by the fact that most of these medical device companies pay their devs poorly and have trouble hiring good ones. (Engineering shortage, obviously!)