Not only that, many Americans seem to believe that having insurance companies as middlemen somehow improves their system.
My girlfriend is a doctor here in Europe (though she has also worked in Mexico), so she has experience with both private and public systems. (In Europe you are generally enrolled in the public system automatically; you can pay for private care yourself and usually wait less.)
Anyway, both she and her boss pointed out that if you have anything serious, you should go to the public system. Why? Because insurance companies always want to pay as little as possible for the cheapest drugs and treatments. Doctors don't enjoy working in the private system, because too much of the job is spent justifying the more expensive treatments to insurance companies.
At the end of the day, the public healthcare system exists to help people get better. The private healthcare system exists to make money.