Fair enough. Thanks for the lesson in terminology.
I'm extrapolating a lot from a small comment, so forgive me if I've misread you, but I have an issue with your reasoning; I don't know what it tells us that 50% of people suffer from a mental illness at some point in their lives, or how that could count as evidence against a relationship between mental illness and suicide.
The numbers may seem telling at face value, but the rate of mental illness in cases of suicide is not based on "diagnosable" cases; it is based on actually diagnosed ones (and jumping ahead, the NIH report linked in the parent comment reached a very different conclusion than the one you seem to have inferred).
There's a reason, I would think, that populations are sampled this way, in terms of "diagnosable": chiefly to get a more accurate count of the actual incidence, since incidence is known to be higher than the rate of actual diagnosis.
This kind of sampling gives us a better estimate of the actual rate of mental illness, and taken together with the known rate of diagnosed mental illness, it can tell us the rate of undiagnosed cases.
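For what it's worth, the aggregate arithmetic I mean here is trivial; a rough sketch with made-up numbers (both rates below are assumptions for illustration, not figures from either report):

```python
# Purely illustrative rates; neither number comes from the CDC or NIH reports.
diagnosable_rate = 0.50  # population sampling: fraction with a diagnosable illness
diagnosed_rate = 0.25    # hypothetical fraction with an actual diagnosis on record

# In aggregate, the undiagnosed share is just the difference:
undiagnosed_rate = diagnosable_rate - diagnosed_rate
print(undiagnosed_rate)  # 0.25
```

The point being that this subtraction only works at the population level; it tells you nothing about which individuals make up that undiagnosed share.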
What I'm saying is, we can't know anything more than that from these simple averages taken together. The NIH report tracked individuals with known case histories, and that's exactly how you can know more.
It is also known that many people suffering in this way will not seek help. Those are exactly the individuals that "The CDC's data [which] relies on reports from coroners, medical examiners and law enforcement agencies" simply can't take into account. The rate of undiagnosed mental illness is nonzero and significant (a quick search turns up recent reports suggesting it is also near 50%). If there were a perfect 1:1 correlation (say, every suicide coincident with a case of mental disorder, diagnosed or not), you would absolutely expect the CDC's post-mortem number to be lower: less than the NIH number by the rate of undiagnosed mental illness.
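To make the direction of that bias concrete, here's a toy calculation; the 1:1 correlation and the 50% undiagnosed figure are assumptions for the sake of the example, not claims about the real data:

```python
# Toy numbers only: assume a perfect 1:1 correlation and ~50% undiagnosed.
true_cooccurrence = 1.0  # every suicide coincident with some mental disorder
undiagnosed_rate = 0.5   # fraction of those disorders never formally diagnosed

# A post-mortem count that can only see documented diagnoses (the CDC-style
# number) would observe just the diagnosed fraction:
observable_rate = true_cooccurrence * (1 - undiagnosed_rate)
print(observable_rate)  # 0.5
```

So under these assumptions, half of a perfect correlation would simply go unrecorded, which is why a low CDC-style figure doesn't by itself argue against the relationship.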
You can compare these statistics in aggregate, and "rates of undiagnosed mental illness" are likewise knowable in aggregate through population sampling like this, but the people with undiagnosed mental illnesses who then committed suicide cannot be directly counted. Cases of "co-morbid mental disorder" among those individuals, as the NIH report calls them, are not so readily quantified. (I don't think the CDC report even attempts to quantify them. This is the idea behind a psychiatric autopsy.)
So I think you can't say more without knowing more about those individuals suffering from undiagnosed mental illnesses (yeah, that's the thing you unfortunately can't know).
As those undiagnosed cases are ostensibly people who did not seek help, they are exactly the ones the CDC's statistic won't be able to measure.
You can know something about the relative uncertainty of those measurements, though: if you could say that 50% of individuals with severe psychiatric disorders go untreated, then you would have an expected value for the difference between these numbers. That may also seem suggestive, but it still isn't sufficient evidence to draw any direct conclusions.
Actually diagnosed mental illnesses can, however, be correlated with incidences of suicide, which is exactly what the NIH study linked in the parent comment does. That report found a strong correlation.
Disregarding all of the reports and statistics, though: as a layman, if the forms of treatment used are remotely successful, I would hope that the suicide rate among patients with a diagnosis is lower than the rate among undiagnosed cases. If so, that is another driver of the type of error I'm trying to describe, the one I thought you were not fully considering.