No, it's not - LLMs are not medical experts. They aren't experts in much of anything; they just statistically extrapolate the next token. If you fed them anti-vaxxer content, they'd start recommending against vaccination so you don't "get autism" or something like that. We shouldn't treat them as experts on anything, least of all medical information.
On the other hand, if you want to use them to generate large amounts of text and images, sure, go do that. That they can do, I guess.