If I were you, I'd take this down immediately before you get in trouble or, worse, someone gets seriously hurt.
Try not to put your mental health in the hands of a language model; you deserve real help.
Overall, I'm just saying these personal projects often don't translate well to the public at first. But eventually you fine-tune it for yourself enough to find the true value in it, and then you figure out how to translate that to the public.
But I fully understand and appreciate the comments made by other posters.
I have put it back online with a disclaimer that must be acknowledged, and pointed out that this is a research app, not a substitute for therapy.
I hope that does enough to mitigate some of the risks, whilst allowing me to continue working on making this a safer and more valuable resource. Whenever that may be, or not :)
No. No. No no no no no no no no. Soooo much no. You haven't thought this through enough, and while I want nothing more than to help all of those in pain using my skills in technology, this is non-trivial and you haven't treated it that way here.
I hope that isn't too harsh. I just think it can't be overstated how important it is to get this sort of idea right on the first try. There should be _no_ tolerance for hacker mentality/move-fast-and-break-things, because the things you are breaking are people.
From your comment, however, you don't rule out applying AI in this field entirely?
The obvious issue is that it can confidently output the wrong answer. This will become less of a problem as model accuracy continues to improve, though.
Really, the bigger problems are the ones that doctors and therapists already have: patients are not always reliable narrators and may mis-describe symptoms, insert their own prejudices, or outright conceal important information. In particular, many patients may not even possess the reading and writing skills necessary to respond properly to even the most accurate of prompts.
This is now the muddy domain of "sorry, you just need a human available". There's too much risk online, too many edge cases to accommodate, and since the tech itself actually provides a _convenience_ over doctor visits, you could be causing more harm than you might think if people decide to ignore their doctors for the "cheaper" option. This is particularly relevant to me as an American, where health care is a mess, and mental health care is far worse.
A better approach might be to assume that this type of tool is available to patients who are already receiving some kind of treatment, whether that is a doctor's appointment or in-patient care at a behavioral health facility. There's a chance to reduce friction and maybe improve patient outcomes if such a tool could be provided as an early survey, for example. Or, as you have done here, as a way to teach coping skills that are highly individualized to each person.
Really though, yeah, I think a qualified professional should be in the room making sure things go well.