4

I've seen events, like CogX, and articles that describe how machine learning techniques or algorithms can be used to diagnose mental health issues.

Here's my question.

How can artificial intelligence and machine learning techniques be used to diagnose mental health issues, beyond examples such as Facebook using machine learning to detect people who may be at risk of suicide?

nbro
MJM
  • Check this paper http://ieeexplore.ieee.org/document/7293425/?reload=true – quintumnia Jun 21 '17 at 08:59
  • Try to go through this paper for your benefit https://thesai.org/Downloads/Volume7No1/Paper_76-Prediction_of_Mental_Health_Problems_Among_Children.pdf – quintumnia Jun 21 '17 at 09:14
  • Just a comment, but I think this is a very bad idea. Mental health issues are very complex and our understanding of them is very limited. – k.c. sayz 'k.c sayz' Aug 27 '17 at 02:49
  • @k.c.sayz'k.csayz' that's a fair point, but there are commonalities within disorders that would likely be identifiable for an advanced AI. (See my answer on the successful results of the application of Machine Learning in predicting suicide.) – DukeZhou Oct 27 '17 at 20:27

3 Answers

2

There's a strong sentiment towards the idea that medical diagnosis is largely Abductive Reasoning. See this presentation for additional details. One approach to automated abductive reasoning is parsimonious covering theory.
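
To make parsimonious covering concrete, here is a minimal Python sketch of the idea: given a hypothetical table of disorder-to-symptom causal links, it searches for the smallest sets of disorders that together explain all observed symptoms. The disorders, symptoms, and links are invented purely for illustration and are not clinical data.

    from itertools import combinations

    # Hypothetical causal knowledge base (illustrative only, not clinical data).
    causes = {
        "depression":       {"low_mood", "insomnia", "fatigue"},
        "anxiety_disorder": {"restlessness", "insomnia"},
        "hypothyroidism":   {"fatigue", "weight_gain"},
    }

    def minimal_covers(observed, causes):
        """Return the smallest disorder sets whose symptoms cover `observed`."""
        disorders = list(causes)
        for size in range(1, len(disorders) + 1):
            covers = [set(combo) for combo in combinations(disorders, size)
                      if observed <= set().union(*(causes[d] for d in combo))]
            if covers:  # stop at the most parsimonious explanation size
                return covers
        return []

    print(minimal_covers({"low_mood", "insomnia", "fatigue"}, causes))
    # [{'depression'}]
    print(minimal_covers({"insomnia", "restlessness", "weight_gain"}, causes))
    # the smallest cover here needs both anxiety_disorder and hypothyroidism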

If you want a relatively in-depth look at all of this, check out the book Abductive Inference Models for Diagnostic Problem-Solving.

Expert systems have also been shown to work very well for medical diagnosis. As far back as the 1970s, systems like MYCIN could beat human experts in terms of diagnosis and treatment plans.
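
To give a flavor of how such rule-based systems work, here is a tiny sketch in the spirit of MYCIN's certainty factors; the rules, findings, and CF values are made up for illustration and are not taken from MYCIN's actual medical knowledge base.

    RULES = [
        # (required findings, conclusion, rule certainty factor) -- invented values
        ({"persistent_sadness", "loss_of_interest"}, "depressive_episode", 0.8),
        ({"insomnia", "loss_of_interest"},           "depressive_episode", 0.4),
        ({"excessive_worry", "restlessness"},        "generalized_anxiety", 0.7),
    ]

    def combine_cf(cf_old, cf_new):
        """MYCIN's combination rule for two positive certainty factors."""
        return cf_old + cf_new * (1 - cf_old)

    def diagnose(findings):
        conclusions = {}
        for premises, conclusion, cf in RULES:
            if premises <= findings:  # all premises are present, so the rule fires
                conclusions[conclusion] = combine_cf(conclusions.get(conclusion, 0.0), cf)
        return conclusions

    print(diagnose({"persistent_sadness", "loss_of_interest", "insomnia"}))
    # {'depressive_episode': 0.88}  -- the 0.8 and 0.4 rules combined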

From what I can tell, there doesn't seem to be any reason in principle to think that AI/ML can't be used across the board for medical diagnosis, mental health or otherwise.

mindcrime
1

It's already being done, and apparently with very good results.

See: Predicting Risk of Suicide Attempts Over Time Through Machine Learning, Walsh, Ribeiro, Franklin

Here is the abstract from the paper:

Traditional approaches to the prediction of suicide attempts have limited the accuracy and scale of risk detection for these dangerous behaviors. We sought to overcome these limitations by applying machine learning to electronic health records within a large medical database. Participants were 5,167 adult patients with a claim code for self-injury (i.e., ICD-9, E95x); expert review of records determined that 3,250 patients made a suicide attempt (i.e., cases), and 1,917 patients engaged in self-injury that was nonsuicidal, accidental, or nonverifiable (i.e., controls). We developed machine learning algorithms that accurately predicted future suicide attempts (AUC = 0.84, precision = 0.79, recall = 0.95, Brier score = 0.14). Moreover, accuracy improved from 720 days to 7 days before the suicide attempt, and predictor importance shifted across time. These findings represent a step toward accurate and scalable risk detection and provide insight into how suicide attempt risk shifts over time.
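
For readers unfamiliar with the metrics quoted above, here is a rough sketch of how that kind of evaluation is computed with scikit-learn. It uses synthetic data with roughly the paper's case/control ratio and a generic classifier; it does not reproduce the paper's model, features, or electronic health records.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import (roc_auc_score, precision_score,
                                 recall_score, brier_score_loss)

    # Synthetic stand-in for the 5,167 case/control records (not real data).
    X, y = make_classification(n_samples=5167, n_features=30,
                               weights=[0.37], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]     # predicted risk scores
    pred = (proba >= 0.5).astype(int)         # hard labels at a 0.5 threshold

    print("AUC:      ", roc_auc_score(y_te, proba))
    print("Precision:", precision_score(y_te, pred))
    print("Recall:   ", recall_score(y_te, pred))
    print("Brier:    ", brier_score_loss(y_te, proba))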

This has gotten national press attention quite recently. Here are a couple of prior articles on the result and endeavor:

Artificial Intelligence is Learning to Predict and Prevent Suicide (Wired)

Artificial intelligence can now predict suicide with remarkable accuracy (Quartz)

I strongly suspect ML can be used in an array of applications related to human mental health. For instance, it is entirely possible an app could discern mood swings in people with bipolar disorder based on activity, or the lack thereof.
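
As a toy illustration of that idea (not a validated method), the sketch below flags days whose step count deviates sharply from a personal baseline, the kind of signal an activity-tracking app might surface; the data and threshold are made up.

    import numpy as np

    # Hypothetical daily step counts for one user.
    daily_steps = np.array([7000, 7400, 6900, 7200, 1500, 1800, 16000, 7100])

    baseline = np.median(daily_steps)
    mad = np.median(np.abs(daily_steps - baseline))   # robust spread estimate
    flags = np.abs(daily_steps - baseline) > 5 * mad  # crude anomaly threshold

    print(np.flatnonzero(flags))  # [4 5 6]: two very low days and one spike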

DukeZhou
0

You may be able to infer some mental health problems using text processing. Any text written by an individual contains many clues about their mental state, probably including any health problems.

However, I believe pursuing this would prove difficult due to the lack of ground truth. Not many people make it public that they have mental health problems, and without a reliable database it is quite difficult to do research in this field.
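
As a minimal sketch of the text-processing idea, here is a bag-of-words classifier over writing samples; the tiny labelled corpus is invented purely for illustration, and, as noted above, obtaining reliable ground-truth labels is exactly the hard part in practice.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Invented writing samples with hypothetical labels (1 = concerning, 0 = neutral).
    texts = [
        "I can't sleep and nothing feels worth doing anymore",
        "Everything feels heavy and I keep cancelling my plans",
        "Had a great hike today and dinner with friends",
        "Busy week at work but feeling good about the project",
    ]
    labels = [1, 1, 0, 0]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)
    print(model.predict(["Lately everything feels pointless and I can't sleep"]))
    # predicted label for a new writing sample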

Cem Kalyoncu