AI is being trained to identify signs of depression by studying posts on Instagram.
Some people use their social media posts as an obvious cry for help. Others may be doing so, even if they’re not consciously aware of it, a study from the University of Vermont reveals.
Like Picasso’s blue period, the colours and composition of images can reveal a lot about the artist’s state of mind – if you know what to look for.
That’s the trouble with diagnosing depression: we often don’t know what to look for. In fact, the study authors note, GPs have a 42% success rate at diagnosing the illness face-to-face. By contrast, a machine learning algorithm trained on well-established psychological research on depressed people’s preferences for colour, brightness and shading was able to tell if the photographer was depressed – or about to get diagnosed – 70% of the time. Far from perfect, but accurate enough to raise some interesting questions about the role of artificial intelligence (AI) in diagnosing mental illness.
After training, the AI got to work on analysing 43,950 photos from 166 volunteers’ Instagram feeds. Of the volunteers, 71 had experienced periods of officially diagnosed clinical depression in the past three years. “Although we had a relatively small sample size, we were able to reliably observe differences in features of social media posts between depressed and non-depressed individuals,” said Dr Andrew Reece, the study’s co-author. “Importantly, we also demonstrate that the markers of depression can be observed in posts made prior to the person receiving a clinical diagnosis of depression.”
So what did the depressed photographers’ compositions have in common? Generally, those in a depressed period would post pictures that were darker, and with more blues and greys, than other Instagram users. On top of this, while healthy users were more inclined to use filters like Valencia for a bright, warm tone, depressed photographers would sap the life from images with Inkwell. “In other words, people suffering from depression were more likely to favour a filter that literally drained all the colour out of the images they wanted to share,” the scientists wrote in a companion blog post.
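The low-level features the researchers describe – how dark, how grey, how blue an image is – correspond to average hue, saturation and brightness in the HSV colour space. As a rough illustration only (this is not the study’s actual pipeline, and the cutoff values below are invented for the example), extracting those averages with Python’s standard library might look like:

```python
import colorsys

def image_features(pixels):
    """Mean hue, saturation and brightness (HSV) of an image, given as a
    list of (r, g, b) tuples with channel values in 0-255. These are the
    kinds of low-level features the study links to mood: depressed users'
    photos skewed darker, greyer and bluer."""
    n = len(pixels)
    h_sum = s_sum = v_sum = 0.0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h_sum += h
        s_sum += s
        v_sum += v
    return h_sum / n, s_sum / n, v_sum / n

def looks_muted(pixels, v_cut=0.4, s_cut=0.3):
    """Toy rule of thumb, NOT the study's model: flag images that are both
    dark (low brightness) and washed-out (low saturation) - the Inkwell
    look. The thresholds here are arbitrary choices for illustration."""
    _, s, v = image_features(pixels)
    return v < v_cut and s < s_cut

# A dim, greyish image trips the flag; a bright, warm one does not.
print(looks_muted([(40, 44, 50)] * 100))    # dark grey-blue pixels
print(looks_muted([(230, 180, 90)] * 100))  # bright warm pixels
```

The actual study fed features like these (along with metadata such as face counts and posting frequency) into a statistical classifier; the point of the sketch is simply that “bluer, darker, greyer” reduces to a handful of numbers a machine can compare across users.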
Another interesting pattern: depressed individuals were more likely to post photos with faces in them. Crucially, however, when faces were in shot, the photos typically contained fewer people than those posted by healthy individuals. In other words, it’s possible that this counterintuitive point is the result of a propensity for “sad selfies” rather than people-watching.
What’s particularly striking about all of this is that volunteers asked to guess whether the photographs were taken by someone who was depressed were also often correct in their diagnosis – but not as often as the AI, and with completely different reasoning from it. “Obviously you know your friends better than a computer,” points out Professor Christopher Danforth, the study’s other co-author, “but you might not, as a person casually flipping through Instagram, be as good at detecting depression as you think.”
So, what’s the point of this research? Well, hypothetically, computer screening could help those going through a period of depression get the help they need before they realise they want it. If the machines are better at diagnosis than people, then why wouldn’t you turn to them? “So much is encoded in our digital footprint,” says Danforth. “Clever artificial intelligence will be able to find signals, especially for something like mental illness.”
On the other hand, this – as the researchers recognise – raises huge ethical and privacy concerns. Remember what happened when Facebook conducted its own experiment in analysing users’ emotions? The intervening three years haven’t seen people becoming any more comfortable with internet giants analysing people’s well-being.
But with medical resources stretched – especially those in mental health – a low-cost screening process using AI may be considered the future regardless, though that’s not a bridge we have to cross just yet. “This study is not yet a diagnostic test, not by a long shot,” says Danforth, “but it is a proof of concept of a new way to help people.”