New App Uses Your Selfies To Detect Depression

That forced smile you flash at your phone’s camera could soon reveal more than you realize. Researchers have developed an artificial intelligence app that analyzes selfies to detect early signs of depression with 75% accuracy.

Called MoodCapture, the smartphone app captures images via your front-facing camera during routine phone use. An AI model then scans the photos for facial cues and environmental signs associated with the onset of major depressive disorder. These range from reduced eye contact and muscle rigidity to dominant colors and lighting in the background.

The developers say MoodCapture could be publicly available within five years. It marks the first time candid, natural photos have been leveraged to screen for depression remotely.

Passive Detection

Past apps required users to actively submit selfies or complete mood inventories. MoodCapture eliminates this effort by passively gathering data as you open apps or text friends. Most people unlock their smartphones hundreds of times per day without realizing it.

“A person just unlocks their phone and MoodCapture knows their depression dynamics,” explains lead researcher Dr. Andrew Campbell, a computer science professor at Dartmouth, in a university release. “It can then suggest they seek help.”

This frequent, discreet analysis allows MoodCapture to identify changes in mood over time. Catching early shifts toward depression enables preventative support before symptoms escalate.

MoodCapture Trained To Spot Depression

To train the AI screening system, the developers first collected more than 125,000 candid phone images from participants clinically diagnosed with depression. The photos were taken at random intervals over 90 days as the subjects responded to a depression-severity questionnaire.

Researchers fed these images into a deep neural network model trained to recognize facial expressions, such as rigid muscles or a downward gaze, as depressive signals. Background elements were also flagged as clues based on lighting, dominant colors, and sparseness.
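For readers curious what such a pipeline might look like in code, here is a minimal sketch, not the team's actual system: a standard pretrained image model (ResNet-18 via PyTorch) fine-tuned to predict a questionnaire-style severity score from each photo. The folder layout, hyperparameters, and the idea of binning scores into labeled folders are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): fine-tune a pretrained CNN to map
# face photos to self-reported depression-severity scores. Paths, score binning,
# and hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import ImageFolder  # hypothetical folder-per-score-bin layout

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset: images grouped into folders by binned questionnaire score.
train_data = ImageFolder("selfies/train", transform=preprocess)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)   # regress a single severity score

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        preds = model(images).squeeze(1)
        loss = loss_fn(preds, labels.float())   # labels = binned severity scores
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In practice the researchers would also need the background cues mentioned above; this sketch only illustrates the basic image-to-score training loop.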

The trained AI was then tested on photos of a separate group of subjects, again answering depression survey questions. Without any user input, MoodCapture analyzed changes in their expressions and environments to infer their reported depression levels with 75% accuracy.

Real-Time Recognition

Study co-author Dr. Nicholas Jacobson notes technologies like MoodCapture meet patients where they are. “Many of our therapeutic interventions for depression are centered around longer stretches of time,” he explains, “but these folks experience ebbs and flows in their condition.”

Brief self-assessments or monthly appointments fail to capture the rapid shifts many encounter. That’s where passive facial recognition steps in, detecting subtle mood changes most surveys would miss.

Detecting downturns in the moment enables instant support, whether that’s an app encouraging a walk outdoors or connecting users to counseling resources. Early intervention can stop depressive dips before they spiral into crises requiring hospitalization.

It also expands access for those unable to frequently visit therapists in person. Jacobson states that on average, people only spend 0.03% of their time with clinicians. MoodCapture bridges the gap, providing continuous touchpoints between appointments.

User Privacy

Naturally, privacy is a concern when apps analyze your appearance without asking. The researchers stress that all images are anonymized and encrypted to protect personal information. Following diagnosis, no identifiable selfies would ever leave a user’s device.

“Even if the data ever does leave the device, there would be no way to put it back together into an image that identifies the user,” says co-lead author Subigya Nepal, a PhD candidate on the team.

The ultimate goal is to distill photos into biometric data reflecting depressive signals like frown depth or brow position. This abstraction safeguards privacy while extracting insights into wellbeing.
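To make that idea concrete, here is a rough sketch of what on-device feature extraction could look like, using the open-source MediaPipe Face Mesh model. The two example measurements (brow height and mouth-corner droop) and the landmark indices are illustrative assumptions, not the features MoodCapture actually computes.

```python
# Sketch of the privacy idea: reduce a photo to a few numeric facial measurements
# on-device, so the raw image never has to be sent anywhere.
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

def photo_to_features(path: str) -> dict:
    image = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
    with mp_face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
        result = mesh.process(image)
    if not result.multi_face_landmarks:
        return {}
    lm = result.multi_face_landmarks[0].landmark  # 468 normalized (x, y, z) points

    # Assumed index choices: 105 ~ left brow, 159 ~ left upper eyelid,
    # 61 / 291 ~ mouth corners, 13 ~ upper-lip center.
    brow_height = lm[159].y - lm[105].y                      # gap between brow and eyelid
    mouth_droop = ((lm[61].y + lm[291].y) / 2) - lm[13].y    # corners sitting below lip center

    # Only these abstract numbers would need to leave the device, not the image.
    return {"brow_height": brow_height, "mouth_droop": mouth_droop}
```

The point of the abstraction is that a handful of such numbers cannot be reassembled into a recognizable face, echoing Nepal's comment above.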

Refining Accuracy

While the app shows initial promise, the researchers acknowledge MoodCapture must become more accurate before public deployment. They believe boosting performance to 90% is realistic by refining the AI model.

As Jacobson notes, the existing 75% figure stems from baseline analysis of a diverse group. Precision could improve by personalizing the app to track individual facial movements and environmental factors over time. Building knowledge of a user’s specific tells allows highly customized mood detection.

The app might also expand its scope by incorporating other smartphone data like sleep patterns, activity levels, and communication logs. Integrating selfies with passive phone metadata may yield additional context to recognize depression.
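A simple way to picture that fusion is to stack the image-derived measurements alongside passive phone signals and train a single classifier over the combined feature vector. The sketch below uses synthetic placeholder data and assumed feature names purely for illustration; it is not the study's actual feature set.

```python
# Sketch of feature fusion: facial measurements + passive phone metadata
# feeding one classifier. Data here is randomly generated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

# Columns: brow_height, mouth_droop, sleep_hours, step_count, texts_per_day
X = np.column_stack([
    rng.normal(0.03, 0.01, n),
    rng.normal(0.02, 0.01, n),
    rng.normal(7.0, 1.5, n),
    rng.normal(6000, 2500, n),
    rng.normal(25, 10, n),
])
y = rng.integers(0, 2, n)  # placeholder labels: 1 = screened positive for depression

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```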

Our Future Early Warning System?

Despite needing tweaks, MoodCapture offers a proof of concept for scalable mental health screening. Embedding therapeutic tools within phones we already use daily provides round-the-clock touchpoints impossible through traditional care models.

Jacobson envisions a future iteration that provides gentle support before users realize anything is amiss. An alert encouraging a walk outside or a call to a loved one could stabilize moods before clinical depression sets in.

By leveraging AI and smartphone cameras we mindlessly peer into hundreds of times per day, researchers have opened the door to making mental healthcare continuous rather than crisis-driven. Automated facial analysis converts selfies from vain indulgences into windows into our emotional states over time. Far from just a fad, these casual images may become critical warning signs that wellbeing is declining.

The team published their findings on the arXiv preprint database in advance of presenting the work at the Association for Computing Machinery's CHI 2024 conference in May.
