Depression affects nearly 300 million people globally, roughly 4% of the world's population, yet it can be difficult to identify, especially when individuals are unable or unwilling to express their distress to friends, family, or medical professionals. In response, Stevens professor Sang Won Bae is developing AI-powered smartphone applications and systems that could detect signs of depression non-invasively.
Bae acknowledges the gravity of the problem and is committed to finding solutions. Given how widely smartphones are used, Bae believes the devices could serve as an effective, ready-to-use tool for detecting depression.
One such system, PupilSense, which Bae is developing with Stevens doctoral student Rahul Islam, works by continuously capturing images and measurements of a smartphone user's pupils. Decades of research, Bae explains, have linked pupillary reflexes and responses to depressive episodes.
PupilSense works by calculating the diameter of the pupils relative to the surrounding irises, using 10-second "burst" photo streams captured whenever users unlock their phones or open specific apps. In an early four-week trial, the system, embedded in the smartphones of 25 volunteers, analysed around 16,000 interactions after collecting the pupil-image data.
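The article does not publish PupilSense's algorithm, but the idea of normalising pupil size by iris size is straightforward: the iris has a roughly constant diameter in adults, so the ratio cancels out differences in camera distance. A minimal sketch in Python, assuming per-frame radius estimates from some eye-landmark detector (the function name and filtering thresholds here are hypothetical):

```python
# Illustrative sketch only: one plausible way to reduce a 10-second burst
# of per-frame pupil/iris radius estimates to a single ratio, filtering
# out blinks and failed detections.
from statistics import median

def pupil_iris_ratio(frames):
    """frames: list of (pupil_radius_px, iris_radius_px), one per image.
    Returns the median pupil-to-iris ratio, or None if no usable frames.
    Dividing by the iris, whose diameter is roughly constant in adults,
    cancels out camera distance. Thresholds are hypothetical."""
    ratios = [p / i for p, i in frames
              if i > 0 and 0.1 < p / i < 0.9]  # drop blinks / bad detections
    return median(ratios) if ratios else None

# A simulated burst captured at phone unlock; the third frame is a blink.
burst = [(21.0, 55.0), (20.5, 54.0), (2.0, 53.0), (22.0, 56.0)]
print(pupil_iris_ratio(burst))  # ~0.38
```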
Bae and Islam processed the photo data, compared it with the volunteers' self-reported moods, and trained an AI to distinguish "normal" from abnormal pupillary responses. The most successful version of PupilSense proved 76% accurate at identifying moments when individuals felt depressed, outperforming AWARE, the current leading smartphone-based depression-detection system.
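The article names neither the model nor the features, so as a hedged sketch, the training step might look roughly like the following, with a simple off-the-shelf classifier. The random arrays stand in for the study's roughly 16,000 real interactions and self-reported mood labels, so the printed accuracy here will be chance-level rather than the reported 76%:

```python
# Hedged sketch: feature names and model choice are assumptions, not
# PupilSense's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical per-interaction features:
# [median pupil/iris ratio, ratio variance across the burst, hour of day]
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)  # 1 = self-reported depressed mood

clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)  # held-out accuracy estimate
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```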
Bae, who has also developed smartphone-based systems for predicting binge drinking and cannabis use, is confident that the technology will continue to evolve now that the concept has been proven. The system was first revealed at the International Conference on Activity and Behavior Computing in Japan and is now open-source on GitHub.
Beyond PupilSense, Bae and Islam are also developing a system called FacePsy, which analyses facial expressions for indicators of mood. FacePsy runs discreetly in the background on a phone, capturing facial snapshots whenever the phone is unlocked or specific apps are opened, and it protects users' privacy by deleting each facial image immediately after analysis.
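FacePsy's internals are not published in the article, but the stated privacy behaviour, keep only derived numbers and delete the raw image at once, is a recognisable pattern. A minimal Python sketch, with a placeholder where a real face-analysis model would run:

```python
# Sketch of the stated privacy behaviour, not FacePsy's actual code:
# derive numeric features from a snapshot, then delete the raw image
# immediately so no facial pixels persist.
import os
import tempfile

def extract_features(image_path):
    # A real system would run a face/landmark model here and return
    # numbers such as smile intensity or eye openness (hypothetical names).
    return {"smile": 0.0, "eye_openness": 0.0}

def analyse_and_discard(image_path):
    try:
        features = extract_features(image_path)  # keep numbers only
    finally:
        os.remove(image_path)  # image is deleted even if analysis fails
    return features

# Demo with a stand-in file; on a phone this would be the unlock snapshot.
fd, path = tempfile.mkstemp(suffix=".jpg")
os.close(fd)
print(analyse_and_discard(path), os.path.exists(path))  # features, False
```

The `try`/`finally` is the key design choice: deletion is guaranteed even when analysis raises an error, so a crash never leaves a facial image behind.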
Bae explains that when the project began, the team did not know which facial gestures or eye movements would correlate with self-reported depression. Some findings were expected; others were surprising. The pilot study found, for example, that an increase in smiling was associated with a depressed mood rather than happiness.
Bae suspects this could be a coping mechanism, where individuals maintain a cheerful exterior despite feeling low. Alternatively, this could be an artefact of the study, necessitating further research.
The early data also revealed other potential depression signals, such as reduced facial movement in the morning and specific patterns of eye and head movement. Interestingly, having the eyes more open during the morning and evening was also associated with potential depression, suggesting that outward expressions of alertness or happiness can sometimes conceal underlying depressive feelings.
Bae sees the FacePsy pilot study as an important first step towards a compact, inexpensive, and easy-to-use diagnostic tool, since other AI systems for detecting depression often require dedicated or multiple devices. The findings will be presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia in early October.