KDD Papers

DeepMood: Modeling Mobile Phone Typing Dynamics for Mood Detection

Bokai Cao (University of Illinois at Chicago); Lei Zheng (University of Illinois at Chicago); Chenwei Zhang (University of Illinois at Chicago); Philip S. Yu (University of Illinois at Chicago); Andrea Piscitello (University of Illinois at Chicago); John Zulueta (University of Illinois at Chicago); Olu Ajilore (University of Illinois at Chicago); Kelly Ryan (University of Michigan); Alex Leow (University of Illinois at Chicago)


The increasing use of electronic forms of communication presents new opportunities in the study of mental health, including the ability to investigate the manifestations of psychiatric diseases unobtrusively and in the setting of patients’ daily lives. A pilot study was conducted to explore possible connections between bipolar affective disorder and mobile phone usage. Participants were provided a mobile phone to use as their primary phone, loaded with a custom keyboard that collected metadata consisting of keypress entry times and accelerometer movement. Due to privacy concerns, individual character data were not collected, with the exception of the backspace key and space bar. We propose an end-to-end deep architecture based on late fusion, named DeepMood, to model this multi-view metadata for the prediction of mood scores. Experimental results show that 90.31% prediction accuracy on the depression score can be achieved from session-level typing dynamics, where a session typically lasts less than one minute. This demonstrates the feasibility of using mobile phone metadata to infer mood disturbance and severity.
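The late-fusion idea mentioned in the abstract — encoding each metadata view (keypress timing, accelerometer movement) separately and combining the representations only at the end — can be sketched as follows. This is a minimal illustrative forward pass, not the paper's actual DeepMood architecture; the feature dimensions, encoder sizes, and random weights are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w, b):
    # Per-view encoder: a single linear layer with ReLU
    # (hypothetical stand-in for the paper's per-view networks).
    return np.maximum(x @ w + b, 0.0)

# Two views of one typing session (feature sizes are illustrative):
keypress = rng.normal(size=(1, 4))  # e.g., inter-key delay statistics
accel = rng.normal(size=(1, 3))     # e.g., accelerometer summary features

# Each view gets its own encoder with its own parameters.
h_key = encode(keypress, rng.normal(size=(4, 8)), np.zeros(8))
h_acc = encode(accel, rng.normal(size=(3, 8)), np.zeros(8))

# Late fusion: the per-view representations are concatenated
# only at the final stage, then mapped to a prediction.
fused = np.concatenate([h_key, h_acc], axis=1)  # shape (1, 16)
logit = fused @ rng.normal(size=(16, 1))
prob = 1.0 / (1.0 + np.exp(-logit))  # probability-like output in (0, 1)
print(fused.shape, float(prob))
```

In a trained model the encoder and output weights would be learned end-to-end from labeled sessions; the key design point is that cross-view interaction happens only after each view has its own representation.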