Accepted Papers

HiTANet: Hierarchical Time-Aware Attention Networks for Risk Prediction on Electronic Health Records

Junyu Luo: The Pennsylvania State University; Muchao Ye: The Pennsylvania State University; Cao Xiao: IQVIA; Fenglong Ma: The Pennsylvania State University


Deep learning methods, especially recurrent neural network based models, have demonstrated early success in disease risk prediction on longitudinal patient data. Existing works implicitly assume that disease progression is stationary within each time period, and thus decay information from previous time steps in a homogeneous way for all patients. However, in reality, disease progression is non-stationary. Moreover, the key time steps for a target disease vary among patients. To leverage time information for risk prediction in a more reasonable way, we propose a new hierarchical time-aware attention network, named HiTANet, which imitates the decision-making process of doctors in risk prediction. In particular, HiTANet models time information in local and global stages. The local evaluation stage has a time-aware Transformer that embeds time information into visit-level embeddings and generates a local attention weight for each visit. The global synthesis stage further adopts a time-aware key-query attention mechanism to assign global weights to different time steps. Finally, the two types of attention weights are dynamically combined to generate the patient representations for further risk prediction. We evaluate HiTANet on three real-world datasets. Compared with the best results among twelve competing baselines, HiTANet achieves gains of over 7% in F1 score on all datasets, which demonstrates the effectiveness of the proposed model and the necessity of modeling time information in the risk prediction task.
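The two-stage attention described in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: the function name `hitanet_sketch`, the log-gap time embedding, and all weight matrices (`W_t`, `W_q`, `W_k`, `w_local`) are hypothetical stand-ins for the paper's learned components, and the weights are random rather than trained.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hitanet_sketch(visits, deltas, rng=None):
    """Toy two-stage time-aware attention (illustrative only).

    visits: (T, d) array of visit embeddings for one patient.
    deltas: (T,) time gaps (e.g. days) from each visit to the last one.
    Returns a (d,) patient representation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    T, d = visits.shape
    # Hypothetical time embedding: project the log time gap into the
    # visit-embedding space and add it (a stand-in for the paper's
    # time-aware Transformer input).
    W_t = rng.standard_normal((1, d)) * 0.1
    h = visits + np.tanh(np.log1p(deltas)[:, None] @ W_t)  # (T, d)
    # Local stage: one attention weight per visit from a learned vector.
    w_local = rng.standard_normal(d) * 0.1
    alpha = softmax(h @ w_local)                           # (T,) local weights
    # Global stage: key-query attention, using the most recent visit
    # as the query over all time steps.
    W_q = rng.standard_normal((d, d)) * 0.1
    W_k = rng.standard_normal((d, d)) * 0.1
    q, k = h[-1] @ W_q, h @ W_k
    beta = softmax(k @ q / np.sqrt(d))                     # (T,) global weights
    # Dynamically combine the two weight vectors, then pool the visits.
    gamma = softmax(alpha + beta)                          # (T,) combined
    return gamma @ h                                       # (d,) representation
```

A call such as `hitanet_sketch(visit_matrix, day_gaps)` yields a single pooled vector that a downstream classifier could consume; in the actual model all of these components are trained end-to-end.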
