Accepted Papers

Multiple Relational Attention Network for Multi-task Learning

Jiejie Zhao (Beihang University); Bowen Du (Beihang University); Leilei Sun (Beihang University); Fuzhen Zhuang (Chinese Academy of Sciences); Weifeng Lv (Beihang University); Hui Xiong (Rutgers University)


Multi-task learning is a successful machine learning framework that improves the performance of prediction models by leveraging knowledge shared among tasks, e.g., the relationships between different tasks. Most existing multi-task learning methods guide the learning process with predefined task relationships and thus do not fully exploit the relationships that emerge during learning. On the one hand, replacing predefined task relationships with adaptively learned ones may yield higher prediction accuracy, as it avoids the risk of misguidance from improperly predefined relationships. On the other hand, beyond task relationships, feature-task dependence and feature-feature interactions can also be employed to guide the learning process. Along this line, we propose a Multiple Relational Attention Network (MRAN) framework for multi-task learning, in which three types of relationships are considered. Correspondingly, MRAN consists of three attention-based relationship learning modules: 1) a task-task relationship learning module, which captures the relationships among tasks automatically and controls positive and negative knowledge transfer adaptively; 2) a feature-feature interaction learning module, which handles the complicated interactions among features; 3) a task-feature dependence learning module, which associates the related features with each target task separately. To evaluate the effectiveness of the proposed MRAN, experiments are conducted on two public datasets and a real-world dataset crawled from a review hosting site. Experimental results demonstrate the superiority of our method over both classical and state-of-the-art multi-task learning methods.
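To make the first module's idea concrete, below is a minimal, hypothetical PyTorch sketch of an attention mechanism that learns pairwise task relationships and adaptively mixes task representations. This is an illustration, not the authors' implementation; the class name, dimensions, and scaled dot-product formulation are assumptions.

# Hypothetical sketch, not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskRelationAttention(nn.Module):
    # Learns pairwise task-task relationship weights via scaled dot-product
    # attention and mixes task representations accordingly, so knowledge
    # transfer between tasks is learned rather than predefined.
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)

    def forward(self, task_reprs):
        # task_reprs: (batch, num_tasks, dim), one vector per task
        q = self.query(task_reprs)
        k = self.key(task_reprs)
        scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5
        rel = F.softmax(scores, dim=-1)  # (batch, T, T) learned task relations
        # Each task representation becomes a relation-weighted mixture.
        return rel @ task_reprs

# Usage: 3 tasks sharing 64-dim representations for a batch of 8 examples.
x = torch.randn(8, 3, 64)
out = TaskRelationAttention(dim=64)(x)
print(out.shape)  # torch.Size([8, 3, 64])

In the abstract's framing, such learned relation weights would stand in for a fixed, hand-specified task-relationship matrix, letting related tasks share more and unrelated tasks share less.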

