Multi-Task Feature Interaction Learning
Kaixiang Lin*, Michigan State University; Jianpeng Xu, Michigan State University; Shuiwang Ji, Washington State University; Jiayu Zhou, Michigan State University
Linear models are widely used in data mining and machine learning. A major limitation of such models is their inability to capture predictive information from interactions between features. While introducing high-order feature interaction terms can overcome this limitation, doing so dramatically increases model complexity and makes learning prone to overfitting. When there are multiple related learning tasks, the feature interactions of these tasks are usually related as well, and modeling this relatedness is key to improving generalization. In this paper, we propose a novel Multi-Task feature Interaction Learning (MTIL) framework that exploits task relatedness through high-order feature interactions. Specifically, we collectively represent the feature interactions from multiple tasks as a tensor, so that prior knowledge of task relatedness can be incorporated through different structured regularizations on this tensor. We formulate two concrete approaches under this framework: the shared interaction approach, which assumes the tasks share the same set of interactions, and the embedded interaction approach, which assumes the feature interactions of multiple tasks share a common subspace. We provide efficient algorithms for solving both formulations. Extensive empirical studies on synthetic and real datasets demonstrate the effectiveness of the proposed framework.
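To make the tensor construction concrete, the following is a minimal illustrative sketch (not the authors' formulation): each task t has a linear weight vector and a pairwise-interaction matrix, and stacking the per-task interaction matrices yields the tensor on which structured regularization would be imposed. The symbols `W`, `Q`, and the sizes are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 3, 5  # number of tasks and features (illustrative sizes)

# Per-task first-order weights: one length-d vector per task.
W = rng.normal(size=(T, d))

# Per-task pairwise-interaction matrices, stacked into a (T, d, d)
# tensor; structured regularizers on Q would encode task relatedness.
Q = rng.normal(size=(T, d, d))

def predict(x, t):
    """Linear term plus second-order feature interactions for task t."""
    return W[t] @ x + x @ Q[t] @ x

x = rng.normal(size=d)
y = predict(x, 0)  # scalar prediction for task 0
```

Under this sketch, a "shared interaction" assumption would constrain all slices Q[t] to be equal, while an "embedded interaction" assumption would constrain them to lie in a common low-dimensional subspace.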