
KDD Papers

On Sampling Strategies for Neural Network-based Collaborative Filtering

Ting Chen (University of California, Los Angeles); Yizhou Sun (University of California, Los Angeles); Yue Shi (Yahoo! Research); Liangjie Hong (Etsy Inc.)


Abstract

Recent advances in neural networks have inspired hybrid recommendation algorithms that leverage both (1) user-item interaction information and (2) content information such as images, audio, and text, without tedious feature engineering. Despite their promising results, neural network-based recommendation algorithms incur substantial computational costs, making them hard to scale and improve upon.

In this paper, we propose a general neural network-based recommendation framework that subsumes several existing state-of-the-art neural network-based recommendation algorithms, and we address the efficiency issue by investigating sampling strategies for the stochastic gradient descent algorithm used to train the framework. We first establish a connection between the loss functions and the user-item interaction bipartite graph, where loss functions are defined on links while the costly computation is on nodes. Based on this insight, we propose three novel node-based sampling strategies that significantly improve the training efficiency of the framework (up to a $30\times$ speedup in our experiments) while also improving recommendation performance. Theoretical analysis is provided for both the computational cost and the convergence. We believe our study of sampling strategies has further implications for general graph-based loss functions and will enable more research under the neural network-based recommendation framework.
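The abstract does not spell out the three sampling strategies themselves, but its core insight, that loss terms live on the links of the user-item bipartite graph while the expensive computation (e.g., content-encoder forward passes) lives on the nodes, can be illustrated with a minimal Python sketch. Everything below (the toy interaction list and the link_based_batch and node_based_batch helpers) is hypothetical and only contrasts the two sampling regimes; it is not the paper's actual algorithm.

import random
from collections import defaultdict

# Toy user-item interactions: loss terms are defined on these links,
# but the costly computation (node encodings) is on the nodes.
links = [("u1", "i1"), ("u1", "i2"), ("u2", "i2"), ("u3", "i1"), ("u3", "i3")]

def link_based_batch(links, batch_size):
    """Baseline: sample links uniformly. The distinct nodes touched by
    the batch can approach 2 * batch_size, so almost every loss term
    pays for two fresh node encodings."""
    batch = random.sample(links, batch_size)
    users = {u for u, _ in batch}
    items = {i for _, i in batch}
    return batch, users, items

def node_based_batch(links, num_users):
    """Node-based alternative (illustrative only): sample a few user
    nodes, then take all of their links. Each sampled node is encoded
    once, and that cost is amortized over every loss term (link) the
    node participates in."""
    by_user = defaultdict(list)
    for u, i in links:
        by_user[u].append((u, i))
    sampled_users = random.sample(list(by_user), num_users)
    batch = [link for u in sampled_users for link in by_user[u]]
    users = set(sampled_users)
    items = {i for _, i in batch}
    return batch, users, items

random.seed(0)
lb, lu, li = link_based_batch(links, 3)
nb, nu, ni = node_based_batch(links, 2)
print(f"link-based : {len(lb)} loss terms, {len(lu) + len(li)} node encodings")
print(f"node-based : {len(nb)} loss terms, {len(nu) + len(ni)} node encodings")

Under this accounting, node-based sampling lowers the number of encodings per loss term in a mini-batch, which is the kind of savings that makes the reported speedups plausible when node encodings dominate training cost.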

