KDD 2020 Accepted Papers

FedFast: Going Beyond Average for Faster Training of Federated Recommender Systems

Khalil Muhammad, Qinqin Wang, Diarmuid O'Reilly-Morgan, Elias Tragos, Barry Smyth, Neil Hurley (Insight Centre for Data Analytics, University College Dublin); James Geraci (Samsung Electronics); Aonghus Lawlor (Insight Centre for Data Analytics, University College Dublin)



Federated learning (FL) is quickly becoming the de facto standard for the distributed training of deep recommendation models, using on-device user data and reducing server costs. In a typical FL process, a central server tasks end-users with training a shared recommendation model using their local data. The local models are trained over several rounds on the users' devices, and the server combines them into a global model, which is sent back to the devices to provide recommendations. Standard FL approaches use randomly selected users for training at each round and simply average their local models to compute the global model. The resulting federated recommendation models require significant client effort to train and many communication rounds before they converge to satisfactory accuracy. Users are left with poor-quality recommendations until the late stages of training. We present a novel technique, FedFast, that accelerates distributed learning and achieves good accuracy for all users very early in the training process. We achieve this by sampling from a diverse set of participating clients in each training round and applying an active aggregation method that propagates the updated model to the other clients. Consequently, with FedFast, users benefit from far lower communication costs and more accurate models that can be consumed at any time during training, even at the very early stages. We demonstrate the efficacy of our approach across a variety of benchmark datasets and in comparison with state-of-the-art recommendation techniques.
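To make the training loop described in the abstract concrete, below is a minimal, self-contained Python sketch of one FedFast-style round. It is an illustration based only on the abstract, not the paper's exact procedures: the choice of k-means over client model vectors as the diversity-aware sampler, the least-squares local objective, the function names, and the rule for pushing the aggregated update to non-participating clients are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)


def kmeans(points, k, iters=20):
    """Plain k-means clustering; returns a cluster label per point."""
    centers = points[rng.choice(len(points), size=k, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        for c in range(k):
            members = points[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels


def sample_diverse_clients(client_models, k):
    """Cluster clients by their current model vectors and pick one
    representative per non-empty cluster (diversity-aware sampling)."""
    labels = kmeans(np.stack(client_models), k)
    picks = []
    for c in range(k):
        members = np.where(labels == c)[0]
        if len(members):
            picks.append(int(rng.choice(members)))
    return picks


def local_update(model, data, lr=0.1, epochs=3):
    """Stand-in for on-device training: a few gradient steps on a
    least-squares objective (a real system would train the rec model)."""
    x, y = data
    for _ in range(epochs):
        grad = x.T @ (x @ model - y) / len(y)
        model = model - lr * grad
    return model


def fedfast_round(global_model, client_models, client_data, k):
    """One round: diverse client sampling, local training, FedAvg-style
    averaging, then actively propagating the update to non-participants."""
    selected = sample_diverse_clients(client_models, k)
    for i in selected:
        client_models[i] = local_update(global_model.copy(), client_data[i])
    new_global = np.mean([client_models[i] for i in selected], axis=0)
    delta = new_global - global_model
    # "Active" aggregation step (rule assumed for illustration): clients
    # that sat out this round still receive the aggregated update, so
    # everyone benefits immediately rather than waiting to be sampled.
    for i in range(len(client_models)):
        if i not in selected:
            client_models[i] = client_models[i] + delta
    return new_global


# Toy usage: 20 clients share a linear model; 5 clusters per round.
d, n = 8, 20
true_w = rng.normal(size=d)
client_data = []
for _ in range(n):
    x = rng.normal(size=(30, d))
    client_data.append((x, x @ true_w + 0.1 * rng.normal(size=30)))
global_model = np.zeros(d)
client_models = [0.01 * rng.normal(size=d) for _ in range(n)]
for r in range(10):
    global_model = fedfast_round(global_model, client_models, client_data, k=5)
```

The propagation step is what distinguishes this sketch from plain FedAvg: in FedAvg only the sampled clients' states change, whereas here every client's local model moves with the global update each round, which is one plausible reading of how FedFast delivers usable accuracy to all users early in training.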
