
Effective Evaluation using Logged Bandit Feedback from Multiple Loggers

Aman Agarwal (Cornell University); Soumya Basu (Cornell University); Tobias Schnabel (Cornell University); Thorsten Joachims (Cornell University)


Abstract

Accurately evaluating new policies (e.g. ad-placement models, ranking functions, recommendation functions) is one of the key problems in improving interactive systems. While the conventional approach to evaluation relies on online A/B tests, recent work has shown that counterfactual estimators can provide an inexpensive and fast alternative, since they can be applied offline using log data that was collected from a different policy fielded in the past. In this paper, we address the question of how to estimate the performance of a new policy when we have log data from multiple historic policies. This question is of great relevance in practice, since policies get updated frequently in most online systems. We show that naively combining data from multiple logging policies is highly suboptimal. In particular, we find that the standard Inverse Propensity Score (IPS) estimator suffers especially when logging and evaluation policies diverge—to a point where throwing away data improves the variance of the estimator. We therefore propose two alternative estimators which we characterize theoretically and compare experimentally. We find empirically that the new estimators can provide substantially improved estimation accuracy.
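To make the abstract's reference to the standard Inverse Propensity Score (IPS) estimator concrete, here is a minimal illustrative sketch (not taken from the paper). It shows the naive approach of pooling samples from several logging policies into one IPS estimate, alongside one plausible per-logger combination. The data layout, the `target_prob` interface, and the inverse-variance weighting are assumptions for illustration only, and the weighted variant is not necessarily either of the estimators proposed by the authors.

```python
import numpy as np

def ips_estimate(logs, target_prob):
    """Standard IPS estimate of a new policy's expected reward.

    logs        : list of (x, a, r, p0) tuples, where p0 is the propensity of
                  action a under whichever policy logged that sample.
                  Samples may be naively pooled from several loggers.
    target_prob : target_prob(x, a) -> probability that the new policy would
                  choose action a in context x (hypothetical interface).
    """
    terms = [target_prob(x, a) / p0 * r for (x, a, r, p0) in logs]
    return float(np.mean(terms))

def per_logger_weighted_ips(logs_by_logger, target_prob):
    """Inverse-variance weighted combination of per-logger IPS estimates.

    Illustrative only: one simple way to account for which logger produced
    each sample, NOT the paper's proposed estimators.
    """
    means, variances = [], []
    for logs in logs_by_logger:
        terms = np.array([target_prob(x, a) / p0 * r for (x, a, r, p0) in logs])
        means.append(terms.mean())
        # Variance of this logger's estimate: sample variance divided by n.
        variances.append(terms.var(ddof=1) / len(terms) + 1e-12)
    w = 1.0 / np.array(variances)
    return float(np.dot(w, np.array(means)) / w.sum())
```

The contrast between the two functions mirrors the abstract's point: the naive pooled estimate treats all samples identically regardless of which logging policy produced them, whereas a logger-aware combination can down-weight data from loggers whose policies diverge strongly from the evaluation policy.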

