
Learning economic parameters from revealed preferences. (English) Zbl 1406.91095

Liu, Tie-Yan (ed.) et al., Web and internet economics. 10th international conference, WINE 2014, Beijing, China, December 14–17, 2014. Proceedings. Cham: Springer (ISBN 978-3-319-13128-3/pbk). Lecture Notes in Computer Science 8877, 338-353 (2014).
Summary: A recent line of work, starting with E. Beigman and R. Vohra [“Learning from revealed preference”, in: Proceedings of the 7th ACM conference on electronic commerce, EC’06. New York, NY: Association for Computing Machinery. 36–42 (2006; doi:10.1145/1134707.1134712)] and M. Zadimoghaddam and A. Roth [“Efficiently learning from revealed preference”, Lect. Notes Comput. Sci. 7695, 114–127 (2012; doi:10.1007/978-3-642-35311-6_9)], has addressed the problem of learning a utility function from revealed preference data. The goal is to use past data describing the purchases of a utility-maximizing agent faced with certain prices and budget constraints in order to produce a hypothesis function that accurately forecasts the agent's future behavior.
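In symbols (the notation here is ours, for illustration only): each observation records the prices, the budget, and the bundle the agent purchased,
\[
x^{(t)} \in \operatorname*{arg\,max}_{x \ge 0,\; p^{(t)} \cdot x \le B^{(t)}} u(x), \qquad t = 1, \dots, m,
\]
and from the samples \((p^{(t)}, B^{(t)}, x^{(t)})\) the learner must output a hypothesis predicting the bundle the agent would buy at fresh prices and budgets, up to a small error rate \(\epsilon\).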
In this work we advance this line of research by providing sample complexity guarantees and efficient algorithms for a number of important classes. By drawing a connection to recent advances in multi-class learning, we provide a computationally efficient algorithm with tight sample complexity guarantees (\(\tilde{\Theta}(d/\epsilon)\) for the case of \(d\) goods) for learning linear utility functions under a linear price model. This solves an open question in [Zadimoghaddam and Roth, loc. cit.]. Our technique yields numerous generalizations, including the ability to learn other well-studied classes of utility functions, to handle a misspecified model, and to handle nonlinear prices.
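For intuition in the linear-utility, linear-price case with divisible goods: the agent's optimal bundle spends the entire budget on a good maximizing the value-per-price ratio \(v_i/p_i\), so each observation effectively reveals a "winning" good among the \(d\) candidates, which is what links the problem to multi-class learning. The following sketch is purely illustrative and is not the authors' algorithm; it simulates such data for a hidden weight vector (here called v_true) and fits a hypothesis consistent with the observed choices by solving the linear feasibility problem those choices impose.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, m = 5, 200                           # d goods, m observations
v_true = rng.uniform(1.0, 2.0, size=d)  # hidden linear utility weights (unknown to the learner)

def chosen_good(v, p):
    """With linear utility v.x and prices p, the optimal divisible bundle
    spends the whole budget on a good maximizing v_i / p_i; the budget only
    scales the quantity bought, not which good is chosen."""
    return int(np.argmax(v / p))

# Revealed-preference data: (prices, index of the purchased good).
data = []
for _ in range(m):
    p = rng.uniform(0.5, 1.5, size=d)
    data.append((p, chosen_good(v_true, p)))

# Fit a consistent hypothesis: find v_hat >= 0 with sum(v_hat) = 1 such that
# the observed good i beats every alternative j, i.e. v_i p_j - v_j p_i >= 0.
A_ub, b_ub = [], []
for p, i in data:
    for j in range(d):
        if j != i:
            row = np.zeros(d)
            row[i], row[j] = -p[j], p[i]   # encodes -(v_i p_j - v_j p_i) <= 0
            A_ub.append(row)
            b_ub.append(0.0)

res = linprog(c=np.zeros(d), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=np.ones((1, d)), b_eq=[1.0], bounds=[(0, None)] * d)
v_hat = res.x
print("normalized true v:", v_true / v_true.sum())
print("fitted v_hat     :", v_hat)

Note that v_hat is only constrained by the choices actually observed, so it agrees with v_true on those observations but need not coincide with it; the paper's sample complexity bound quantifies how many observations suffice before such a consistent hypothesis also predicts well on fresh prices and budgets.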
For the entire collection see [Zbl 1302.68013].

MSC:

91B08 Individual preferences
91B16 Utility theory
91A26 Rationality and learning in game theory
91-04 Software, source code, etc. for problems pertaining to game theory, economics, and finance