I recently looked at the paper A Revealed Preference Approach to Computational Complexity in Economics by Federico Echenique, Daniel Golovin and Adam Wierman that attempts to redirect the study of computational complexity in economic settings.

In the usual models we now have in AGT/E, the players are assumed to *have* some kind of preferences, and then the “equilibrium” is expected to be with respect to these preferences. However, the assumption in economics may be taken to be more subtle: it is that players behave *as if* they have preferences. I.e., if you look at their behavior, then you may deduce from it some revealed preference that can explain it. While abstractly the notion of revealed preferences does indeed seem easier to swallow than believing that people *really* have preferences, and much discussion in economic thought seems to have gone into this difference, I must admit that I never gave it much thought.

If we take the view that people *have* preferences, then the natural computational complexity problem to study is that of the task “INPUT: preferences; OUTPUT: equilibrium according to these preferences”. On the other hand, if you take the weaker revealed-preferences point of view, then this paper suggests that the natural complexity to study is “INPUT: player behavior; OUTPUT: equilibrium according to the revealed preferences”. This may be a much easier task!

The paper starts with the following basic demonstration in the context of consumer choice: there are infinitely divisible goods, and a consumer has a utility function $u$ specifying his utility for every bundle of goods. Importantly, we are not assuming anything about the utility function except that it is weakly monotone in each coordinate (free disposal). Given a vector of prices $p$ and a budget $b$, the consumer’s demand will be the bundle $x$ that maximizes $u(x)$ subject to $p \cdot x \le b$. Looking at this in the old way, the optimization problem of calculating the demand is hard: without any further structure on $u$, an exponential number of queries will be needed. However, they suggest the following way to look at it: our input should be a set of responses that we have seen from the user. Each response is a triplet $(p_i, b_i, x_i)$, where $x_i$ is the bundle demanded by our player when presented with prices $p_i$ and budget $b_i$. Given these responses we should optimize for any $u$ that is consistent with these previous responses — the revealed utility of the player. This problem turns out to be easy as, by Afriat’s theorem, the revealed preferences may always be taken to be a concave function, for which optimization is efficient using convex programming.
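To make the computational side concrete, here is a small sketch (my own illustration, not code from the paper) of the consistency check underlying Afriat’s theorem: observed demand data is rationalizable by *some* monotone concave utility exactly when it satisfies GARP (the Generalized Axiom of Revealed Preference), and GARP can be checked in polynomial time. I simplify the triplets $(p_i, b_i, x_i)$ to (prices, bundle) pairs, assuming the monotone consumer spends the full budget, so $b_i = p_i \cdot x_i$.

```python
def dot(p, x):
    """Inner product of a price vector and a bundle."""
    return sum(pi * xi for pi, xi in zip(p, x))

def satisfies_garp(observations):
    """observations: list of (prices, bundle) pairs, one per observed response.

    Returns True iff the data admits a rationalizing utility (equivalently,
    by Afriat's theorem, a monotone concave one)."""
    n = len(observations)
    # Direct revealed preference: x_i R x_j if x_j was affordable
    # when x_i was chosen, i.e. p_i . x_i >= p_i . x_j.
    R = [[dot(p_i, x_i) >= dot(p_i, x_j)
          for (_, x_j) in observations]
         for (p_i, x_i) in observations]
    # Transitive closure (Floyd-Warshall style).
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    # GARP violation: x_i revealed preferred to x_j, yet x_i was
    # *strictly* cheaper than x_j's expenditure at prices p_j.
    for i in range(n):
        for j in range(n):
            p_j, x_j = observations[j]
            if R[i][j] and dot(p_j, x_j) > dot(p_j, observations[i][1]):
                return False
    return True
```

Given consistent data, one can then go on to construct the concave rationalizing utility explicitly (via the Afriat inequalities, solvable as a linear program) and optimize it efficiently; the check above is just the feasibility test.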

The authors give a few more sophisticated examples of such a difference and moreover argue that this difference is at the root of economists’ refusal to view excessive computational complexity as a critique of an equilibrium concept. The “usual” argument against taking worst-case complexity results seriously is that perhaps the hard instances never occur in reality. The usual response to such a claim is that it calls for identifying the special properties of real-world instances that make the problem easy. This paper suggests such a special property: revealed preferences may indeed be simpler than general ones.

on October 26, 2010 at 6:48 am | J

Interesting!

In your AGT paper with Amir Ronen the valuations are additive, and still it is not known how to close the gap there, so I guess that their approach is not a universal explanation.

on October 26, 2010 at 10:15 pm | a

Does their result imply that an equilibrium with respect to the revealed preferences can be found efficiently? (Even though they do it for a single player.)

Suppose there are just 2 players. Given the behavior of one player, the other player’s behavior is rationalizable if and only if it is so through a concave utility function… So it seems there is no need to do it for both players simultaneously.

on October 27, 2010 at 6:33 am | Haris Aziz

The following papers may also be of related interest:

* S. Kalyanaraman and C. Umans. The complexity of rationalizing matchings. Proceedings of the 19th International Symposium on Algorithms and Computation (ISAAC), pages 171-182, 2008.

* S. Kalyanaraman and C. Umans. The complexity of rationalizing network formation. Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 485-494, 2009.

Best,

on March 9, 2014 at 10:02 pm | Rigor + Relevance | A report from the Hot TIME Symposium

[…] you’re curious, you can check out a few papers we have on the topic. Also, Aaron Roth and Noam Nissan each wrote posts that nicely explain the […]