The Choices Are: Slick and Exponential, or Brute Force and Polynomial
Guest Post by Vijay Vazirani
A new, difficult problem, taken up within TCS twelve years ago, namely that of finding good algorithms for computing market equilibria, has stretched the holy grail of TCS, polynomial time solvability, to its very limits; so much so that today the two options listed in the title seem to be the only ones available! Let me quickly provide some background, say a bit about each of these options, and invite you to a discussion that, hopefully, goes right to the core tenets of our field.
The notion of market equilibrium, which asks for prices at which supply equals demand, is central to economics. The Arrow-Debreu Theorem (1954), which shows the existence of such prices for a realistic, complex model of the economy, is perhaps the most central result in economics. Yet this result has an unsatisfying aspect: the notion is inherently algorithmic (Walras, while defining it in 1874, also gave a mechanism, tâtonnement, for arriving at an equilibrium), whereas the Theorem is highly non-constructive. Several top mathematicians, including Scarf and Smale, have since given algorithms, but they are slow.
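To make Walras's price-adjustment idea concrete, here is a minimal sketch of tâtonnement for a toy exchange economy with Cobb-Douglas consumers: raise the prices of over-demanded goods, lower the rest, and repeat. All function names, parameters, and numbers below are illustrative choices of mine, and this toy dynamic is far from the general Arrow-Debreu setting.

```python
# Walras-style tatonnement for a toy 2-good exchange economy with
# Cobb-Douglas consumers.  All numbers are illustrative.

def excess_demand(p, alphas, endowments):
    """Aggregate excess demand z(p).  A Cobb-Douglas agent i spends
    the fraction alphas[i][j] of her income p . e_i on good j."""
    n_goods = len(p)
    z = [0.0] * n_goods
    for a, e in zip(alphas, endowments):
        income = sum(pj * ej for pj, ej in zip(p, e))
        for j in range(n_goods):
            z[j] += a[j] * income / p[j] - e[j]
    return z

def tatonnement(alphas, endowments, step=0.05, iters=2000):
    """Nudge each price in the direction of its excess demand,
    renormalising so that prices sum to 1."""
    n_goods = len(alphas[0])
    p = [1.0 / n_goods] * n_goods
    for _ in range(iters):
        z = excess_demand(p, alphas, endowments)
        p = [max(pj + step * zj, 1e-6) for pj, zj in zip(p, z)]
        total = sum(p)
        p = [pj / total for pj in p]
    return p

# Agent 0 mostly wants good 0 and owns one unit of it;
# agent 1 mostly wants good 1 and owns one unit of it.
alphas = [(0.7, 0.3), (0.2, 0.8)]
endowments = [(1.0, 0.0), (0.0, 1.0)]
p = tatonnement(alphas, endowments)
```

For this instance the process settles at prices (0.4, 0.6), at which excess demand vanishes; Cobb-Douglas utilities satisfy gross substitutability, one of the settings in which such dynamics are known to converge.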
Hence, bringing to bear powerful tools from our theory of algorithms was a fine goal. Work started with the easiest market models and gradually moved on to more realistic, complex models. At first, the primal-dual paradigm and interior point methods for convex programs did admirably well. However, when aspects of intractability showed up, attention shifted to the two options listed in the title.
“Slick” algorithms are in the style of the (pivoting-based) simplex algorithm, suitably enhanced to the setting of the linear complementarity problem; these are called complementary pivot algorithms, the classic Lemke-Howson algorithm (1964) for 2-player Nash equilibrium being one such. Such algorithms run very fast in practice (indeed, simplex (1947) is still the method of choice for most LP applications), but one can force them to take exponential time by intricately doctoring the instance.
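The complementarity condition that such pivot algorithms maintain is easy to state for 2-player Nash: every pure strategy played with positive probability must be a best response to the opponent's mixed strategy. The sketch below only checks this certificate for a given strategy pair; it is not the Lemke-Howson pivoting itself, and the names are mine.

```python
def is_nash(A, B, x, y, tol=1e-9):
    """Check the complementarity conditions for a bimatrix game (A, B):
    each pure strategy with positive weight in x (resp. y) must attain
    the maximum payoff against the opponent's mixed strategy."""
    m, n = len(A), len(A[0])
    # Expected payoff of each pure row strategy against y, and of each
    # pure column strategy against x.
    row_pay = [sum(A[i][j] * y[j] for j in range(n)) for i in range(m)]
    col_pay = [sum(B[i][j] * x[i] for i in range(m)) for j in range(n)]
    best_r, best_c = max(row_pay), max(col_pay)
    row_ok = all(x[i] < tol or row_pay[i] > best_r - tol for i in range(m))
    col_ok = all(y[j] < tol or col_pay[j] > best_c - tol for j in range(n))
    return row_ok and col_ok

# Matching pennies: the unique equilibrium mixes uniformly.
A = [[1, -1], [-1, 1]]
B = [[-1, 1], [1, -1]]
```

Here `is_nash(A, B, [0.5, 0.5], [0.5, 0.5])` holds, while any pure strategy pair violates complementarity.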
Let us say that an algorithm is “brute force” if it accomplishes the job of finding one solution by finding them all; it has no idea how to do any better. However, it does achieve polynomial running time if certain parameters are assumed constant, e.g., for CLIQUE restricted to graphs having cliques of size at most k, for a constant k, there is a trivial polynomial time algorithm.
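The trivial algorithm just mentioned can be sketched in a few lines: enumerate every vertex subset of size k and test whether it is a clique, which takes roughly O(n^k) subsets, polynomial only because k is treated as a constant rather than part of the input. The function name and the example graphs are mine.

```python
from itertools import combinations

def has_clique(adj, k):
    """Brute force: try every vertex subset of size k in a graph given
    by its adjacency matrix.  For constant k this is polynomial time,
    O(n^k) subsets with an O(k^2) check each, but it finds a clique
    only by implicitly examining all candidates."""
    vertices = range(len(adj))
    for subset in combinations(vertices, k):
        if all(adj[u][v] for u, v in combinations(subset, 2)):
            return True
    return False

# A 4-cycle 0-1-2-3-0: its largest clique is a single edge.
cycle4 = [[0, 1, 0, 1],
          [1, 0, 1, 0],
          [0, 1, 0, 1],
          [1, 0, 1, 0]]
```

On `cycle4`, `has_clique(cycle4, 2)` succeeds while `has_clique(cycle4, 3)` fails, after inspecting all four candidate triples.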
From the viewpoint of practicality, the former win hands down: even on moderate instance sizes, the latter take prohibitively long. However, there is another important criterion: the study of polynomial time algorithms has historically led to deep insights into the structure of the problems studied. Surprisingly enough, even on this count these exponential time algorithms do well, e.g., simplex leads to a proof of strong LP duality, and Shapley obtained deep structural properties of 2-Nash from the Lemke-Howson algorithm. So far, I am not aware of any comparable insights arising from brute force algorithms.
Smoothed analysis provides a fairly convincing theoretical explanation for the practical success of simplex. Surely there should be a more general theory of this sort, one that encompasses complementary pivot algorithms as well.