Vijay Vazirani has spent a considerable amount of time in the last few years on developing combinatorial algorithms for problems (mostly involving market equilibria) that can be solved by general convex programming. In this guest post Vijay talks about the motivation for this:
Since convex programs can be solved efficiently using “continuous” methods, such as ellipsoid or interior point methods, why bother designing extremely elaborate and difficult combinatorial algorithms for them? Let me propose the following thought experiment (it is not a difficult one!) to bring home the point. Think of a world in which these continuous methods were developed in the 1920s and, ever since, problems such as network flow and matching, which can be cast as linear programs, were routinely solved using such methods. Then in 1956, Ford and Fulkerson propose their beautiful combinatorial algorithm for max-flow and it is immediately trashed, since it is “needlessly complicated and difficult”. Edmonds’ matching algorithm, proposed in 1965, meets a similar fate. As a consequence, in this world, combinatorial optimization is a stillborn field. How much of a tragedy would that be? After all, matchings and max-flows could still be computed efficiently …
Observe that the continuous methods solve the instance at hand, without giving any insight into the entire problem from which the instance arose. In fact, such methods do not even need to be told the problem from which the instance arose — all that is needed is the specific linear or convex program for the given instance. So, in this world, theoreticians (if they can be called that) would not have the deep insights we have into the structure of numerous fundamental combinatorial problems. And you can gauge what kind of a “theory” of algorithms this world would have!
I believe that if our research community does not pursue the design of combinatorial algorithms for convex programs, our world would also lose out on a beautiful, deep theory. My lone work on this topic, some joint with my students, is too insignificant an effort in comparison with the ideas, structures and methods that still need to be unearthed. My inability to convince people so far, on what had seemed to me an obvious direction worth pursuing, is the reason for this post.
The convex programs I have studied over the last 8 years arose in AGT, in the context of market equilibria and Nash bargaining games. The theory so far has started forming around a remarkable convex program given by Eisenberg and Gale in 1959. In order to solve these nonlinear programs combinatorially, the classical primal-dual algorithm design paradigm had to be extended from its previous setting of LP-duality theory to the more general setting of convex programs and KKT conditions. The algorithms are non-trivial and require substantial structural insights. In turn, these structural insights provide a starting point for tackling more general problems. To get an idea of how rich the theory already is, consider the following episode. A few months ago, Gagan Goel and I started designing a combinatorial algorithm for a certain Nash bargaining game under piecewise-linear, concave utility functions; the linear case had already been solved. Out of the structural insights gained, we managed to define a new, natural market that models perfect price discrimination and in which buyers have piecewise-linear, concave utility functions (the usual Fisher and Arrow-Debreu markets were recently shown to be PPAD-complete under these utility functions). The equilibrium of our market is captured by a generalization of the Eisenberg-Gale program and is polynomial time computable — even combinatorially. In addition, the convex program yields very simple proofs of both welfare theorems for this market!
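For readers unfamiliar with it, the Eisenberg-Gale program for the linear-utility Fisher market — buyers with budgets $m_i$, utility $u_{ij}$ per unit of good $j$, and unit supply of each good — can be sketched as:

```latex
\begin{align*}
\max \quad & \sum_i m_i \log u_i \\
\text{s.t.} \quad & u_i = \sum_j u_{ij} x_{ij} && \forall i \\
& \sum_i x_{ij} \le 1 && \forall j \\
& x_{ij} \ge 0 && \forall i, j
\end{align*}
```

Remarkably, the Lagrange multipliers of the supply constraints turn out to be exactly the market's equilibrium prices.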
It is important to point out immediately that my work is limited to a very thin sliver of convex programs — despite their nonlinearity, these programs admit rational solutions. How about attacking more general classes of convex programs combinatorially, perhaps via approximation algorithms, since they will not admit rational solutions? Considering how extensive the theory already is, I believe it cannot exist in isolation and is indicative of the tip of an iceberg. We need a lot more smart people to ponder these issues and find the rest of the iceberg!
Solving the convex programs combinatorially and solving the problems behind them without using the convex program are two separate things.
The linear utility market equilibrium problem in the Fisher setting has a (pseudo-)combinatorial algorithm and, coincidentally, it also happens to have a convex program. The convex program was not solved combinatorially, in the sense that no property of it was used.
Also, capturing the problem by a convex program already implies that a lot of the structure of the problem has been found, and often this fact itself is quite insightful. It proves theorems, just as capturing max-flow by a linear program implies the max-flow min-cut theorem. Pretty much every basic fact about the linear-utility Fisher problem can be derived from the Eisenberg-Gale convex program, and the program gives much more insight than all the known (pseudo-)combinatorial algorithms together. For example, the program tells us what the world’s utility function is in such a setting. It also tells us how to combine the utility functions of two people.
It also tells us how to generalize computability to other settings; e.g., it shows that scalable utility functions are also easy and have a nice primal-dual structure. One can likewise derive the max-flow min-cut theorem from the fact that max-flow is captured by a linear program. On the other hand, I do not know of any basic structural theorem about the linear-utility Fisher setting that is implied by the known combinatorial algorithms and was not already derived from the Eisenberg-Gale program or the known tatonnement processes.
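To spell out the max-flow example: in the standard path formulation, the max-flow LP and its dual are

```latex
\begin{align*}
\max \; \sum_{P} f_P \quad &\text{s.t.} \quad \sum_{P \ni e} f_P \le c_e \;\; \forall e, \qquad f \ge 0 \\
\min \; \sum_{e} c_e y_e \quad &\text{s.t.} \quad \sum_{e \in P} y_e \ge 1 \;\; \forall \text{ $s$--$t$ paths } P, \qquad y \ge 0
\end{align*}
```

LP duality gives max-flow = minimum fractional cut, and the existence of an integral optimal dual solution — the indicator vector of a minimum cut — is exactly the max-flow min-cut theorem.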
I am not saying one should not look for combinatorial algorithms. Often, unlike “strongly polynomial,” “combinatorial” is a notion of flavor and taste, and not a strict mathematical notion. This is so in a good sense, since mathematics has a notion of elegance.
Case in point: I do not entirely believe the primal-dual algorithm for the linear case in the Fisher setting is combinatorial. It has steps that are basically similar to those used in analog algorithms. To me the simplex algorithm is combinatorial, and the known primal-dual algorithm for linear utilities in the Fisher setting looks more like an interior-point algorithm and less like a simplex algorithm.
Check out
Click to access plc.pdf
which uses combinatorial insights derived from linear Fisher paper to settle the complexity of piecewise-linear, concave Fisher and Arrow-Debreu markets.
Check out Sections 1.3 and 17.1 in
Click to access NBalg.pdf
for a discussion of what a “combinatorial” algorithm is and Lemma 6 for the combinatorial object the algorithm is looking for.
Check out
Click to access spending.pdf
which generalizes the linear Fisher combinatorial algorithm to a market that itself generalizes linear Fisher and is not known to admit a convex program.
Finally, check out Section 4 in
Click to access market.pdf
and Section 10 in
Click to access NBalg.pdf
to see how the KKT conditions of the corresponding convex programs are first relaxed and gradually tightened in the combinatorial algorithm.
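For concreteness, in the linear-utility Fisher case (with prices $p_j$ as the multipliers of the supply constraints), the KKT conditions being relaxed and tightened are:

```latex
\begin{align*}
& \forall j: \; p_j > 0 \;\Rightarrow\; \sum_i x_{ij} = 1 \\
& \forall i, j: \; \frac{u_{ij}}{p_j} \le \frac{\sum_k u_{ik} x_{ik}}{m_i}, \quad \text{with equality if } x_{ij} > 0
\end{align*}
```

In words: each buyer spends her entire budget only on goods of maximum bang-per-buck $u_{ij}/p_j$, and every positively priced good is fully sold. The primal-dual algorithms maintain some of these conditions as invariants and work toward satisfying the rest.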
In a different direction, check out Noam’s Sept 10, 2009, blog to see how combinatorial algorithms for market equilibria were an “inspiration” for his auction for TV ads designed at Google and currently operational there:
https://agtb.wordpress.com/?s=copenhagen
I am just trying to argue against your statements, in which you seem to be trashing convex-programming-based algorithms.
Being able to write a linear or convex program is itself the hard part, and if you are able to apply a big hammer or a generic approach to a problem, that is because you have discovered a fundamental property of the problem or gained insight. Further, this process goes both ways, since once you are able to apply a generic approach, it can lead to additional theorems.
Any new algorithm is insightful. Selling a combinatorial algorithm on the basis that it did not use an LP/CP diminishes the combinatorial approach. The right way to sell it is: this combinatorial algorithm is really simple, has a fast or strongly polynomial running time, or proves an additional theorem constructively, etc.
If we do not stick to this, then in some problems an LP/CP solver could be employed under the cover: instead of calling the solver as a subroutine, we could just unravel it for the problem at hand and call the result combinatorial.
When one reads the combinatorial algorithm for linear utilities in the Fisher setting, one cannot say for sure that this is not happening there. Unfortunately, the simplex-like algorithm you had earlier did not turn out to be polynomial time. Calling the eventual algorithm combinatorial diminishes the motivation to find a polynomial-time version of the original, simplex-like algorithm you discovered.
What is different about the simplex algorithm compared with other LP solvers? Simplex does not use the geometry of the numbers.
Why were max-flow algorithms useful? Because they were faster when discovered. They were also insightful because they did not use the geometry of the numbers. Not using the geometry of the numbers forces us to use or discover the combinatorial structure of the problem. But it is ultimately the combinatorial properties discovered that should be the selling point.
Again, who is getting insight from where is somewhat subjective. I still do not know whether you can mention any insight about the example you cited, i.e., linear utilities in the Fisher setting. Is there a specific, concrete insight you could mention?
On the other hand, some of the current work of a colleague, which goes somewhat in the reverse direction, is remarkably insightful.
Again, in more detail, new insights from combinatorial linear Fisher algorithm and its use in understanding complexity of piecewise-linear, concave (plc) utilities markets:
Lemma 5.2 in
Click to access Nisan_Non-printable.pdf
gives a combinatorial characterization of equilibrium prices. It is generalized in Lemma 1 in
Click to access plc.pdf
to the plc case. In turn, this leads to an LP for checking this characterization (Section 4), which leads to a proof of rationality for this case (Theorems 2 and 3).
Further, the LP is used crucially in establishing membership in PPAD (Lemma 10 and Theorem 4). It is worth noting that this proof is different from the proofs of membership in PPAD for other problems, and that a previously claimed proof of membership for plc was retracted.
For a more balanced treatment of combinatorial and continuous methods, in the absence of space restrictions, see Section 1.3 in
Click to access NBalg.pdf
Hi,
I have two high-level questions. They may sound irrelevant or stupid, but it would be great if anyone could kindly answer them.
What do you mean by “structure”? I think I know what people mean when they talk about it. But could anyone explain what it really means here, as clearly and as formally as possible?
Second, have people already understood the structure (whatever it means) of linear problems? It seems to me that for some linear problems, what people try to do is fit them into LPs, yet for some others those simple and elegant combinatorial algorithms just work, for their own specific reasons (or “insights”). In general, does humankind understand what is really going on even for linear problems?
All I am saying is that lowering the importance of CP/LP-based algorithms is not the way to sell combinatorial algorithms.
Just like any other work, a combinatorial algorithm should make its own case of merit.
Economics has this notion of tatonnement. Five years ago, together with other co-inventors, I also used market equilibrium to design auctions. Some of this work is published by the USPTO. There is no surprise here, since a market equilibrium is an outcome of competition when each agent is assumed to be infinitely small.
We used the inspiration of the tatonnement process, i.e., goods with more demand than supply tend to go up in price. Unfortunately, we did not get any inspiration from all the layers that prove polynomiality, and, as I said, those layers are not even objectively combinatorial.
There are uses of those layers in proving the robustness of auctions, but I have not seen any published results yet.
Until we use the extra machinery we develop for polynomiality, we are not adding any value on top of what economics already has. Economists already have some of the inspirations our algorithms would provide, though without polynomiality.
So the right framing could be that we use the inspiration of the tatonnement process both in designing the polynomial-time algorithms and then again in designing the auction.
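The tatonnement intuition mentioned above can be sketched in a few lines. Note the hedges: this toy uses Cobb-Douglas spending (each buyer spends a fixed fraction of her budget on each good), not the linear utilities discussed in the thread, precisely because Cobb-Douglas demand is single-valued and the naive price-update dynamic converges; the function name and step parameters are illustrative choices, not anyone's published algorithm.

```python
# Toy tatonnement for a Fisher market with Cobb-Douglas buyers.
# This is an illustrative sketch, NOT the linear-utility algorithm
# discussed above: Cobb-Douglas demand is single-valued, so the
# naive "raise over-demanded prices" dynamic converges cleanly.

def tatonnement(budgets, alphas, steps=5000, eta=0.05):
    """alphas[i][j]: fraction of budget i spent on good j (rows sum to 1).
    Each good has unit supply. Returns the final price vector."""
    n_goods = len(alphas[0])
    prices = [1.0] * n_goods
    for _ in range(steps):
        for j in range(n_goods):
            # Cobb-Douglas demand for good j at current prices:
            # total money chasing good j, divided by its price.
            demand = sum(a[j] * m for a, m in zip(alphas, budgets)) / prices[j]
            # Goods with more demand than supply go up in price.
            prices[j] += eta * (demand - 1.0)
    return prices

# Two buyers with unit budgets; for Cobb-Douglas, equilibrium prices
# equal the total money spent on each good: p_j = sum_i alphas[i][j] * m_i,
# so here they should approach [0.75, 1.25].
p = tatonnement([1.0, 1.0], [[0.5, 0.5], [0.25, 0.75]])
```

The point of the sketch is only the update rule in the inner loop — price moves in the direction of excess demand — which is the tatonnement inspiration the comment refers to.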
My own personal feeling is that if something is not yielding a polynomial-time algorithm, it may lack some essential property when put to use. Therefore (in support of Vijay’s point), I strongly believe that if market equilibrium is used to design an auction, one should use the balanced flow to pick the right allocation. Other allocations might be fragile.
But then this should be the selling point of the work, and not “oh, we did not use an LP/CP solver,” because LP/CP is a feature and not a bug.
I am moved by your words, Vijay, especially this part:
“My lone work on this topic, some joint with my students, is too insignificant an effort in comparison with the ideas, structures and methods that still need to be unearthed. ”
I feel that way all the time. My specialty has been differential algebra, but I keep pushing myself in new directions all the time. I tried my own hand at modelling global warming recently. But my true passion — equal to solving differential equations in order to go after the most important prize of all, the million dollar Peta X-Meat Prize to synthesize meat and protein without growing and killing a whole animal — is the mathematical modelling of justice. I am trying to wrap my mind around how to describe the notion of players being “forced” to do something, or being forced not to do something (i.e., by having their choices limited) by other players, as well as how to model each player deciding for themselves how to model “causation”: who causes what to happen.