A few days ago I returned from Beijing, where I attended the Innovations in Computer Science (ICS 2011) conference. The most remarkable thing about this conference for me was its venue: ITCS at Tsinghua University in Beijing. I had never been to China before, and my expectation was to see an awakening giant; very soon it was clear that the giant is already completely awake. Beijing basically has the same look and feel as a huge Western metropolis: skyscrapers, large roads, traffic jams, people dressing and behaving much as they do in the West, cars looking more or less the same (though the driving is rougher, say southern-European style), and an abundance of Western-style cafés and fast-food places (OK, there are somewhat more Chinese restaurants than usual). Tsinghua University also felt very much like the old and grand universities that I'm used to seeing in Europe or the USA in terms of architecture, vegetation, and students. Frankly, I was at first disappointed by the lack of any "foreign" or "developing" feel to the place. However, there were compensations: I definitely felt the awe of a country boy coming to the big city, an awe I used to feel when arriving in cities like NYC, Paris, Rome, or London, but have since lost (I wonder whether this is due to my own desensitization or whether their grandeur has really dimmed).

The people seem open and friendly, and getting along as a tourist is very easy. Westerners seem to draw no special attention from anyone, and even Uri Zwick speaking in Chinese (which he has been studying for years) did not seem to draw any particular attention (except from the other conference attendees). Personally, I was especially delighted by the beautiful parks in which, despite the biting cold weather, there were groups of people everywhere dancing, singing, playing Chinese hacky sack, and engaging in other social activities.
Obviously I got to see only a small part of Beijing (the Tsinghua area and a few of the usual tourist attractions), not to mention the rest of China, but I got a distinct sense that people look happier than I'm used to seeing elsewhere.

With that said, let me now say a few words about the (extremely well run) conference itself (which paid for airfare and accommodations of *all* speakers). Certainly after the many discussions of this new conference last year (e.g. here, here, here, here, and here), it is natural to ask about the quality of the papers as well as about their conceptual character. My judgment of both is mixed: in terms of quality I would compare ICS to the main European theory conferences (ICALP, ESA): all papers do have something new and significant, but only a few seem very strong (as many papers in FOCS/STOC, or even in SODA/EC, are). Not bad, I'd say, for the second year of existence and for a non-North-American conference. In terms of the "innovative" character expected of the papers, one does get a clear feeling that most papers (but still far from all of them) do try to advance some conceptual message, and make it explicit in the write-up as well as in the presentation, but in most cases this message is still tied to a usual type of "result". (An example of a truly conceptual talk that much appealed to me was Ueli Maurer's — joint work with Renato Renner — advocating an algebraic abstraction for the foundations of cryptography.) Perhaps one can see a theme common to many papers in the conference: a very interesting question, but only a mildly interesting answer.

ICS had a large number of AGT-related papers, two invited talks related to AGT, as well as a panel on AGT.

Gil Kalai's invited (highly conceptual) talk on "influence and noise" started with the classic KKL work on influences and reached some of his recent results on approximation in computational social choice. Gil also wrote about ICS on his blog.

Silvio Micali's invited talk described his paper with Jing Chen, Conservative Rationalizability and The Second-Knowledge Mechanism. This work considers a simple auction of a single item in a setting with private values where players have some information about each others' values. The authors carefully choose a non-Bayesian modeling and suggest and motivate a very robust ("conservative") notion of implementation which is worst case and takes into account only a single level of iteration in the removal of dominated strategies. Significantly, this notion does not attempt to pinpoint an equilibrium but rather reaches the implementation goal under any reasonable profile of strategies. I really liked the notions used and their argumentation and motivation, but the mechanism obtained was itself rather unilluminating, basically asking players to tell all about each other, rewarding them for information, and severely penalizing anyone caught cheating. But as Silvio argued: "all is fair in Love and Mechanism Design".

An important algorithmic result regarding game theory was the paper "Strategy iteration is strongly polynomial for 2-player turn-based stochastic games with a constant discount factor" by Thomas Dueholm Hansen, Peter Bro Miltersen, and Uri Zwick. They consider the long-standing open problem of computing the value of stochastic zero-sum games played on graphs, and provide the first strongly polynomial-time algorithm for the "constant-discount" case. The problem without discounts (or with the discount factor being part of the input) is still open and is one of the few problems known to lie in NP ∩ co-NP but not known to have a polynomial-time algorithm.
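To give a flavor of the model (this is *not* the paper's strategy-iteration algorithm, just a toy sketch with made-up numbers): in a turn-based discounted stochastic game, each state is controlled by either the maximizer or the minimizer, and the value vector is the fixed point of a Bellman-style operator, which simple value iteration converges to at rate governed by the discount factor.

```python
# Toy sketch: value iteration for a tiny 2-player turn-based discounted
# stochastic game. Hansen, Miltersen, and Zwick analyze *strategy*
# iteration; this simpler fixed-point iteration only illustrates the model.
# All numbers below are made up for illustration.

GAMMA = 0.8  # constant discount factor

# State 0 is controlled by the maximizer, state 1 by the minimizer.
# rewards[s][a] is the immediate reward for action a in state s;
# trans[s][a][t] is the probability of moving to state t.
rewards = [[1.0, 0.0], [0.0, 2.0]]
trans = [
    [[0.5, 0.5], [0.0, 1.0]],  # state 0, actions 0 and 1
    [[1.0, 0.0], [0.5, 0.5]],  # state 1, actions 0 and 1
]
controller = ["max", "min"]

def bellman(v):
    """One application of the Shapley / value-iteration operator."""
    new_v = []
    for s in range(2):
        q = [rewards[s][a] + GAMMA * sum(trans[s][a][t] * v[t] for t in range(2))
             for a in range(2)]
        new_v.append(max(q) if controller[s] == "max" else min(q))
    return new_v

v = [0.0, 0.0]
for _ in range(200):  # the operator is a GAMMA-contraction, so this converges fast
    v = bellman(v)

print([round(x, 4) for x in v])  # → [3.5714, 2.8571]
```

The number of iterations value iteration needs grows as the discount factor approaches 1, which is one way to see why the undiscounted case is so much harder; the paper's point is that strategy iteration needs only a strongly polynomial number of steps when the discount is a fixed constant.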

One of my favorite papers in the conference was "Is Submodularity Testable?" by C. Seshadhri and Jan Vondrak. Their starting point is that there are many algorithms — for various tasks on sub-modular functions (including combinatorial auction allocation) — that run in time sublinear in the representation of the function, often polynomial in the description length of items in the domain. The fact that these algorithms work for sub-modular functions (but not for all functions) suggests that this sub-modularity "is noticed" in sub-linear time; so they ask the basic question of whether sub-modularity can be tested in sub-linear time, in the usual property-testing sense. Their results are clever and insightful but much weaker than one would desire — which exactly suggests that there is some interesting and important structure there yet to be found. As I tend to view sub-modularity as one level in a hierarchy of "complement-freeness", I find similar questions to be of interest at other levels of the hierarchy as well.
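For concreteness, here is the property being tested, stated as code (a brute-force check, not Seshadhri and Vondrak's tester — their whole question is whether far fewer queries suffice; the coverage-function example is mine):

```python
import itertools

def is_submodular(f, ground_set):
    """Exhaustively verify f(S | T) + f(S & T) <= f(S) + f(T) for all S, T.

    This makes exponentially many queries; a property tester would instead
    query f at only a few (randomly chosen) points.
    """
    subsets = [frozenset(c) for r in range(len(ground_set) + 1)
               for c in itertools.combinations(ground_set, r)]
    return all(f(S | T) + f(S & T) <= f(S) + f(T) + 1e-9
               for S in subsets for T in subsets)

# Coverage functions are a classic submodular family: f(S) = size of the
# union of the sets indexed by S.
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}}
coverage = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0

print(is_submodular(coverage, {1, 2, 3}))               # → True
print(is_submodular(lambda S: len(S) ** 2, {1, 2, 3}))  # → False (supermodular)
```

The testing question is then: given query access to f, can one distinguish submodular functions from functions far from submodular with a number of queries sublinear in the (exponential-size) truth table?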

The panel on Algorithmic Game Theory/Economics was pretty interesting (at least to me — the organizer). Despite a few pointed questions from the audience ("Why do most economists continue to ignore AGT?"), the general atmosphere was enthusiastic about the field (not surprising, again given the organizer). From different points of view people thought that the field is doing well, with the level of agreement getting to the point that several people commented on how computer scientists should learn to disagree with each other more often (as economists viciously do). Yet I could see two different approaches among the panelists: the "moderates", Shang-Hua Teng and Peter Bro Miltersen, emphasized the algorithmic side of AGT, focusing on the interesting algorithmic questions and the beautiful structure underlying them. The "extremists", Silvio Micali and myself, de-emphasized the algorithmic aspects per se and were interested more in the new points of view and types of challenges coming from the meeting of economic/game-theoretic considerations and computational ones. Several people made a point of disparaging the concept of Nash equilibrium, and not just due to its problematic computational status (as Gil Kalai said: computational complexity is the least of the problems of the Nash equilibrium concept as a model of behavior).

Next year, ICS 2012 will be held at MIT in Cambridge, Massachusetts.
