- Revenue Maximization via Nash Implementation by Thanh Nguyen
- Strategy iteration is strongly polynomial for 2-player turn-based stochastic games with a constant discount factor by Thomas Dueholm Hansen and Peter Bro Miltersen and Uri Zwick
- Pricing Loss Leaders can be Hard by Yi Wu
- Beyond the Nash Equilibrium Barrier by Robert Kleinberg, Katrina Ligett, Georgios Piliouras, and Eva Tardos
- A Complexity View of Markets with Social Influence by Xi Chen and Shang-Hua Teng
- The Hitchhiker’s Guide to Affiliation Networks: A Game-Theoretic Approach by Christian Borgs, Jennifer Chayes, Jian Ding, Brendan Lucier
- Best-Response Mechanisms by Noam Nisan, Michael Schapira, Greg Valiant and Aviv Zohar
- Distributed Computing with Adaptive Heuristics by Aaron D. Jaggard, Michael Schapira, and Rebecca N. Wright
- Renegotiation-Safe Protocols by Rafael Pass and abhi shelat
- The Effects of Diversity in Aggregation Games by Petros Mol, Andrea Vattani, Panagiotis Voulgaris
- Posting Prices with Unknown Distributions by Moshe Babaioff, Liad Blumrosen, Shaddin Dughmi, Yaron Singer
Archive for September, 2010
Peter Cramton is collecting signatures of auction experts on a letter to be submitted to the Health Subcommittee of the U.S. House of Representatives regarding the particularly ill-designed auction of the “Medicare Competitive Bidding Program for Durable Medical Equipment”. Some of the problems mentioned are very basic:
The first problem is that the auction rules violate a basic principle of auction design: bids must be binding commitments. In the Medicare auction, bidders are not bound by their bids. Any auction winner can decline to sign a supply contract following the auction. This undermines the credibility of bids, and encourages low-ball bids in which the supplier acquires at no cost the option to sign a supply contract.
The fourth problem is a lack of transparency. It is unclear how quantities associated with each bidder are determined. These quantities are set in a non-transparent way in advance of the auction. Bids from the last auction event were taken in November 2009, and now more than ten months later, we still do not know who won contracts. Both quality standards and performance obligations are unclear. This lack of transparency is unacceptable in a government auction and is in sharp contrast to well-run government auctions such as the Federal Communications Commission spectrum auctions.
The list of accepted papers to SODA 2011 has been published. Somewhat strangely, the list published by SODA mentions only a subset of the authors for each paper, and in an arbitrary order (according to what’s in their system, apparently).
AGT/E related ones:
- A Stackelberg Strategy for Routing Flow over Time by Lisa Fleischer, Elliot Anshelevich, and Umang Bhaskar
- A Subexponential Lower Bound for the Random Facet Algorithm for Parity Games by Thomas Dueholm Hansen, Uri Zwick and Oliver Friedmann
- Distributed Selfish Load Balancing on Networks by Martin Hoefer and Thomas Sauerwald
- Fast Convergence of Natural Bargaining Dynamics on Exchange Networks by Yashodhan Kanoria
- Local Smoothness and the Price of Anarchy in Atomic Splittable Congestion Games by Florian Schoppmann and Tim Roughgarden
- Matroid Secretary Problem in the Random Assignment Model by Jose Soto
- Mechanism Design via Correlation Gap by Qiqi Yan
- Near-Optimal No-Regret Algorithms for Zero-sum Games by Constantinos Daskalakis
- On Minmax Theorems for Multiplayer Games by Yang Cai and Constantinos Daskalakis
- On the Approximability of Budget Feasible Mechanisms by Pinyan Lu, Nick Gravin and Ning Chen
- On the Complexity of Approximating a Nash Equilibrium by Constantinos Daskalakis
- Online Stochastic Matching: Online Actions Based on Offline Statistics by Shayan Oveis Gharan and Vahideh Manshadi
- Online Vertex-Weighted Bipartite Matching and Single-bid Budgeted Allocations by Gagan Aggarwal, Gagan Goel, Chinmay Karande and Aranyak Mehta
- Pricing on Paths: A PTAS for the Highway Problem by Thomas Rothvoss and Fabrizio Grandoni
- Risk-Averse Stochastic Optimization: Probabilistically-Constrained Models and Algorithms for Black-Box Distributions by Chaitanya Swamy
- Towards Optimal Bayesian Algorithmic Mechanism Design and Multi-Parameter Bayesian Algorithmic Mechanism Design (merged papers) by Zhiyi Huang, Robert Kleinberg and Jason Hartline
- Welfare Guarantees for Combinatorial Auctions with Item Bidding by Tim Roughgarden and Kshipra Bhawalkar
This is my first COMSOC meeting, the third in the series so far. The next is planned for 2012 but there are no details yet about location. The organization has been impressive (92 official attendees to look after), and the conference dinner at the Rheinturm, in the revolving restaurant 172m above the ground, was excellent. Congratulations to Joerg Rothe and his team.
There was a conscious effort to get a broad range of invited speakers: political science, game theory, AI, economics, mathematics were represented. There were 57 submissions to this non-archival conference, and 39 were accepted. Much information including slides of talks is on the conference website.
It was noted at the business meeting that the schedule was rather crowded and more free time would be good in future. The best solution that I can see is some reduction in the number of talks (maybe accept only about 50%). Also, given that in some sessions much background material is common to more than one talk, perhaps the speakers could collaborate to save time. Some of the talks described mechanisms that could be used for aspects of conference organization, especially the decision of which papers to accept. It would be nice to see some self-reference of this sort for future meetings.
I will give a very biased summary of highlights in the interests of space. Some of the key themes that I noticed: parametrized complexity; allocation (including fair division, matching/stable marriage, and cake cutting); experimental results and explicit computation (such as integer and linear programming, random generation, local search); social networks; and manipulation/bribery/control/cloning/partial winners, etc.
The LOGICC tutorial day was on Monday. A summary of the invited tutorials:
Vince Conitzer (Duke): a clear overview of (some of) the computational social choice research area. Useful for students new to the area.
Agnieszka Rusinowska (Paris-Sorbonne): a survey of the idea of influence in social networks. Too long and detailed for my taste, but the slides, with their huge number of references, will be useful.
Nicholas Tideman (Virginia Tech): after long experience, claims that spatial models are the best for modelling real voter behaviour. Discussed difficulties of obtaining real data and had plenty of computational problems to offer the audience, often involving Gaussian quadrature. Challenged us to generate fake elections that he could not distinguish from real ones.
Toby Walsh (UNSW, NICTA): presented experimental results about manipulation of voting rules using methodology similar to studying SAT. It seems that many NP-hard manipulation problems exhibit smooth phase transitions as the size of the manipulating coalition increases. This calls for an analytic proof.
The conference proper started on Tuesday. The invited talks:
Gabrielle Demange (Paris School of Economics): has a new ranking method (“generalized handicap”) which she has characterized axiomatically. Ranking methods focus the attention of the rankers, and there is a dynamic process that converges in some cases. This reminded me of Noam’s very early post on the attention economy. She explicitly did not treat strategic behaviour.
Matthew Jackson (Stanford): in experiments, if you offer people $100 now or $200 in 2 years, most take the $100. But if you offer $100 in 6 years or $200 in 8 years, they take the $200. This is inconsistent with standard rational models. Main idea is that perhaps each person has several “personalities”, some with more patience than others. He showed that in that case, this kind of time-inconsistency is logically inevitable under very weak assumptions.
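The inconsistency with the standard model can be checked directly: under exponential discounting, taking $100 now over $200 in two years means the two-year discount factor is below 1/2, while taking $200 in eight years over $100 in six means that same two-year factor is above 1/2, so no single discount factor explains both choices. A minimal numerical check (the exponential-discounting utility model here is the standard benchmark, not Jackson's multiple-personality model):

```python
# Search for a discount factor delta in (0, 1) consistent with both choices.
# Present value of $x received at time t is x * delta**t.
def prefers_early(delta, t_early, t_late):
    # True if $100 at t_early is worth more than $200 at t_late
    return 100 * delta**t_early > 200 * delta**t_late

consistent = [d / 1000 for d in range(1, 1000)
              if prefers_early(d / 1000, 0, 2)          # takes $100 now
              and not prefers_early(d / 1000, 6, 8)]    # takes $200 in 8 years
print(consistent)  # -> [] : no exponential discounter makes both choices
```

Both comparisons reduce to the same inequality (100 vs 200 * delta**2), which is why the list comes back empty and why the observed behaviour needs a richer model.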
Bettina Klaus (Lausanne): discussed the allocation mechanisms based on deferred acceptance (scarce heterogeneous indivisible goods with no money transfers) used for matching medical students to hospital residencies, etc., originating in the stable marriage theorem of Gale and Shapley. She and a coauthor have axiomatized these methods. The talk very clearly showed the link between theory and practice and made the area seem very attractive to study.
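For readers new to the area, the deferred-acceptance procedure itself is short: proposers propose down their preference lists, and receivers tentatively hold the best offer so far, deferring final acceptance until no proposer is left unmatched. A minimal sketch of the one-to-one Gale-Shapley algorithm (the toy instance and names are my own illustration):

```python
def deferred_acceptance(proposer_prefs, receiver_prefs):
    """Gale-Shapley deferred acceptance for one-to-one matching."""
    # rank[r][p] = position of proposer p on receiver r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)              # proposers not yet held by anyone
    next_choice = {p: 0 for p in proposer_prefs}
    held = {}                                # receiver -> tentatively held proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # p's best receiver not yet tried
        next_choice[p] += 1
        if r not in held:
            held[r] = p                      # first offer: hold it
        elif rank[r][p] < rank[r][held[r]]:
            free.append(held[r])             # r trades up; old proposer is free again
            held[r] = p
        else:
            free.append(p)                   # rejected; p proposes further down later
    return {p: r for r, p in held.items()}

# Toy instance: two students proposing to two hospitals.
students = {"ann": ["city", "mercy"], "bob": ["city", "mercy"]}
hospitals = {"city": ["bob", "ann"], "mercy": ["ann", "bob"]}
print(deferred_acceptance(students, hospitals))
# -> {'bob': 'city', 'ann': 'mercy'} : stable, since city prefers bob
```

The deferral is the key design choice: because no acceptance is final until the end, no proposer-receiver pair can profitably deviate from the result, which is what makes the matching stable.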
Herve Moulin (Rice): the problem of impartial peer evaluation: design a mechanism that allows, for example, academic researchers to rank each other (or simply to decide who wins a prize) in a way that is strategyproof. Some restrictions are required (for example, a voter may not rank herself) so as not to fall foul of the Gibbard-Satterthwaite impossibility result. He has solved this problem when there are at least 4 agents. Plenty of open problems.
Hannu Nurmi (Turku): despite many academic results based on flaws in existing voting rules, very little reform has occurred in practice. Newly proposed rules may require a different model from the standard ordinal preferences used in most of social choice; voting rules are used for many purposes, and we may need to specialize the analysis to different cases. Long discussion of various paradoxes. Asked for computational help to study the structure and probability of profiles leading to various paradoxes.
Among the contributed talks, almost all of which were interesting and well presented, my favourites were:
Tyler Lu (Toronto): a model that interpolates between the extremes of social choice and full personalized recommendations. Apparently this is isomorphic to theory of fully proportional representation. Hardness results, greedy algorithms, approximation algorithms.
Ariel Procaccia (Harvard): cake cutting is awesome! Computer scientists have a lot to contribute, so please join in! Presented polynomial time deterministic and randomized algorithms for cake cutting that give envy-free and proportional allocations, and induce truthful behaviour.
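The classic two-agent cut-and-choose protocol gives the flavour of the area: one agent cuts the cake at a point she values as exactly half, the other takes whichever piece he prefers, and the outcome is both proportional and envy-free. A minimal sketch (the density-function representation and the toy valuations are my own illustration, not Procaccia's algorithms):

```python
# Cut-and-choose on the cake [0, 1]; each agent's valuation is a density.
def value(density, a, b, n=10_000):
    # numerically integrate the density over [a, b] (midpoint rule)
    h = (b - a) / n
    return sum(density(a + (i + 0.5) * h) for i in range(n)) * h

def cut_and_choose(cutter, chooser):
    # binary search for the point where the cutter's left piece is half her total
    total = value(cutter, 0.0, 1.0)
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if value(cutter, 0.0, mid) < total / 2:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    # the chooser takes whichever piece she values more
    left, right = value(chooser, 0.0, x), value(chooser, x, 1.0)
    return (x, "left") if left >= right else (x, "right")

uniform = lambda t: 1.0          # cutter values the cake uniformly
right_heavy = lambda t: 2 * t    # chooser concentrates value on the right end
x, pick = cut_and_choose(uniform, right_heavy)
print(round(x, 3), pick)  # -> 0.5 right
```

Each agent is guaranteed at least half of the cake by her own measure regardless of what the other does, which is exactly the proportionality and envy-freeness the talk was about; the research challenge is extending such guarantees to many agents efficiently and truthfully.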
Piotr Faliszewski (AGH Krakow): “distance rationalizability” provides a framework for voting rules that allows for more general results, based on the notion of distance to a consensus. Since distance and consensus can take very many values, this basic idea is too general and every rule is distance rationalizable. Putting natural conditions on the distances allows for more interesting results including some on complexity of winner determination.
Felix Fischer (Harvard): the problem of finding a strategyproof mechanism for approval voting (choosing a set of winners) when the voters and candidates coincide. Deterministic results are negative, but a randomized 4-approximation algorithm exists. Practical use in deciding on conference accepted papers, social networks?
Toby Walsh called for a standard library of benchmark voting data for testing algorithms, along the lines of what is available in many other fields (call it VoteLib for now). I second this enthusiastically.