## Sunday, May 31, 2009

### STOC 2009, Day 1

Just a couple of quick highlights from Day 1.

Costis Daskalakis, when discussing oblivious PTASs for Nash equilibria, presented the following probability lemma that seems like it should be very useful (note: it's not clear I'm stating it correctly, or with all appropriate conditions): if we have collections of random variables X_i and Y_i, and the sums sum_{i=1}^n X_i and sum_{i=1}^n Y_i agree on their first k moments, then the variation distance between the distributions of the two sums is 2^{-\Omega(k)}. (I believe the X_i and Y_i have to all be independent.) The result is based on something called the Krawtchouk expansion. I'm looking forward to seeing this argument; it seems like a potentially useful lemma for other contexts.
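To get a feel for the flavor of the lemma, here is a small numerical sketch (my own illustration, not from the talk): two collections of independent Bernoulli variables whose sums have the same mean — one with varying biases p_i, one with every bias set to the average. Computing the exact PMFs of the sums by convolution, the total variation distance between them is already quite small even with only the first moment matched.

```python
import numpy as np

def sum_pmf(ps):
    # Exact PMF of sum of independent Bernoulli(p_i) variables,
    # built up by repeated convolution.
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

ps = [0.3, 0.5, 0.7, 0.4, 0.6]      # varying biases
pbar = sum(ps) / len(ps)            # same first moment for the sum
qs = [pbar] * len(ps)

P = sum_pmf(ps)
Q = sum_pmf(qs)

# Total variation distance between the two sum distributions.
tv = 0.5 * np.abs(P - Q).sum()
print("TV distance:", tv)
```

Matching higher moments as well (the hypothesis of the lemma) is what drives the distance down exponentially in k; this toy example only matches the mean.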

Aaron Bernstein gave a nice talk on his result with David Karger: an oracle for All-Pairs Shortest Paths that can find the shortest path when a node or edge fails; that is, you can find and store shortest paths avoiding any individual node or edge, and you can do it in (up to polylogarithmic factors) the same time/space as a standard APSP structure. This goes on my list of algorithms I'd like to have a student implement, to see how it really works.
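For comparison, the naive baseline their oracle improves on is simply rerunning a shortest-path computation with the failed node or edge deleted — linear work per query rather than a precomputed answer. A quick sketch of that baseline (my own illustration, not their data structure):

```python
import heapq

def dijkstra_avoiding(adj, s, t, banned_node=None, banned_edge=None):
    # Naive replacement-path query: shortest s-t distance in a weighted
    # graph, recomputed from scratch with one node or edge removed.
    # adj maps each node to a list of (neighbor, weight) pairs.
    dist = {s: 0}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            return d
        if d > dist.get(u, float('inf')):
            continue
        if u == banned_node:
            continue  # failed node: do not relax its edges
        for v, w in adj.get(u, []):
            if v == banned_node:
                continue
            if banned_edge in ((u, v), (v, u)):
                continue  # failed edge (either orientation)
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float('inf')

# Toy directed graph: a -> b -> c is cheaper than the direct edge a -> c.
adj = {'a': [('b', 1), ('c', 4)], 'b': [('c', 1)], 'c': []}
print(dijkstra_avoiding(adj, 'a', 'c'))                          # normal query
print(dijkstra_avoiding(adj, 'a', 'c', banned_node='b'))         # node failure
print(dijkstra_avoiding(adj, 'a', 'c', banned_edge=('b', 'c')))  # edge failure
```

The point of the Bernstein-Karger result is that all such queries can instead be answered from a precomputed structure of roughly APSP size, rather than by recomputation like this.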

James Martin said...

I don't understand the first one at all. If you suppose that
sum_{i=1}^n X_i = sum_{i=1}^n Y_i,
i.e. that the sums are the same, then what does it mean to talk about the variation distance between the distribution of the sums? (and how can the X_i be independent of the Y_i?) Perhaps I am not understanding what n is....

Michael Mitzenmacher said...

Sorry, that's a typo-- they aren't equal, they just have the same first k moments. (Equal in expectation, for one.)

Will change.

ashwin kumar b v said...

Is the distance between distributions you referred to the Kullback–Leibler divergence?