I lost sleep last night trying to prove a collection of random variables were negatively associated.

Negative dependence is one of those nice tricks that hardly ever gets used, because it usually ends up being harder than it should be. While there are many flavors of negative dependence, the most natural is probably "negative association". The intuition is simple -- given a collection of random variables, if when one goes up the others should go down, then they are negatively associated. More formally, take any monotone non-decreasing function f of one subset of the variables and any monotone non-decreasing function g of a disjoint subset; negative association says f and g are negatively correlated, i.e., E[fg] <= E[f]E[g]. Proving this holds formally is often more difficult than one would expect.
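As a concrete illustration of the definition (this is my own toy example, not from any of the papers mentioned): for balls and bins, take f = "bin 1 is nonempty" and g = "bin 2 is nonempty". Both are monotone non-decreasing functions of the occupancy counts of disjoint bins, and E[fg] - E[f]E[g] can be computed exactly by inclusion-exclusion, so we can check it is never positive:

```python
# Exact check of E[f g] <= E[f] E[g] for occupancy indicators in
# balls and bins, a standard example of negative association.
# f = 1 if bin 1 is nonempty, g = 1 if bin 2 is nonempty.
def occupancy_covariance(m: int, k: int) -> float:
    """Return E[f g] - E[f] E[g] for m balls thrown into k bins."""
    p_one_empty = ((k - 1) / k) ** m          # P(a given bin is empty)
    p_both_empty = ((k - 2) / k) ** m         # P(two given bins are empty)
    e_f = 1.0 - p_one_empty                   # P(bin 1 nonempty); same for g
    e_fg = 1.0 - 2 * p_one_empty + p_both_empty  # inclusion-exclusion
    return e_fg - e_f * e_f

for m in (1, 2, 5, 10, 100):
    assert occupancy_covariance(m, 10) < 0    # always strictly negative
```

The covariance is negative for every m because ((k-2)/k)^m < ((k-1)/k)^(2m): knowing one bin is nonempty makes it slightly less likely that another is.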

Why should we care? Well, it turns out that if a collection of random variables are negatively associated, then even though they are dependent, you can just apply your standard Chernoff bounds to them, without a care. Chernoff bounds (and other tail probability bounds) pop up in many arguments, but they can be very difficult to deal with when the random variables are dependent. Usually, you have to switch to a martingale argument, since standard Chernoff bounds apply only to independent random variables. If you can get negative dependence, it's much cleaner.
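To see what "apply your standard Chernoff bounds without a care" buys you, here is a small simulation of my own (the variables and bound form are illustrative, not from the post): the number of empty bins after throwing m balls into k bins is a sum of dependent but negatively associated indicators, and its upper tail still sits below the usual Chernoff bound exp(-mu * delta^2 / 3):

```python
import math
import random

def empty_bins(m: int, k: int, rng: random.Random) -> int:
    """Throw m balls into k bins uniformly; return the number of empty bins."""
    occupied = set(rng.randrange(k) for _ in range(m))
    return k - len(occupied)

rng = random.Random(42)
m = k = 100
trials = 2000
mu = k * (1 - 1 / k) ** m                 # E[number of empty bins]
delta = 0.2
thresh = (1 + delta) * mu
# Empirical upper-tail probability P(X >= (1 + delta) * mu).
tail = sum(empty_bins(m, k, rng) >= thresh for _ in range(trials)) / trials
# Standard Chernoff bound for sums of independent (or negatively
# associated) 0/1 variables, valid for 0 < delta <= 1.
bound = math.exp(-mu * delta * delta / 3)
assert tail <= bound
```

The indicators "bin i is empty" are dependent (one bin filling up pushes balls elsewhere), yet the independent-case bound still holds, which is exactly the payoff of negative association.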

The best introduction to negative dependence is probably this surveyish article by Dubhashi and Ranjan. The focus is on how balls and bins problems are a natural example of negative dependence -- if a ball lands in one bin, it can't be in any other! Naturally, this explains my interest.

For example, the following problem came up for me some years ago in this paper. Suppose we throw n points uniformly at random on the boundary of the unit circle. We want bounds on the number of arcs of length larger than, say, c/n for some constant c. If arc lengths were independent, we could just apply a Chernoff bound, easy enough. But of course they're dependent -- the sum of the arc lengths is 1! Intuitively, though, if one arc gets longer, then the others must get shorter, so there should be negative dependence. We proved what we needed to get the Chernoff bound, but it wasn't pretty. (I've since seen that a stronger version of this result is given as an exercise on negative association in the very nice-looking draft monograph by Dubhashi and Panconesi; to tell the truth, I'd like to see it worked out, as it seems to me that the conditioning is a bit subtle, but then again, geometry confuses me.)
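The arc setup is easy to simulate (a sketch of mine; the function name and parameters are illustrative). Each arc exceeds length t with probability (1 - t)^(n-1), so with t = c/n the expected number of long arcs is roughly n * e^(-c):

```python
import random

def long_arc_count(n: int, c: float, rng: random.Random) -> int:
    """Drop n uniform points on a circle of unit circumference and
    count the arcs (gaps between consecutive points) longer than c/n."""
    pts = sorted(rng.random() for _ in range(n))
    arcs = [b - a for a, b in zip(pts, pts[1:])]
    arcs.append(1.0 - pts[-1] + pts[0])   # wrap-around arc
    # The dependence in a nutshell: the arcs always sum to 1.
    assert abs(sum(arcs) - 1.0) < 1e-9
    return sum(1 for a in arcs if a > c / n)

rng = random.Random(0)
n, c, trials = 1000, 2.0, 200
mean = sum(long_arc_count(n, c, rng) for _ in range(trials)) / trials
# Expect the mean to be near n * (1 - c/n)^(n-1), about n * e^(-2) here.
```

The concentration of that count around its mean, despite the sum-to-1 constraint, is exactly what the negative-dependence argument is supposed to deliver.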

In fact, in that paper we actually wanted a similar result in the following setting. Suppose one throws n points uniformly at random into a fixed region (say, for example, the unit square, or to avoid boundary issues, the 2-d unit torus). We want bounds on the number of Voronoi cells of area larger than, say, c/n for some constant c. If Voronoi cell sizes were independent, we could just apply a Chernoff bound, easy enough. But of course they're dependent! Intuitively, though, if one cell gets bigger, then the others must get smaller, so there should be negative dependence. Or maybe that isn't the case. Now I can't remember if I ever found an example that led me to believe it wasn't the case... Anyhow, we couldn't prove negative dependence easily, so we ended up using an uglier martingale argument that sufficed.
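For readers who want to experiment with the Voronoi version, here is a minimal Monte Carlo sketch (my own, with illustrative names): estimate each cell's area on the unit torus by assigning random sample points to their nearest site under the wrap-around metric, then count cells larger than c/n.

```python
import random

def torus_dist2(p, q):
    """Squared distance on the 2-d unit torus (wrap-around metric)."""
    dx = abs(p[0] - q[0]); dx = min(dx, 1 - dx)
    dy = abs(p[1] - q[1]); dy = min(dy, 1 - dy)
    return dx * dx + dy * dy

def voronoi_areas(sites, samples, rng):
    """Monte Carlo estimate of each site's Voronoi cell area:
    the fraction of random points whose nearest site it is."""
    counts = [0] * len(sites)
    for _ in range(samples):
        p = (rng.random(), rng.random())
        nearest = min(range(len(sites)),
                      key=lambda i: torus_dist2(p, sites[i]))
        counts[nearest] += 1
    return [c / samples for c in counts]

rng = random.Random(1)
n = 40
sites = [(rng.random(), rng.random()) for _ in range(n)]
areas = voronoi_areas(sites, 10000, rng)
big = sum(1 for a in areas if a > 1.5 / n)   # cells larger than c/n, c = 1.5
assert abs(sum(areas) - 1.0) < 1e-9          # the cells partition the torus
```

The sum-to-1 constraint on the areas mirrors the arc problem; whether the cell areas are actually negatively associated is precisely the open question.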

I was surprised at the time that I couldn't find any reference to this type of problem in the geometric literature. If negative dependence of Voronoi regions in random settings is still open (and true!), I'd be happy to ponder it with anyone who has a better sense of geometry than I do. In general, negative dependence seems like a promising area for additional results.


## 24 comments:

This doesn't answer your question, but I'm curious about one thing. In the circular arc problem, you have this issue with random variables that sum to 1. This is essentially the unit simplex, and I'm reasonably certain that the joint distribution of the lengths of the arcs maps uniformly to a point in the simplex.

If true, this would mean that the question you're asking is a volume estimation question over a subregion of the simplex, because the question can be translated as: "what is the volume of the region outside a small ball of radius whatever on the simplex", and again (taking a flying leap into handwavy space) we understand a reasonable amount about such bounds via the multinomial distribution (which behaves as a Gaussian proxy for the simplex).

A *lot* of work has been done on statistical properties of random Voronoi diagrams over the last five decades, but mostly on Poisson processes rather than uniform distributions (minor difference), and mostly experimental statistics rather than analysis (major difference). So I'm not surprised that you couldn't find anything in the past literature.

Suresh -- you lost me at "unit simplex". :) Actually, no, the first paragraph makes sense, but you lost me when you started waving your hands about volume estimation. Sure, we're considering a subregion of the simplex where the number of coordinates of size more than c/n is within a small error of its expectation. So how do we bound the size of such a region?

Jeff -- I looked up a bunch of the Poisson stuff you're talking about when looking for the answer to this problem. It all looked experimental to me, and in particular I didn't notice anybody looking at tail bounds.

Hey, look, I found an open problem in geometry! I've still never written a SoCG paper! Suresh, Jeff, anyone, interested in some analysis?

Ah I see. I think I have another thought, but am ashamed to state it in public, so I'll email you ;)

Here is a high level idea that might just work:

In the one dimensional case it may be easier to show negative regression rather than negative association. We need to show that if I and J are disjoint sets of indices, then E(f(X_I) | X_J = t) is non-increasing in t. Now, if the set of indices J is contiguous, then the conditioning reduces the problem to a similar problem with fewer points (a la the proof that occupancy variables in balls and bins have negative regression).

I don't know if this works, in any case I don't think it generalizes to more than one dimension.

This discussion reminds me

of the paper

http://dx.doi.org/10.1007/s00454-003-2951-4

Very nice problem.

Please allow me to ask a (probably silly) question on a related problem.

If we have n points uniformly at random in a ring centered at the origin (i.e., the region between two circles of radii 1-r and 1+r), what about the (Euclidean) distances between them? Are they negatively dependent?

I'm strictly an amateur at this stuff, but I spent a little time with the new Bollobas/Riordan book on percolation, and it's awesome. Treats percolation in Voronoi diagrams towards the end (hence the book's cover), though your question isn't directly at issue.

There's a striking correlation inequality that I learned about from this book that seems germane to the problem, but only (so far as I can see) to the Poisson version--which might be close enough for some purposes. It was stated in a finite boolean version, which could apply here via approximations.

Here it is: suppose f, g are any two boolean functions on n bits. Let E(f) be the probability that f = 1 on uniform input, and similarly E(g).

Let E(f ^^ g) be the probability that f and g are both 1, AND there are *disjoint* certificates witnessing both facts.

Then we have

E(f ^^ g) <= E(f)*E(g).

(I understand how to prove this for monotone f, g--it follows from Kleitman's Theorem, which I posted on; but the general case I'm stumped on. In any case we'll only use the monotone case.)
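[Editor's sketch, not part of the original comment.] The inequality Andy states can be checked by brute force for small n. The version below uses the standard disjoint-certificate definition: x is in the event f [] g if there are disjoint coordinate sets S and T such that fixing x on S forces f = 1 and fixing x on T forces g = 1. All names here are illustrative:

```python
from itertools import product

def box_prob(f, g, n):
    """P(f [] g): f and g are both 1 with disjoint certificates,
    under the uniform distribution on {0,1}^n (the BK/Reimer event)."""
    points = list(product((0, 1), repeat=n))
    masks = range(1 << n)  # each bitmask encodes a set of coordinates

    def certifies(x, S, h):
        # Does fixing x on coordinate set S force h = 1, whatever
        # values the free coordinates take?
        return all(h(y) for y in points
                   if all(y[i] == x[i] for i in range(n) if (S >> i) & 1))

    hits = 0
    for x in points:
        if any(certifies(x, S, f) and certifies(x, T, g)
               for S in masks for T in masks if S & T == 0):
            hits += 1
    return hits / len(points)

# Example on n = 3 bits: f = majority, g = parity (g is not monotone).
n = 3
f = lambda x: sum(x) >= 2
g = lambda x: sum(x) % 2 == 1
ef = sum(f(x) for x in product((0, 1), repeat=n)) / 2 ** n
eg = sum(g(x) for x in product((0, 1), repeat=n)) / 2 ** n
assert box_prob(f, g, n) <= ef * eg + 1e-12   # the claimed inequality
```

Parity needs all n bits as a certificate, so no disjoint majority certificate is left over and the box probability here is 0, comfortably below E(f)E(g).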

(continuing)

Oops, I misspoke; I think we'd need the general case. So, here, let f = g = the event that there's a Voronoi cell of size T in the point set represented.

If different Voronoi cells had disjoint witnesses, we could apply this thing to show negative dependence. But now that I actually think about it, one has to witness not only that the cell is empty of other points, but also that there are no points near it. So witnesses might have to overlap.

Still, maybe this type of result can be useful.

Hi Mike,

I recently came across this paper by Dubhashi, Jonasson and Ranjan, which improves some negative correlation bounds I had in one of my papers. It also has crisp proofs and uses a general methodology (the Feder-Mihail theorem) which may be of help to you -- my paper also dealt with random variables with a fixed sum.

aravind

Andy D:

The inequality you refer to is the van den Berg-Kemperman (BK) inequality, proved in full generality by Reimer; together with the closely related FKG inequality it is one of the key tools in the area. A version of it is used in the survey by Dubhashi and Ranjan that was mentioned in the original post. These inequalities figure prominently in percolation but also in lots of other areas.

Thanks for all the helpful pointers and thoughts, everyone -- if I get anywhere with the problem, I'll be sure to announce it here.

There actually is a nice witness structure for Voronoi cells: namely the intersecting circle patterns from the dual Delaunay.

Specifically, fix a point, and write down all the circumcircles of triples of points that contain this point. Then we know that all the points contained in these triples "influence" this Voronoi cell, in the sense that this point's Voronoi cell (once it is inserted) contains some of the mass of the cells of the influencing points (this fact is used to do Voronoi-based interpolation).

Now maybe this dependency structure can be used to create the "islands" that Andy wants to create (by disconnecting the graph structure), giving cells with disjoint witness sets?

Andy -- could you give a pointer to the inequality form you are using? It would be helpful in figuring out what qualified as an actual "witness" here, and what the implications are.

Thanks,

MM

Sure, a discussion of Reimer's Ineq. can be found here:

http://citeseer.ist.psu.edu/cache/papers/cs/3312/http:zSzzSzwww.math.gatech.eduzSz~randallzSzr-bcr-f.pdf/the-van-den-berg.pdf

or here, p. 3-4:

http://citeseer.ist.psu.edu/cache/papers/cs/9016/http:zSzzSzconley.math.wisc.eduzSz~pemantlezSznegcorzSzversion990506.pdf/pemantle99towards.pdf

Second link should've been

http://citeseer.ist.psu.edu/206312.html

Earlier claims about what would constitute a witness here were wrong, deleted.

Check out the new paper on negative dependence: arXiv:0707.2340

Michael, the problem with the circle and distances is covered in:

Barbour, Holst, Janson. Poisson Approximation

After reading the surveyish article you mentioned, I wonder how negative dependence theory is useful if the balls are not thrown independently but are instead placed by a k-wise independent hash function.

In this case, it seems that the occupancy variables would be k-wise negative quadrant dependent only (a weaker notion). Applying Chernoff bounds to these variables would be nice; but is it at all feasible?

madeira-- while you can't apply Chernoff bounds to k-wise independent variables, there are similar tail inequalities for k-wise independent variables that should be generalizable to this setting. I can't recall offhand what papers have the tail bounds, but they shouldn't be too hard to find...

A good reference for k-wise dependent variables is:

Jeanette P. Schmidt, Alan Siegel, and Aravind Srinivasan. Chernoff-Hoeffding bounds for applications with limited independence. SIAM J. Discrete Math., 8(2):223-250, 1995.

-Adam Smith
