It treats the standard approaches well -- Chernoff-Hoeffding bounds, martingales, isoperimetric inequalities, and so on -- but what particularly stands out in this book's treatment is the consideration of what to do when the random variables are not quite so nice. Tail bounds tend to be "easy" to apply when all the random variables are independent, or when your martingale satisfies a nice, simple Lipschitz condition; it's when the variables are dependent, or some special side case wrecks your otherwise pleasant martingale, that you need to pull out some heavier hammers. This book makes those hammers seem not quite so heavy. Chapter 3 is all about Chernoff-Hoeffding bounds in dependent settings; another chapter has a subsection on martingale bounds for handling rare bad events. I've had these things come up in the past, so it will be nice to have a compact resource to call on, with the appropriate bounds at hand.
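(For readers who want the independent-case baseline in mind -- this is just the classical statement, not a result specific to the book under review -- the standard Hoeffding bound reads:)

```latex
% Hoeffding's inequality: X_1, ..., X_n independent with X_i \in [a_i, b_i],
% and S = X_1 + \cdots + X_n. Then for any t > 0,
\Pr\bigl[\, S - \mathbb{E}[S] \ge t \,\bigr]
  \;\le\; \exp\!\left( \frac{-2t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).
```

The martingale analogue (Azuma-Hoeffding) has the same shape, with the ranges $b_i - a_i$ replaced by bounds on the martingale differences; it is exactly when neither hypothesis holds cleanly that the heavier machinery above earns its keep.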
I don't think this book is for "beginners"; I'd recommend, for instance, my book
2 comments:
I have seen a draft of this book earlier (following a link from geomblog) and I love it! It does really make concentration seem a lot easier.
I agree that this is an excellent book (I've been using a draft for a couple of years), and it fills a big gap. While there are good surveys on measure concentration, it is hard to find something that is as approachable for a computer scientist, and yet covers things like isoperimetry, Talagrand's, Kim-Vu's, etc.
I think I might have gotten a better price than Amazon, from the Cambridge University press desk at STOC. It may be worth checking them out at FOCS.
--kunal