Comments on My Biased Coin: STOC 2009, Day 1

James Martin (http://www.stats.ox.ac.uk/~martin) | 2009-06-01 10:46:
I don't understand the first one at all. If you suppose that sum_{i=1}^n X_i = sum_{i=1}^n Y_i, i.e. that the sums are the same, then what does it mean to talk about the variation distance between the distributions of the sums? (And how can the X_i be independent of the Y_i?) Perhaps I am not understanding what n is...

Michael Mitzenmacher | 2009-06-01 11:22:
Sorry, that's a typo -- they aren't equal; they just have the same first k moments. (Equal in expectation, for one.)

Will change.

ashwin kumar b v | 2009-06-02 05:15:
Is the distance between distributions you referred to the Kullback–Leibler divergence?
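The thread contrasts two notions of distance between distributions: the variation (total variation) distance James Martin asks about, and the Kullback–Leibler divergence ashwin mentions. As a hedged aside, here is a minimal sketch of both for discrete distributions on a common finite support; the example distributions p and q are made up for illustration and are not from the post.

```python
import math

def total_variation(p, q):
    """Total variation distance: half the L1 distance between the pmfs."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """KL divergence D(p || q); assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two toy pmfs on a 3-point support (illustrative only).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(total_variation(p, q))  # about 0.1
print(kl_divergence(p, q))    # about 0.025
```

Note the two distances behave differently: total variation is symmetric and bounded by 1, while KL divergence is asymmetric and unbounded, so the answer to the question depends on which one the post intended.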