A paper I co-authored on network coding -- Network Coding Meets TCP (arxiv version) -- was accepted to INFOCOM. Full credit for the success goes to the graduate student Jay Kumar Sundararajan, who led the project (and who is graduating and looking for jobs this year...). Our goal, as the title suggests, is a TCP-compatible network coding congestion control scheme, and our approach uses an interesting variation on acknowledgments that Jay Kumar had used previously: instead of acknowledging packets, you acknowledge "degrees of freedom" (that is, coded packets that will eventually decode to message packets).
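To make that concrete, here is a minimal sketch of the receiver side -- my own illustration, not code from the paper. It tracks the rank of the received coefficient vectors by Gaussian elimination, and fires an ACK whenever a coded packet is innovative (adds a degree of freedom), even if nothing can be decoded yet. For simplicity it works over GF(2) with bitmask coefficients; the actual scheme uses a larger field and a sliding coding window.

```python
# Hypothetical sketch (not the paper's implementation): a receiver that
# ACKs "degrees of freedom" rather than individual packets. Each coded
# packet carries a coefficient vector over GF(2), stored here as an int
# bitmask; we ACK whenever the vector raises the rank of the received
# system, even if no original packet is revealed yet.

class DofReceiver:
    def __init__(self, window_size):
        # One reduced "pivot" row per position in the coding window.
        self.pivots = [0] * window_size
        self.rank = 0  # degrees of freedom received so far

    def receive(self, coeff_vector):
        """coeff_vector: bitmask of which window packets were XORed together.
        Returns True if this packet adds a degree of freedom (so: send ACK)."""
        v = coeff_vector
        for i in range(len(self.pivots)):
            if not (v >> i) & 1:
                continue
            if self.pivots[i]:
                v ^= self.pivots[i]   # eliminate against a known pivot
            else:
                self.pivots[i] = v    # new pivot: rank increases
                self.rank += 1
                return True           # innovative packet -> ACK
        return False                  # linearly dependent -> no new ACK
```

For example, after receiving p1^p2 and then p1 alone, a later arrival of p2 (which equals their XOR) adds no rank and triggers no new ACK.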
The INFOCOM mail said 282 papers were accepted from 1435 submissions (post-withdrawals) -- an acceptance rate of just under 20%. A quick check shows that INFOCOM has regularly been below a 20% acceptance rate in recent years. Even granting the completely unverified estimate that 10-25% of the submissions are things that really shouldn't have been submitted in the first place, in my opinion that's still a pretty low acceptance rate for what's supposed to be the big, open-tent networking conference of the year. (In the 1990s, the acceptance rate was more commonly around 30%.) I'm sure a lot of good papers got rejected this time around.
Most networking conferences have acceptance rates around 20%. Is this a good thing? Conference competitiveness has been blogged about before, but there doesn't seem to be much high-level discussion of the issue -- though I recently saw that Ken Birman and Fred Schneider wrote an article about it for Communications of the ACM. Any ideas out there?
Monday, December 22, 2008
8 comments:
It's an interesting article: I find the counterintuitive idea of making reviews *less* detailed, in order to deter authors from submitting, quite intriguing.
But what I see is that conferences are already evolving to adapt to these changed circumstances. For example, posters are very common at venues like NIPS, as a way to manage the volume of submissions and allow more people to present their work; CVPR does the same thing (and both are prestigious conference venues). VLDB is experimenting with an Academy-Award-style rolling submission format. Some of the ML conferences are experimenting with targeted reviewing, where the authors nominate reviewers/areas in advance.
I often get a little testy, though, with attempts to preach down to the masses from the Olympian heights of Mt. Full Professor. One of the driving forces behind the publishing culture is the tenure process, which the authors of this article acknowledge. But it will be hard to make any wholesale changes in culture (especially back towards journals) and retain any credibility in the eyes of deans and administrators who've heard the "CS doesn't do journals, our field moves too quickly, conferences are king" litany for far too long.
Failing that, exhortations to authors to "write better papers and admit responsibility as scientists" are going to fall on deaf (untenured) ears.
Michael,
Congrats on the INFOCOM acceptance. (BTW, thanks for giving a very nice talk at Buffalo.)
The so-called "mini-conference" was introduced at INFOCOM 2007 (I think) to include roughly an additional 10% of papers that don't make it into the main conference. More explanation can be found here
As Hung said, they seem to address this with the mini-conference idea. Papers ranked roughly in the top 20-30% are invited to the mini-conference, with a shorter 5-page abstract appearing in the proceedings.
More related to the paper itself than to the topic of selectivity of networking conferences: it's enjoyable to read, but I don't see the "network coding" part of it. It seems to be a coding scheme that is synced with TCP, but not a network coding scheme. Transport coding, maybe?
Or maybe I'm confused about what network coding means. In this case, it looks like a single flow that is encoded end-to-end. That it uses the linear XOR-ing of packets common in network coding is relatively orthogonal: other encoding schemes could work as well. Isn't network coding a coding scheme designed to take advantage of the network's topology (say, a promiscuous relay network as in MORE, or the initial butterfly topology for multicast)?
It seems to me that it is fashionable to use the term "network coding" for any coding solution. Still, this paper is interesting in its own right; I wonder whether LT-like codes could be used for this application.
BTW, are there any theoretical results for the network coding gain in wireless? I know it is supposed to be unbounded for wired networks, but what about wireless? Can it give an order gain?
The main idea of the paper is the new interpretation of ACKs. Every degree of freedom received (i.e., every linear equation involving the packets) triggers a TCP ACK, even if it doesn't reveal a packet immediately.
This idea applies to any situation where there is coding across packets of the same session. The coding could be at the end host or inside the network; that is why we used the term network coding.
In the originally submitted version of the paper, this generality was not emphasized. That has been corrected in the final version, which will appear in the conference proceedings. We also included a preliminary simulation result for a situation where an intermediate node re-encodes incoming linear combinations.
We see this paper as a way to interface network coding and TCP. In particular, it will be useful for running TCP in a coded opportunistic routing scenario like the one considered in the MORE paper. The key difference is that our scheme is a sliding-window scheme rather than batch-based, which makes it more compatible with TCP.
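As a rough illustration of that last point -- my own sketch, not code from the paper -- a sliding-window sender draws a fresh random combination over whatever packets are currently unacknowledged, so new data can join the window at any time; a batch-based sender, by contrast, must finish a whole generation before moving on. The names and the GF(2) coefficients below are simplifications; the paper's scheme uses a larger field.

```python
# Hypothetical sketch (not the paper's code): a sender that transmits
# random linear combinations over a *sliding* window of unACKed packets,
# rather than over fixed batches, so the stream never stalls at a
# generation boundary the way batch-based schemes can.
import random

def coded_packet(window):
    """window: non-empty list of (seq, payload_bytes) currently in flight.
    Returns (coefficients, encoded_payload). Coefficients are over GF(2)
    for simplicity; a real scheme would use a larger field, e.g. GF(256)."""
    coeffs = [random.randint(0, 1) for _ in window]
    if not any(coeffs):
        coeffs[0] = 1  # avoid the useless all-zero combination
    size = max(len(p) for _, p in window)
    encoded = bytearray(size)
    for c, (_, payload) in zip(coeffs, window):
        if c:
            for i, b in enumerate(payload):
                encoded[i] ^= b  # XOR the selected payloads together
    return coeffs, bytes(encoded)
```

When an ACK slides the window forward, acknowledged packets simply drop out of `window` and newly admitted packets join it; no re-batching is needed.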
The so-called "INFOCOM mini-conference": is it considered a second-class (rank 2 or 3) conference, or a workshop? How should these papers be categorized in relation to INFOCOM and other conference papers?