In the last of my posts about New Zealand, I'll talk about Mike Langston's talks on computational biology. He spent a lot of time on the style of the field: the difficulty of getting data unless you form a real partnership with people (who view their data as very valuable), the noisiness of the underlying problems, the need to worry about entire computation systems with memory/compute power/specialized hardware rather than individual algorithms, the compromises one has to make between theory and making things actually work, and so on. As I listened, it dawned on me that there's another area of research where I hear about similar issues -- search engines and computational advertising. The similarities are uncanny.
The similarities reached the level of specific problems. The technical part of the talk was about heuristics and methods for maximum clique and biclique (a complete bipartite subgraph) on real data. I've certainly seen bicliques come up in search engine papers. Langston claimed to have the fastest bipartite clique algorithm in practice (at least on the biological data he was interested in). I wonder whether there's room for crossover between the two fields.
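To make the problem concrete, here's a minimal sketch of what finding a maximum-edge biclique means. This is my own toy brute force, not Langston's algorithm (which I don't have details of); it just enumerates subsets of one side of a small bipartite graph and intersects their neighborhoods:

```python
from itertools import combinations

def max_edge_biclique(adj):
    """Brute-force maximum-edge biclique in a bipartite graph.

    adj maps each left-side vertex to the set of right-side vertices
    it is connected to. This is exponential in the size of the left
    side, so it only works on tiny examples -- practical heuristics
    like those in the talk are far more sophisticated.
    """
    best_left, best_right = set(), set()
    left = list(adj)
    for r in range(1, len(left) + 1):
        for subset in combinations(left, r):
            # Right-side vertices adjacent to *every* vertex in subset.
            common = set.intersection(*(adj[v] for v in subset))
            if len(subset) * len(common) > len(best_left) * len(best_right):
                best_left, best_right = set(subset), common
    return best_left, best_right

# Toy example: three left vertices and their right-side neighbors.
adj = {
    "a": {1, 2, 3},
    "b": {2, 3},
    "c": {2, 3, 4},
}
# {a, b, c} all share {2, 3}: a 3x2 biclique with 6 edges.
print(max_edge_biclique(adj))
```

Real instances from biology or query logs have thousands of vertices, which is exactly why fast heuristics for this NP-hard problem matter.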
The talk left me with a feeling I've had before when seeing computational biology talks. The field seems interesting, but to have a real impact, you need a lot of devotion to learning the underlying biology and making connections with working biologists. It doesn't seem like a good field for dabbling. So far, I haven't found myself ready to take the plunge and get involved. Luckily for me, I still have other interesting things to do.