Thursday, February 09, 2012

Citation Circles on FSP

Sorry I haven't blogged much.  Life's been busy. 

But here's an interesting post by FemaleScienceProfessor discussing the appearance of "citation circles": people forming groups to cite each other (when possible) in order to drive their citation numbers up.  The Web equivalent is the link farm, designed to boost a page's ranking in Web search.  It makes me wonder whether other tricks for improving search rank have analogues already in play among people trying to push their citation counts upward.  And whether the methods search engines use to detect link farms could (and should?) be used to correct citation counts in some way.
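To make the analogy concrete, here's a toy sketch (entirely mine, not an algorithm from the FSP post or from any actual search engine) of the simplest thing a citation-circle detector might look for: small groups of authors who all cite one another. The author names and the `mutual_citation_groups` helper are made up for illustration.

```python
from itertools import combinations

def mutual_citation_groups(cites, min_size=3):
    """Return maximal groups of authors who all cite one another.

    cites maps each author to the set of authors they cite.
    Brute force over all author subsets -- exponential, so this is
    only sensible for toy graphs; real detection would use graph
    clustering or spectral methods on the citation graph.
    """
    authors = list(cites)
    groups = []
    for size in range(min_size, len(authors) + 1):
        for group in combinations(authors, size):
            # A "circle": every member cites every other member.
            if all(b in cites.get(a, set())
                   for a in group for b in group if a != b):
                groups.append(set(group))
    # Report only maximal circles, not their sub-circles.
    return [g for g in groups if not any(g < h for h in groups)]
```

On a toy graph where authors a, b, and c all cite each other and d cites only a, this flags exactly the group {a, b, c}. Of course, mutual citation alone isn't proof of gaming -- small honest subfields also cite each other heavily -- so any real corrective scheme would have to weight this against the expected density of the field, which is exactly the kind of judgment link-farm detectors have to make too.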


Anonymous said...

Do we really care?

We look at citations that appear in interesting papers.

If we are lazy enough to count citations, maybe we should do something else...

Anonymous said...

Should not. The admins that need to evaluate faculty outside their own research area should come up with a better metric than h-index. Making it adversarial doesn't benefit anyone.

Anonymous said...

> Do we really care?

Deans, university and departmental rankings, etc do care a lot.

You don't have to care, but then it's you who will lose. (Unless deans and rankings stop caring.)

And they use citations because it's a simple metric that can be used to evaluate research without any knowledge of the subject area. That's why it's used by deans and by people from outside your area, even if it's not perfect.

Anonymous said...

If ranking is important for us (and for most of us it is) we can either fight a flawed ranking system or fight people cheating a flawed ranking system.

The advantage of fighting the system is that we get two flies in one go. The problem is that it's much harder to fight the system, since we have to fight the people who give us funding, which we're too lazy/afraid to do.

Michael Mitzenmacher said...

I'm with Anon #3. I think Anon #1 misses the point. There are plenty of places where people are "lazy" enough to count citations -- people not familiar with the field are one example, but even I myself use citation counts as a useful guide to quality. I know they're not completely accurate, but there are times when they're useful -- I may have a task where I need, as Anon #3 describes, a simple metric, and the task is such that I'm certainly not going to spend the time going through a half dozen or more papers to get a finer resolution.

Citation counts are useful to the extent that they're approximately accurate. If there's widespread gaming of the system, they're no longer a useful, albeit approximate, measure. It seems like a tragedy of the commons that should be avoided.

Jérémie said...

I feel like the post you're referring to mentions the formalization of an already widespread practice. You cite the papers you know - those of the people in your close community - rather than go out into the wild to find other relevant work.

On the other hand, a "link farm" means linking to pages that are not relevant to your web page, simply to artificially inflate their importance as perceived by the search engine.

So it seems like your parallel between the two notions is exaggerated. I think the larger woe at play here is that many researchers are lazy when it comes to doing bibliographical research, eventually achieving something akin to scientific inbreeding - and with tools as powerful as Google Scholar, that shouldn't be the case anymore.

The practice of abusing citations and citing work that isn't at all pertinent to your own article is another matter entirely. But it's such a dangerous proposition that I can't see it becoming widespread. (But I might be naïve, I guess.)