The other day I was alerted to an interesting evaluation of international citation data. The author, Curt Rice, mentions a particular aspect of the data:

In 2000, 25% of Norwegian articles remained uncited in their first four years of life. By 2009, this had fallen to about 15%. This shows that the “bottom” isn’t pulling the average down. In fact, it’s raising it, making more room for the top to pull us even higher.

The context here is that the “bottom” refers to scientific articles that aren’t cited, the assumption being that a lack of citations indicates low scientific quality. Leaving aside the tenuous and hotly debated connection between citations and ‘quality’ (whatever that actually means*) in general, let’s look at just the specific rationale: that a decrease in the fraction of articles that go uncited should indicate an increase in the overall quality of research.

To make my case, I’ll take the perspective of someone who strongly believes that the rejection rate of a journal is indicative of that journal’s ‘quality’ (i.e., a high rejection rate ensures that only the world’s best science is being published). From that perspective, a decreasing fraction of papers remaining uncited is just as bad for science as decreasing rejection rates: surely, just as not every paper can be good enough to deserve publication, even fewer would be good enough to deserve citation? An increasing number of articles with any citations at all can thus only mean one thing: the Dunning-Kruger effect has come to science. We have now let so many incompetent people join the scientific enterprise that they can no longer tell good science from bad and cite even the very bottom of the scientific literature, as if it were worth paying attention to. As a consequence of the rise of these bulk-publishing new journals, which flood the scholarly literature with crap, articles get published that would never have gotten published before, and their authors, in their unwitting incompetence, cite each other out of sheer ignorance. With 80% of all submitted manuscripts being junk, clearing what passes for peer review these days ceases to be an indicator of quality; and with almost all papers being cited, citations have become useless, too.

This may sound outrageous to people who visit this obscure blog, but if you follow the links in the paragraph above, you’ll find precisely this kind of condescension and arrogance.

 

* Obviously, a more easily defensible relation would be that between citations and utility.
