No Significant Advantage

Don’t be misled by citation figures! (17)
by Ralf Neumann, Labtimes 04/2009

Journal Tuning

Once again, we’re not going to spin a fictional story about Professor Windigmann’s and Postdoc Slime’s ‘particularly sophisticated’ methods of citation picking. As in the previous issue, we have chosen the demystification of another common prejudice, namely, that “methodological articles are cited more frequently than ‘genuine’ research papers”.

Certainly, among the most-cited articles of modern science there are some that solely describe the development of new experimental methods or tools. Who, for example, would not immediately think of the 1970 Nature paper on SDS-polyacrylamide gel electrophoresis of proteins by Ulrich Laemmli? Even today, it is cited several hundred times per year. The same is true of the paper in which Marion Bradford described the first determination of protein quantities using Coomassie dye (Analytical Biochemistry vol. 72: 248-54, 1976).

No doubt, there are real “blockbusters” among the articles that, in particular, describe new methodology. And in terms of citations, these “blockbusters” do beat almost every paper containing “earthshaking” scientific insights.

On the other hand, however, don’t they really deserve it? In fact, there are quite a few people who think that new or, at least, significantly improved methods are particularly at the heart of scientific advance. It is new methodology in the first place, they say, that opens the door to experimental testing for a multitude of problems and theories.

Anyway, at least the Nobel Committee sometimes sees it this way: Frederick Sanger, Walter Gilbert, Georges Köhler, César Milstein, Kary Mullis, Michael Smith, Osamu Shimomura, Roger Tsien... they all received Nobel Prizes for the development of methods and tools such as protein and DNA sequencing, monoclonal antibodies, PCR, site-directed mutagenesis and green fluorescent protein.

However, can it be concluded that methodological papers have a basic advantage? Are they actually being cited more frequently on average than articles presenting ‘real’ research results?

Indeed, one might suspect that methodological articles often reach a wider audience than many research papers, which raise the interest of only a handful of researchers in their respective niches. From this, however, one could only deduce that very few methodological papers are cited rarely or not at all.

On the other hand, the facts indicate that only occasionally does a methodological paper skyrocket in terms of citation numbers. For example, each year Thomson Scientific lists the most-cited biomedical papers of the previous year – seldom does a methodological one rank among the top 25.

And that methodology-based articles are, on average, not cited more frequently than research papers is quite nicely indicated by the annual impact factors (IF) of purely methodological journals. What journals such as Methods in Microbiology, Electrophoresis, BioTechniques or the Journal of Immunological Methods have to offer in this respect is far from impressive. A short investigation by Lab Times revealed that, of sixty methodological journals, hardly one had an IF lower than 1 but, on the other hand, only five had an IF higher than 3. The "leader" was Nature Methods with an IF of 15.5, roughly the level of the Journal of Clinical Oncology or the Journal of Experimental Medicine.

Conclusion: Apparently, methodological papers are rarely never cited – but, equally, they are rarely cited frequently!

Last Changed: 03.05.2012
