
Metrics and Analytics, Research, Sociology

Naughty Twins and the Impact of Journals


Authors often assume that publishing their results in high-impact journals greatly increases the number of citations their articles will receive.

This assumption has been difficult to demonstrate empirically.  Highly-cited articles may appear in prestigious journals because of careful vetting and selection.  Alternatively, the prestige of the journal itself may drive the citations a paper receives.

Separating cause and effect is difficult.

Measuring the effect of the journal on the articles published within requires controlling for the intrinsic “quality” of the article, which is not easy to do since each journal contains a set of unique articles.  Any analysis always comes down to comparing apples and oranges.

Publishing identical copies of articles in separate journals and observing their performance would solve this methodological problem and allow apples to be compared with apples.  But multiple publication is widely considered an unethical practice.  Most academic journals stipulate that submitted manuscripts must be original and not previously published (a policy known as the Ingelfinger Rule).

And yet, republication happens, and the outcome of these duplicate articles creates a natural experiment with which to answer the question of how much citation impact a journal exerts on its articles.

In their manuscript, “The impact factor’s Matthew effect: a natural experiment in bibliometrics,” released August 21st on the arXiv, Vincent Larivière and Yves Gingras, both at the University of Quebec in Montreal, analyzed 4,532 pairs of identical papers published in two different journals.  By comparing identical papers, the authors were able to control for article quality and focus on the citation effect of the journal. Their approach is no different than measuring the effect of environment on human development when identical twins are raised in separate households.

The 4,532 pairs of papers were identified in ISI’s Web of Science using automated software designed to match titles, first authors, and number of references.  On the spectrum from big-picture to granular, this was definitely a big-picture study.
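The matching step can be sketched in a few lines. This is a hypothetical reconstruction, not the authors’ actual software: the record field names (`title`, `first_author`, `n_refs`, `journal`) and the exact matching key are assumptions for illustration.

```python
# Hypothetical sketch: pair bibliographic records that share a normalized
# title, first author, and reference count but appear in different journals.
from collections import defaultdict

def normalize(text):
    """Lowercase and drop non-alphanumeric characters so minor
    punctuation/capitalization differences do not block a match."""
    return "".join(ch for ch in text.lower() if ch.isalnum())

def find_duplicate_pairs(records):
    """Group records by (title, first author, number of references);
    any group spanning two or more journals is a candidate duplicate pair."""
    groups = defaultdict(list)
    for rec in records:
        key = (normalize(rec["title"]),
               normalize(rec["first_author"]),
               rec["n_refs"])
        groups[key].append(rec)
    pairs = []
    for recs in groups.values():
        journals = {r["journal"] for r in recs}
        if len(recs) >= 2 and len(journals) >= 2:
            pairs.append(recs)
    return pairs

records = [
    {"title": "On Citation Twins", "first_author": "Doe, J.",
     "n_refs": 42, "journal": "Journal A"},
    {"title": "On citation twins", "first_author": "Doe, J.",
     "n_refs": 42, "journal": "Journal B"},
    {"title": "Unrelated Paper", "first_author": "Roe, R.",
     "n_refs": 10, "journal": "Journal A"},
]
print(len(find_duplicate_pairs(records)))  # one matched pair across two journals
```

A key built from three independent fields keeps false positives rare at this scale, which is presumably why a coarse, automated match was acceptable for a big-picture study.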

Larivière and Gingras discovered that articles published in the higher-impact journal received, on average, twice as many citations as those published in the lower-impact journal (11.9 vs. 6.3).  The percentage of uncited papers was also lower for the high-impact journals.

What is the cause of the different citation patterns?  The authors comment:

We know that many scientists look up the impact factors of journals in order to choose where to publish; they also tend to read journals with high impact factors in their own field and thus the papers in these journals become more visible.

In their manuscript, the authors imply that unethical authors are the cause of the natural experiment, submitting their work to multiple journals simultaneously.  This was not the case for Emerald Publishing, which engaged in the systematic republication of hundreds of articles in its journals over several decades.

We also do not know if any of these republished article-pairs included planned statements of republication (such as “originally published in XXX”) with the result that the citer would give credit to the original publication.  (In the case of Emerald’s republications, there were no such indications.)

Their manuscript will be published in a forthcoming issue of the Journal of the American Society for Information Science & Technology (JASIST).  The authors have no intention of republishing it elsewhere.


About Phil Davis

I am an independent researcher and publishing consultant specializing in the statistical analysis of readership and citation data. I am a former postdoctoral researcher in science communication and former science librarian. http://phil-davis.org/

Discussion

7 thoughts on “Naughty Twins and the Impact of Journals”

  1. A long time ago, there was a study of citations to NEJM papers within the normal citation window (1-3 years) after the New York City newspaper strike in the late 1970s. Comparing a cohort of papers published while the newspapers were printing to a cohort published during the strike, the authors found a significant difference in overall citation. Less press coverage = less citation. It’s a reminder that we publish into a media environment, now probably more so than ever, and that amplification creates awareness. Some journal brands translate better into amplification systems and are seen by the media more often. This might explain some of what these researchers found.

    You can’t cite what you don’t know about.

    Posted by Kent Anderson | Aug 26, 2009, 12:42 pm
  2. Kent, good analysis, but I believe there is a lot of crowd mentality in citing articles, with the result that only a single version of an article gets cited most of the time. Over time, at least some authors see a reference cited in a particular context and just cite it in their own work without bothering to read the original.

    Posted by Aravind Akella | Aug 26, 2009, 2:45 pm
  3. I don’t see why this should be a surprise to anyone. It’s perfectly obvious that a “high impact” journal (whatever that is) is going to be read and cited more than a “low impact” journal. That’s the definition of impact. We hardly need a scholarly study to know that.

    Posted by Commentarius | Aug 27, 2009, 1:46 pm
  4. Pretty alarming to think there are 4,532 duplicate papers in the system!

    Posted by Stuart | Aug 28, 2009, 6:21 am

Trackbacks/Pingbacks

  1. Pingback: Impact factors and citation rates: A natural experiment with unethical “twins” | materialsdave.com - Aug 26, 2009

  2. Pingback: Bibliometria » Blog Archive » La primogenitura y la lengua de las gemelas de Mateo - Aug 31, 2009

  3. Pingback: Impact Factors — A Self-fulfilling Prophecy? « The Scholarly Kitchen - Jun 9, 2010
