Last Friday I interviewed Scopus’ Amanda Spiteri, who kindly gave me a userid and password to Scopus. She talked about the importance of “who’s citing who,” as she put it, and the technical difficulties in getting accurate matching algorithms, particularly when scientific papers don’t contain the world’s best bibliographies. Scientists are more interested in doing research than in creating beautiful bibliographies. She told me that most purchasers are content with a 10-year backfile, something that surprised me. We also wondered about the importance, going forward, of high impact journals, particularly with the increasing importance of open access journals and institutional repositories, and the quality issues implicit in the retraction of scientific papers, even from well-known journals such as Science.
After our conversation, I began thinking about how difficult it is to compare citation products. There’s been a lot written comparing Scopus’ Citation Tracker with Thomson ISI’s Web of Science. A few throw Google Scholar into their reviews. Barbara Quint contributed a Newsbreak last January. Peter Jacso chimed in with a very thoughtful and well-researched article. Cheryl LaGuardia compared Scopus and WoS for Library Journal.
It seems to me you can compare subject coverage, date ranges, interface design, search functionality, number of journals, type of format (citation, abstract, full text), number of hits resulting from a search, and cost. What you can’t systematically determine is quality of results. For that, we only have anecdotal evidence. Here’s one small anecdote.
Assured by Amanda that searching for yourself isn’t entirely a vanity search – many academics do it in support of grant applications – I checked both Scopus and WoS to see if anyone had cited my articles in ONLINE. Interestingly, I found that one of my editorials was cited both by an article in Medical Teacher and one in Teaching and Learning in Medicine. This fascinated me. What had I said that was of interest to medical teachers? But I became even more intrigued when Scopus said the editorial cited was “Information Professionals as Technologists,” from the July-August 2002 issue of ONLINE, while WoS disagreed, telling me it was “Excising Information,” published in the May-June 2002 issue of ONLINE. At least they agreed on the year.
I popped over to Google Scholar to see what it had to say. You can’t do citation searching at Google Scholar, but I did want to know if the medical journals were included there. On my first try, I could find only one of the articles in question. I was searching on author name. When I searched on words in the article title, the other one surfaced. Then I realized, once again, that online searching requires constant relearning and recalibration. I’ve been so thoroughly trained over the years by ISI to disregard first names, since its citation files only use surnames and first initials, that I never thought to do the search any other way. To ISI, John Jones, Jane Jones, and James Jones are all JONES J. Not Google Scholar. I’d been searching for SB Issenberg, but in Google Scholar, he appears as S Barry Issenberg. Lesson learned.
But I still didn’t know which of my editorials had really been cited in those articles. To find out, I went to EBSCOhost’s Academic Search Elite. Both articles were included. Reading them, I determined that Scopus had the correct citation, Web of Science the incorrect one. I wouldn’t draw much in the way of conclusions from this, since it’s only one small anecdote, but it highlights how skeptical we information professionals need to be when searching for information from even the most reputable of producers.