The Open Citation Project - Reference Linking and Citation Analysis for Open Archives

See also Papers produced by the project

The effect of open access and downloads ('hits') on citation impact: a bibliography of studies

Latest articles on OA impact
This is an external feed from an unmoderated, multi-contributor source. Not all entries will appear in the classified bibliography
Find your way through the bibliography
Selected topic ALERTboxes: EXTENDED OA impact biblio rapid reader | NEW Reviews of OA impact studies
Latest additions
Studies with original data
Web tools for measuring impact | Comparative reviews
Background
The financial imperative: correlating research access, impact and assessment | Citation analysis, indexes and impact factors | Open access

Last updated 6 December 2010; first posted 15 September 2004. Please email additions, corrections or comments to Steve Hitchcock.

This bibliography cited in support of Lincoln University Research Committee policy of 'universal practice' for repository deposit, 30 March 2010

This bibliography cited in Times Higher Education Leader: Put all the results out in the open, 12 November 2009

A great resource! Eloy Rodrigues, University of Minho, via Twitter, Jul 17th 2009

This bibliography cited in support of Student Statement on The Right to Research (2009)

This bibliography cited to support the Stanford University School of Education Open Access Motion
Open Access Policy Resources
Q&A behind the Stanford OA policy (29 June 2008)

"incredible resource"
OA Librarian blog, Citation Impact Bibliography Resource (December 7, 2005)

"excellent bibliography"
Peter Suber, American Scientist Open Access Forum (28 September 2005) http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/4804.html

"This ongoing chronological bibliography may be worth bookmarking and checking every few months. There's very little annotation, but it's a good brief bibliography on a narrow - but important - subject."
Walt Crawford, Cites & Insights (November 2004, p13) http://citesandinsights.info/civ4i13.pdf

Citations of this bibliography found by Google Scholar
Web pages that link to this bibliography found by Google

Introduction to the bibliography

Despite significant growth in the number of research papers available through open access, principally through author self-archiving in institutional archives, it is estimated that only c. 20% of papers published annually are open access. It is up to the authors of papers to change this. Why might open access benefit authors? One universally important factor for all authors is impact, typically measured by the number of times a paper is cited (some older studies have estimated monetary returns to authors from article publication via the role citations play in determining salaries). Recent studies have begun to show that open access increases impact. More studies and more substantial investigations are needed to confirm the effect, although a simple example below illustrates it.

This chronological bibliography is intended to describe progress in reporting these studies; it also lists the Web tools available to measure impact. It is a focused bibliography, on the relationship between impact and access. It does not attempt to cover citation impact, or other related topics such as open access, more generally, although some key papers in these areas are listed as jump-off points for wider study.

Latest additions to the bibliography

OA impact biblio rapid reader

Top five most-cited papers from this bibliography, as measured by Google Scholar

1 Lawrence, S., Free online availability substantially increases a paper's impact, Nature, 31 May 2001 GS Biblio

2 Harnad, S. and Brody, T., Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals, D-Lib Magazine, Vol. 10 No. 6, June 2004 GS Biblio

3 Antelman, K., Do Open-Access Articles Have a Greater Research Impact? College and Research Libraries, 65(5):372-382, September 2004 GS Biblio

4 Eysenbach, G., Citation Advantage of Open Access Articles, PLoS Biology, Volume 4, Issue 5, May 2006 GS Biblio

5 Harnad, S., et al., The Access/Impact Problem and the Green and Gold Roads to Open Access: An Update, Serials Review, Vol. 34, No. 1, March 2008, 36-40 GS Biblio

Also highly cited

Hajjem, C., et al., Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases Research Citation Impact, IEEE Data Eng. Bull., Vol. 28, No. 4, Dec. 2005 GS Biblio

Brody, T., et al., Earlier Web Usage Statistics as Predictors of Later Citation Impact, JASIST, Vol. 57, No. 8, 2006 GS Biblio

Kurtz, M. J., et al., The Effect of Use and Access on Citations, Information Processing and Management, 41 (6), Dec. 2005 GS Biblio

McVeigh, M. E., Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns, Thomson Scientific, Oct. 2004 GS Biblio

@ 18 October 2010

A note on Scintilla links, where added. Scintilla aggregates Web sources, including blogs and shared bookmarks, as well as PubMed data and citations for a given paper. The service is provided by nature.com, and it works best, but not exclusively, for medical and science papers, especially those indexed in PubMed.

Added 6 December 2010 Davis, P. (2010)
Does Open Access Lead to Increased Readership and Citations? A Randomized Controlled Trial of Articles Published in APS Journals
The Physiologist, 53 (6), December 2010
Extracts. Introduction: In order to isolate the effect of access on readership and citations, we conducted a randomized controlled trial of open access publishing on articles published electronically in 11 APS journals. This report details the findings three years after the commencement of the experiment. Discussion: The results of this experiment suggest that providing free access to the scientific literature may increase readership (as measured by article downloads) and reach a larger potential audience (as measured by unique visitors), but have no effect on article citations. These results are consistent with an earlier report of the APS study after one year and the results of other scientific journals after two years. The fact that we observe an increase in readership and visitors for Open Access articles but no citation advantage suggests that scientific authors are adequately served by the current APS model of information dissemination, and second, that the additional readership is taking place outside this core research community.

Comment on this paper:
From ASIST SIGMETRICS email listserv, Re: Davis study (was: Open access publishing, article downloads and citations at 3years), from 23 November 2010. Use the Next in Topic link to follow the debate. Extracts:
Davis, P. (the author): Critics of our open access publishing experiment (read: Stevan Harnad) have expressed skepticism that we were too eager to report our findings and should have waited between 2 and 3 years. All of the articles in our study have now aged 3-years and we report that our initial findings were robust: articles receiving the open access treatment received more article downloads but no more citations.
Harnad, S., Phil Davis's dissertation results are welcome and interesting, and include some good theoretical insights, but insofar as the OA Citation Advantage is concerned, the empirical findings turn out to be just a failure to replicate the OA Citation Advantage, almost certainly due to the small sample size as well as the short time-span.
Davis, P., Your new interest in sample sizes implies -- although you don't seem willing to admit -- that an OA citation advantage is much, much smaller than initially reported. Early studies (including yours) estimated the citation effect to be somewhere between 50% and 500% -- ranges that should be easily detectable with smaller sample sizes such as our study. By focusing on the fact that I do not have the statistical power to detect very small differences is really an admission that an OA citation advantage -- if one truly exists -- can be largely explained by other theories (e.g. self-selection) and that the part attributable to free access is very small indeed.
Harnad, S., I've always been interested in sample sizes. That's why all of our studies have been based on samples that have been orders of magnitude bigger than (for example) yours. But let's not confuse effect-size and the sample-size needed to detect a statistically significant effect; that's not a question about effect size but about variability. The size of the OA citation advantage does indeed vary considerably (from field to field, year to year, and sample to sample). Overall, across all fields of scientific and scholarly research produced by universities and funded by funders, that adds up to a sizeable benefit to research, researchers, their institutions, their funders, and the public that funds the funders and for whose benefit the research is being done, and funded -- a benefit that is worth having, by mandating OA. That implication is very clear -- and it certainly is not the implication you cite in your December summary in the APS house journal, The Physiologist: "The fact that we observe an increase in readership and visitors for Open Access articles but no citation advantage suggests that scientific authors are adequately served by the current APS model of information dissemination." What your findings show is that there was no OA citation advantage in your (small) sample. Point taken. But the interpretation is a mighty stretch, if not an exercise in APS spin.
Waltman, L., Phil kindly provided me with the data of his study. In my view, Phil has convincingly shown that, at least for the journals and the time intervals he studied, there is no meaningful OA citation advantage but also (see later post by Waltman) that an OA citation advantage of reasonable size is unlikely to exist in the underlying population.
Harnad, S., It depends on what you mean by "the underlying population." If you mean that general population of articles, in multiple different fields, tested by multiple independent investigators, reporting a significant (and sometimes quite sizeable) OA citation advantage (with the exception of a very small number of negative or null outcomes), then Phil's study certainly has *not* shown "that an OA citation advantage of reasonable size is unlikely to exist" in *that* underlying population. It would take a null meta-analysis, not just one null outcome, to be able to show that. (Otherwise any repeatedly observed effect could be dismissed on the basis of one non-replication!) We too did a test of the self-selection hypothesis -- on a much larger sample across more fields and a longer time interval -- and we not only found "an OA citation advantage of reasonable size" for self-selected OA, but we found that the advantage was the same size for mandated OA. We accordingly conclude that "an OA citation advantage of reasonable size is likely to exist in the underlying population" if you test for it, and your sample is big enough and long enough -- except perhaps in Phil's sample of (mostly APS) journals...
Davis, P., Let's move to your new concern about generalizability: While I can't claim negative results across all fields and across all times, our randomized controlled trials (RCTs) did involve 36 journals produced by 7 different publishers in the medical, biological, and multi-disciplinary sciences, plus the social sciences and humanities. Yet, if you are basing your comparison solely on number of journals and number of articles, then you are completely missing the rationale for conducting the RCTs in the first place
Harnad, S., The many published comparisons are based on comparing OA and non-OA articles within the same journal and year. That's the studies that are simply testing whether there is an OA citation advantage. But the comparison you are talking about is the comparison between self-selected and imposed OA. Your study has shown that in your sample (consisting of OA imposed by randomization), there is no OA citation advantage (only an OA download advantage). But it has not shown that there is any OA self-selection advantage either. Without that, there is only the non-replication of the OA citation advantage.

Added 6 December 2010 Davis, P. (2010)
Access, Readership, Citations: A Randomized Controlled Trial Of Scientific Journal Publishing
eCommons@Cornell, 20 October 2010
From the abstract: This dissertation explores the relationship of Open Access publishing with subsequent readership and citations. It reports the findings of a randomized controlled trial involving 36 academic journals produced by seven publishers in the sciences, social sciences and humanities. At the time of this writing, all articles have aged at least two years. Articles receiving the Open Access treatment received significantly more readership (as measured by article downloads) and reached a broader audience (as measured by unique visitors), yet were cited no more frequently, nor earlier, than subscription-access control articles. A pronounced increase in article downloads with no commensurate increase in citations to Open Access treatment articles may be explained through social stratification, a process which concentrates scientific authors at elite, resource-rich institutions with excellent access to the scientific literature. For this community, access is essentially a non-issue.

Added 17 Feb 2010, updated 18 Oct 2010 Gargouri, Y., Hajjem, C., Larivière, V., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2010)
Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research
PLoS ONE 5(10): e13636, October 18, 2010, doi:10.1371/journal.pone.0013636
Also in ECS EPrints, 10 Feb 2010, http://eprints.ecs.soton.ac.uk/18493/ (this version includes the paper and full supplemental materials, including further analyses and responses to comments and feedback), and in arXiv, arXiv:1001.0361v2 [cs.CY], 3 Jan 2010
Abstract
Background. Articles whose authors have supplemented subscription-based access to the publisher's version by self-archiving their own final draft to make it accessible free for all on the web (Open Access, OA) are cited significantly more than articles in the same journal and year that have not been made OA. Some have suggested that this OA Advantage may not be causal but just a self-selection bias, because authors preferentially make higher-quality articles OA. To test this we compared self-selective self-archiving with mandatory self-archiving for a sample of 27,197 articles published 2002-2006 in 1,984 journals.
Methodology/Principal Findings. The OA Advantage proved just as high for both. Logistic regression analysis showed that the advantage is independent of other correlates of citations (article age; journal impact factor; number of co-authors, references or pages; field; article type; or country) and highest for the most highly cited articles. The OA Advantage is real, independent and causal, but skewed. Its size is indeed correlated with quality, just as citations themselves are (the top 20% of articles receive about 80% of all citations).
Conclusions/Significance. The OA advantage is greater for the more citable articles, not because of a quality bias from authors self-selecting what to make OA, but because of a quality advantage, from users self-selecting what to use and cite, freed by OA from the constraints of selective accessibility to subscribers only. It is hoped that these findings will help motivate the adoption of OA self-archiving mandates by universities, research institutions and research funders.

Comment on this paper:

Added 6 Dec 2010 Comments below refer to the PLoS ONE publication
Howard, J., Is There an Open-Access Citation Advantage? The Chronicle of Higher Education, 19 Oct 2010:
Seeks to reignite the earlier debate around the preprint of the new PLoS ONE paper. And succeeds - see reader responses attached to the article, some extracts below.
Harnad, S., Correlation, Causation, and the Weight of Evidence, Open Access Archivangelism, 20 Oct 2010: One can only speculate on the reasons why some might still wish to cling to the self-selection bias hypothesis in the face of all the evidence to date. The straightforward causal relationship is the default hypothesis, based on both plausibility and the cumulative weight of the evidence. Hence the burden of providing counter-evidence to refute it is now on the advocates of the alternative.
sk_griffhoven, October 20, 2010: The authors were unable to control for institutional effects in their model. While deposit mandates might be responsible for the results they report, they might not, and I don't see how mandates would outperform self-selection. Most importantly, there is no basis for making a causal claim. I agree with Phil Davis: the authors greatly overstate their results.
stevanharnad, October 21, 2010: The causal claim is not that mandated OA out-performs self-selected OA, but that self-selected OA does *not* out-perform mandated OA, hence OA is causal.
signofthefourwinds, October 21, 2010: It doesn't make sense that researchers suddenly give up their habit of consulting certain databases to go search in Google Scholar. What user behavior change accounts for the OA advantage?
stevanharnad, October 22, 2010: The OA Advantage is not just, or primarily, a convenience or laziness effect (though some of that no doubt contributes to it too): It is not that scholars have become sloppy, relying on google scholar instead of consulting more established databases. It is that when their institution cannot afford access to articles they need, they must make do with only those of them that they can access for free online.
Patrick Chardenet, October 22, 2010: I don't think that the problem is to know if Open Access reinforces or not the number of citations. It is rather a question of knowing if the measurement of science by the measurement of the number of citations has an interest for the scientific development.
stevanharnad, October 22, 2010: In a nutshell, citations are not the goal of research; the goal is that the research should be read, used and built upon, in further research and applications. And citations are a measure of that. But for research to be read, used and built upon, it has to be accessible. That is why and how OA increases citations.
Fenner, M., New in PLoS ONE: Citation rates of self-selected vs. mandated Open Access, PLoS Blogs, Gobbledygook, 19 Oct 2010: "I feel that the paper comes a little short. Yes, they did a very detailed analysis of the citation behavior, and take into account important cofactors. But the reader is left with the impression that mandatory self-archiving of post-prints in institutional repositories is the only reasonable Open Access strategy, and the introduction and discussion accordingly leave out some important arguments." Notable for the substantive discussion that follows between the blogger (Fenner) and one of the principal authors of the paper (Harnad).
Harnad, S., Comparing OA and Non-OA: Some Methodological Supplements, Open Access Archivangelism, 19 Oct 2010. Responds to Fenner: "If we have given "the impression that mandatory self-archiving of post-prints in institutional repositories is the only reasonable Open Access strategy," then we have succeeded in conveying the implication of our findings."

Comments below refer to the preprint
Davis, P., Does a Citation Advantage Exist for Mandated Open Access Articles? the scholarly kitchen, Jan 7, 2010; "Gargouri reports that institutionally-mandated OA papers received about a 15% citation advantage over self-selected OA papers, which seems somewhat counter-intuitive. If better articles tend to be self-archived, their reasoning goes, we should expect that papers deposited under institution-wide mandates would under-perform those where the authors select which articles to archive. The authors of this paper deal, rather unscientifically, with this inconvenient truth with a quick statistical dismissal: that their finding 'might be due to chance or sampling error'. In sum, this paper tests an interesting testable hypothesis on whether mandatory self-archiving policies are beneficial to their authors in terms of citations. Their unorthodox methodology, however, results in some inconsistent and counter-intuitive results that are not properly addressed in their narrative."
The following were among the comments added to the above blog by Davis.
Harnad, S., Jan 7, 2010: "Mandated OA Advantage? Yes, the fact that the citation advantage of mandated OA was slightly greater than that of self-selected OA is surprising, and if it proves reliable, it is interesting and worthy of interpretation. We did not interpret it in our paper, because it was the smallest effect, and our focus was on testing the Self-Selection/Quality-Bias hypothesis, according to which mandated OA should have little or no citation advantage at all, if self-selection is a major contributor to the OA citation advantage."
Gaule, P., Jan 7, 2010: "the paper does not appear to include controls for institutions of the authors of the control sample. This is particularly worrisome when comparing papers originating from CERN which arguably does cutting edge physics to the control papers. The key issue in this paper seems to be interpreting the mandated open access versus self-selected open access. The authors find and point out that the mandates actually result in compliance of around 60%. However, they have little to say on what is going on here and why papers end up in the compliant group or not. I am not sure what conclusions can be inferred from this comparison of two types of self-selection, at least one of which is not well understood."
Harnad, S., Jan 8, 2010: "(2) THE SPECIAL CASE OF CERN: With so few institutional mandates, it's not yet possible to control for institutional quality. But CERN is indeed a special case; when it is removed, however, it does not alter the pattern of our results. (3) SELECTIVE COMPLIANCE? Mandate compliance is not yet 100%, so some form of self-selection still remains a logical possibility, but we think this is made extremely improbable when there is no decline in the OA Advantage even when mandates quadruple the OA rate from the spontaneous self-selective baseline of 15% to the current mandated average of 60%."
Schneider, J. W., Jan 8, 2010: "the claim of causality seems well beyond the mark. Neither former research nor the current regression design permits any casual claims."
Harnad, S., Jan 9, 2010: "CAUSALITY: We agree that causality is difficult to demonstrate with correlational statistics. However, we note that the hypothesis that (2a) making articles open access causes them to be more citeable and the hypothesis that (2b) being more citeable causes articles to be made open access are both causal hypotheses."
Harnad, S., Open Access: Self-Selected, Mandated & Random; Answers & Questions, Open Access Archivangelism, February 8, 2010: "What follows is what we hope will be found to be a conscientious and attentive series of responses to questions raised by Phil Davis about our paper (currently under refereeing) -- responses for which we did further analyses of our data (not included in the draft under refereeing)."

Added 18 October 2010 Herb, U. (2010)
OpenAccess Statistics: Alternative Impact Measures for Open Access documents? An examination how to generate interoperable usage information from distributed Open Access services
E-LIS, 25 Sep 2010. In: L'information scientifique et technique dans l'univers numérique. Mesures et usages. L'association des professionnels de l'information et de la documentation, ADBS, pp. 165-178
From the abstract: This contribution shows that most common methods to assess the impact of scientific publications often discriminate against Open Access publications, and by that reduce the attractiveness of Open Access for scientists. Assuming that the motivation to use Open Access publishing services (e.g. a journal or a repository) would increase if these services would convey some sort of reputation or impact to the scientists, alternative models of impact are discussed. Prevailing research results indicate that alternative metrics based on usage information of electronic documents are suitable to complement or to relativize citation-based indicators. Furthermore, an insight into the project Open Access Statistics (OA-S) is given. OA-S implemented an infrastructure to collect document-related usage information from distributed Open Access Repositories in an aggregator service in order to generate interoperable document access information according to three standards (COUNTER, LogEc and IFABC).

Comment on this paper:
Harnad, S., Needed: OA -- Not Alternative Impact Metrics for OA Journals, Open Access Archivangelism, September 26, 2010: The article by Herb "is predicated on one of the oldest misunderstandings about OA: that OA ≡ OA journals ("Gold OA") and that the obstacle to OA is that OA journals don't have a high enough impact factor"

Added 18 October 2010 Suber, P. (2008)
Thinking about prestige, quality, and open access
SPARC Open Access Newsletter, issue #125, September 2, 2008
Brief extracts. Here are a dozen thoughts or theses about prestige and OA. I start with the rough notion that if journal quality is real excellence, then journal prestige is reputed excellence. (1) Universities reward faculty who publish in high-prestige journals, and faculty are strongly motivated to do so.  If universities wanted to create this incentive, they have succeeded. If journal prestige and journal quality can diverge, then universities and funders may be giving authors an incentive to aim only for prestige. If they wanted to create an incentive to put quality ahead of prestige, they haven't yet succeeded. (8) Universities tend to use journal prestige and impact as surrogates for quality. The excuses for doing so are getting thin.

Added 18 October 2010 Willinsky, J. (2010)
Open access and academic reputation
NISCAIR Online Periodicals Repository (NOPR), Annals of Library and Information Studies, 57 (3), Sep 2010, 296-302
Abstract: Open access aims to make knowledge freely available to those who would make use of it. High-profile open access journals, such as those published by PLoS (Public Library of Science), have been able to demonstrate the viability of this model for increasing an author's reach and reputation within scholarly communication through the use of such bibliographic tools as the Journal Impact Factor, conceived and developed by Eugene Garfield. This article considers the various approaches that authors, journals, and funding agencies are taking toward open access, as well as its effect on reputation for authors and, more widely, for journals and the research enterprise itself.

Simple and practical example

Citation analysis is specialised and difficult. Making the case for, or against, a claim such as 'open access increases impact' asks a lot of the reader, who may not be a specialist but who wants to understand the point at issue and decide whether it has any relevance to him or her. The following simple example is included for this reason: not as proof, but as evidence of the effect within a particular domain. Draw your own conclusions, and then read the more detailed evidence of the bibliography if you are still interested.

"Measuring the effect for physics or astronomy is easy. This link returns the number of articles published in the Astrophysical Journal in 2003 and their number of citations.

"This next link shows the number of these papers which are available OA in the arXiv, and their citations.

"The result is that 75% of the papers are in the arXiv, and they represent 90% of the citations, a 250% OA effect.

"By replacing ApJ with the mnemonic for any other physics or astronomy journal one can repeat the measurement; for Nuclear Physics A (NuPhA) one gets that 32% of the articles are in the arXiv, and they represent 78% of the citations, a 740% OA effect."
From Michael Kurtz, American Scientist Open Access Forum, 28 September 2005 http://users.ecs.soton.ac.uk/harnad/Hypermail/Amsci/4807.html
Note, the database links are 'live', i.e. they return the current database figures, not the exact figures on which Michael Kurtz would have based his calculations, but the percentages quoted are unlikely to change dramatically, in the short term at least.

Elucidation of calculation (by Stevan Harnad, figures valid on 22 July 2007)
For ApJ:
TOT: articles 2592 citations 70732
Arx: articles 1943 citations 62586 c/a 32.21 (rounded to 32)
Non: articles 649 citations 8146 c/a 12.55 (rounded to 13)
Then 32/13 = 2.5 (250%)

For NuPhA:
TOT: articles 1134 citations 4451
Arx: articles 344 citations 3225 c/a 9.375
Non: articles 790 citations 1226 c/a 1.552
Then 9.375/1.552 = 6.04 (600%)

Michael Kurtz comments: "The differences in (NuPhA: 740% to 600% effect) results are because the database has changed over the past two years since I did it. There is a systematic error in the calculations for Nuclear Physics A (Elsevier does not give us the references) so the results will be higher than the true value. Physical Review C (Nuclear Physics) has an OA advantage number of 221%, the systematic in this case is small and in the other direction."
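The arithmetic behind these percentages can be reproduced with a short script. This is an illustrative sketch using only the figures quoted in the elucidation above; note that the elucidation rounds the citations-per-article (c/a) figures to whole numbers before dividing (32/13 gives 2.5), whereas the unrounded ApJ ratio comes out closer to 2.57:

```python
# Recompute the citations-per-article (c/a) ratios behind the quoted
# 'OA effect' percentages, using the database figures valid on 22 July 2007.
def oa_effect(arx_articles, arx_cites, non_articles, non_cites):
    """Return c/a for arXiv papers, c/a for non-arXiv papers, and their ratio."""
    arx_ca = arx_cites / arx_articles
    non_ca = non_cites / non_articles
    return arx_ca, non_ca, arx_ca / non_ca

# Astrophysical Journal, 2003: 1943 arXiv articles (62586 citations),
# 649 non-arXiv articles (8146 citations)
apj = oa_effect(1943, 62586, 649, 8146)

# Nuclear Physics A: 344 arXiv articles (3225 citations),
# 790 non-arXiv articles (1226 citations)
nupha = oa_effect(344, 3225, 790, 1226)

print("ApJ:   c/a arXiv %.2f, non-arXiv %.2f, OA effect %.0f%%"
      % (apj[0], apj[1], apj[2] * 100))
print("NuPhA: c/a arXiv %.3f, non-arXiv %.3f, OA effect %.0f%%"
      % (nupha[0], nupha[1], nupha[2] * 100))
```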

Studies with original data

Highlights

Reviews of OA impact studies

Reviews and summaries of reported studies on the Open Access citation advantage

Wagner, A. B., Open Access Citation Advantage: An Annotated Bibliography, Issues in Science and Technology Librarianship, No. 60, Winter 2010

Swan, A., The Open Access citation advantage: Studies and results to date, ECS EPrints, 17 Feb 2010

Davis, P., Studies on access: a review, arXiv:0912.3953v1 [cs.DL], 20 Dec 2009

Mertens, S., Open Access: Unlimited Web Based Literature Searching, Dtsch Arztebl Int., 106(43): October 23, 2009, 710-712

Craig, I. D., Plume, A. M., McVeigh, M. E., Pringle, J. and Amin, M., Do Open Access Articles Have Greater Citation Impact? A critical review of the literature, Publishing Research Consortium, undated (announced 17 May 2007), Journal of Informetrics, 1 (3): 239-248, July 2007

and some practical advice on how to respond to the OA citation advantage

Bowering Mullen, L., Increasing Impact of Scholarly Journal Articles: Practical Strategies Librarians Can Share, Electronic Journal of Academic and Special Librarianship, Spring 2008

Lawrence (2001) was the first to publish data recognising the trend for online publication to increase impact, confirmed for open access papers by the work of the Open Citation Project based on arXiv (e.g. Harnad and Brody, D-Lib, 2004), and by Kurtz et al. (2004a, 2003a) looking at the NASA Astrophysics Data System. Commenting on Harnad and Brody (D-Lib, 2004) in Open Access News, Peter Suber said:
This is an important article. It's the first major study since the famous Lawrence paper documenting the proposition that OA increases impact. It's also the first to go beyond Lawrence in scope and method in order to answer doubts raised about his thesis. By confirming that OA increases impact, it gives authors the best of reasons to provide OA to their own work (21 June 2004)
Broader collaborations have emerged to extend these findings (e.g. Brody et al. 2004).

Open access has become feasible because of the move towards online publication and dissemination. A new measure that becomes possible with online publication is the number of downloads or 'hits', opening a new line of investigation. Brody et al. have been prominent in showing there is a correlation between higher downloads and higher impact, particularly for high impact papers, holding out the promise not just for higher impact resulting from open access but for the ability to predict high impact papers much earlier, not waiting years for those citations to materialise (e.g. Brody and Harnad 2005). The effect can be verified with the Correlation Generator (below).
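The download/citation correlation underlying this line of work can be sketched in a few lines. The data below are purely hypothetical, for illustration only (the actual studies, e.g. Brody and Harnad 2005, used arXiv download logs and citation databases); the point is simply that a strong positive correlation between early downloads and later citations is what makes downloads usable as an early predictor of impact:

```python
# Pearson correlation between early download counts and later citation
# counts for a set of papers (all figures below are made up for illustration).
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical sample: downloads in the first six months, and citations
# accrued two years later, for ten papers.
downloads = [120, 45, 300, 80, 15, 210, 60, 150, 25, 95]
citations = [10, 3, 28, 6, 1, 18, 5, 12, 2, 8]

r = pearson(downloads, citations)
print("Pearson r = %.2f" % r)  # strongly positive for this made-up sample
```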

(Note. The latest listings might include preprints, or even pre-preprints. This area of study is effectively a work in progress, and as such the list is intended to raise awareness of the most recent results, even where these may not be definitive or final versions. Check back for definitive versions.)

Added 6 September 2010 Zawacki-Richter, O., Anderson, T. and Tuncay, N. (2010)
The Growing Impact of Open Access Distance Education Journals: A Bibliometric Analysis
The Journal of Distance Education / Revue de l'Éducation à Distance, 24 (3), 2010
From the Abstract: we examine 12 distance education journals (6 open and 6 published in closed format by commercial publishers). Using an online survey completed by members of the editorial boards of these 12 journals and a systematic review of the number of citations per article (N = 1,123) and per journal issue between 2003 and 2008, we examine the impact and perceived value of the 12 journals. We then compute differences between open and closed journals. The results reveal that the open access journals are not perceived by distance education editors as significantly more or less prestigious than their closed counterparts. The number of citations per journal and per article also indicates little difference. However, we note a trend towards more citations per article in open access journals. Articles in open access journals are cited earlier than in non-open access journals.

Added 6 September 2010 Kim, J. (2010)
Faculty self-archiving: Motivations and barriers
Journal of the American Society for Information Science and Technology, 16 Jul 2010
info:doi/10.1002/asi.21336
This paper is broader than open access impact, but one part of the investigation looked at it.
From the Discussion: A few interviewees did believe that self-archiving resulted in their research work being cited more frequently, although 13 interviewees were unsure about the positive relationship between self-archiving and the citation rate. Professors even considered self-archiving to serve other purposes, for example, to recruit graduate students, or to find collaborators, instead of increasing the impact of research. In fact, five interviewees expressed uncertainty regarding whether self-archiving would improve professional recognition. Four other interviewees did not expect self-archiving to increase academic recognition, as they believed this related more to the quality of research itself, rather than merely making it publicly accessible. These findings suggested that the majority of faculty participants in this study were unaware of the evidence of a citation advantage from OA previously identified by several studies. Without noticing the evidence, professors tend not to expect a citation advantage from self-archiving; however, they see benefits from the user side through self-archiving. This study shows that faculty have diverse opinions about citation rates and academic recognition related to self-archiving.

Added 6 September 2010 Strotmann, A. and Zhao, D. (2010)
Impact of Open Access on stem cell research: An author co-citation analysis
76th IFLA General Conference and Assembly, Gothenburg, Sweden, 22 Jun 2010
Abstract: We explore the impact of Open Access (OA) on stem cell research through a comparison of research reported in OA and in non-OA publications. Using an author co-citation analysis method, we find that (a) OA and non-OA publications cover similar major research areas in the stem cell field, but (b) a more diverse range of basic and medical research is reported in OA publications, while (c) biomedical technology areas appear biased towards non-OA publications. From the Introduction: many studies have investigated whether OA publication of research results has a positive effect on the citation ranking of those publications ... we approach the comparison between OA and non-OA publishing of research results from a somewhat different perspective. We explore whether there are substantial differences between the intellectual structure of a research field when viewed from either the point of view of the OA publications in that field or from that of its non-OA publications.

Added 6 September 2010 Jacques, T. S. and Sebire, N. J. (2010)
The impact of article titles on citation hits: an analysis of general and specialist medical journals
JRSM Short Reports, 1 (1), 2, 01 Jun 2010
info:doi/10.1258/shorts.2009.100020
More factors to consider in citation impact assessment.
From the Abstract: We hypothesized that specific features of journal titles may be related to citation rates. We reviewed the title characteristics of the 25 most cited articles and the 25 least cited articles published in 2005 in general and specialist medical journals including the Lancet, BMJ and Journal of Clinical Pathology. The title length and construction were correlated to the number of times the papers have been cited to May 2009. Results The number of citations was positively correlated with the length of the title, the presence of a colon in the title and the presence of an acronym. Factors that predicted poor citation included reference to a specific country in the title. Conclusions These data suggest that the construction of an article title has a significant impact on how frequently the paper is cited. We hypothesize that this may be related to the way electronic searches of the literature are undertaken.

Added 9 June 2010 Herb, U. (2010)
Alternative Impact Measures for Open Access Documents? An examination how to generate interoperable usage information from distributed open access services
76th IFLA General Conference and Assembly, Gothenburg, Sweden, August 2010, paper available online 29 May 2010
From the Abstract: This contribution shows that the most common methods of assessing the impact of scientific publications often discriminate against open access publications, and thereby reduce the attractiveness of Open Access for scientists. Assuming that the motivation to use open access publishing services (e.g. a journal or a repository) would increase if these services conveyed some sort of reputation or impact to the scientists, alternative models of impact are discussed. Prevailing research results indicate that alternative metrics based on usage information for electronic documents are suitable to complement or to relativize citation-based indicators.

Added 9 June 2010 Giglia, E. (2010)
The Impact Factor of Open Access journals: data and trends
DHANKEN, digital repository of HANKEN research, 27 May 2010. In 14th International Conference on Electronic Publishing, Helsinki, 16-18 June 2010. Slides in E-LIS, 21 June 2010 http://eprints.rclis.org/18669/
From the Abstract: The aim of this preliminary work, focused on Gold Open Access, is to test the performance of Open Access journals against the most traditional bibliometric indicator, the Impact Factor, to verify the hypothesis that unrestricted access might turn into more citations and therefore also good Impact Factor indices. Open Access journals are relatively new actors in the publishing market, and gaining reputation and visibility is a complex challenge. Some of them show impressive Impact Factor trends since their first year of tracking.

Added 17 May 2010 Calver, M. C. and Bradley, J. S. (2010)
Patterns of Citations of Open Access and Non-Open Access Conservation Biology Journal Papers and Book Chapters
Conservation Biology, published online: 23 Apr 2010
From the abstract: We compared the number of citations of OA and non-OA papers in six journals and four books published since 2000 to test whether OA increases number of citations overall and increases citations made by authors in developing countries. After controlling for type of paper (e.g., review or research paper), length of paper, authors' citation profiles, number of authors per paper, and whether the author or the publisher released the paper in OA, OA had no statistically significant influence on the overall number of citations per journal paper. Journal papers were cited more frequently if the authors had published highly cited papers previously, were members of large teams of authors, or published relatively long papers, but papers were not cited more frequently if they were published in an OA source. Nevertheless, author-archived OA book chapters accrued up to eight times more citations than chapters in the same book that were not available through OA, perhaps because there is no online abstracting service for book chapters. There was also little evidence that journal papers or book chapters published in OA received more citations from authors in developing countries relative to those journal papers or book chapters not published in OA. For scholarly publications in conservation biology, only book chapters had an OA citation advantage, and OA did not increase the number of citations papers or chapters received from authors in developing countries.

Comment on this paper:
Goldstein, R., Does open-access publishing increase future citations of a study? Conservation Maven blog, Apr 26, 2010: "The study however suffers from some limitations. Given that conservation is a highly applied science, a main goal of research is to inform conservation practice. It seems likely that open access publishing would be more important for conservation practitioners than researchers given that many research institutions have paid journal access for their staff and students. Another limitation of the study is that it looked at published research from as far back as the year 2000. However, the social landscape for disseminating information has changed dramatically over the last 5 - 10 years. Despite these limitations this study represents an important contribution to the debate about improving public accessibility to peer-reviewed research."
Responses to this blog:
Harnad, S.: "The study might indeed be evidence that there is no OA citation advantage in conservation biology, but it's more likely that the sample was simply too small."

Added 6 September 2010 Habibzadeh, F. and Yadollahie, M. (2010)
Are Shorter Article Titles More Attractive for Citations? Cross-sectional Study of 22 Scientific Journals
Croatian Medical Journal, 51 (2), April 2010
Open access is not the only factor affecting citation impact. Here is another factor that has received rather less attention. From the Abstract: Longer titles seem to be associated with higher citation rates. This association is more pronounced for journals with high impact factors. Editors who insist on brief and concise titles should perhaps update the guidelines for authors of their journals and have more flexibility regarding the length of the title.

Added 17 May 2010 Agerbæk, A. and Nielsen, K. (2010)
Factors in Open Access which Influence the Impact Cycle
ScieCom info, Vol 6, No 1, 2010 (issue notice posted 22 March 2010)
Short paper illustrating journal publishing flowcharts for non-open access (OA), gold OA and green OA, showing why, in principle, open access might lead to higher citations due to wider and earlier dissemination.

Added 17 May 2010 Wagner, A. B. (2010)
Open Access Citation Advantage: An Annotated Bibliography
Issues in Science and Technology Librarianship, No. 60, Winter 2010 (issue notice posted 16 March 2010)
The bibliography is divided into three sections:
- Review articles [5 reviews]
- Studies showing an open access citation advantage (OACA) [39 articles]
- Studies showing either no OACA effect or ascribing OACA to factors unrelated to OA publication [7 articles]
The following databases were searched ... results were cross-checked against an extensive, more general bibliography (this bibliography). It is interesting to note that no study has ever claimed that OA articles were cited less than TA articles. The research question still being debated is whether other factors explain the widely observed OACA (Open Access Citation Advantage) rather than the mere fact an article is open access.

Added 09 Mar 2010 Snijder, R. (2010)
The profits of free books - an experiment to measure the impact of Open Access publishing
Google sites, undated, but first spotted in the wild 23 February 2010. In Learned Publishing, Vol. 23, No. 4, October 2010, 293-301
Abstract: to measure the impact of Open Access (OA) publishing of academic books, an experiment was set up. During a period of nine months, three sets of books were disseminated through an institutional repository, the Google Book Search program or both channels. A fourth set was used as control group. Open Access publishing enhances discovery and online consultation. No relation could be found between OA publishing and citation rates. Contrary to expectations, OA publishing does not stimulate or diminish sales figures. The Google Book Search program is superior compared to the repository.

Comment on this paper:
Davis, P., When the Love of Books Doesn't Increase Sales or Citations, Scholarly Kitchen, Sep 29, 2010: seeks further explanations of some of the findings.

Added 09 Mar 2010 Swan, A. (2010)
The Open Access citation advantage: Studies and results to date
ECS EPrints, 17 Feb 2010
Abstract: presents a summary of reported studies on the Open Access citation advantage. There is a brief introduction to the main issues involved in carrying out such studies, both methodological and interpretive. The study listing provides some details of the coverage, methodological approach and main conclusions of each study.

Comment on this paper:
Ed., 11 March 2010: expertly moderated review with a telling tabulation of results.
Davis, P., Rewriting the History of the Open Access Debate, the scholarly kitchen blog, Mar 11, 2010: Swan's summary is quite useful for those looking for a distillation of the research literature.  What is troubling with this document, however, is the narrative, and it is here that Swan creates a historical revision of the open access debate. In her report, Swan provides a score sheet for the open access game:

     Studies finding a positive open access citation advantage = 27
     Studies finding no open access citation advantage (or an OA citation disadvantage) = 4

Meta-analysis is a set of powerful statistical techniques for analyzing the literature. Its main function is to increase the statistical power of observation by combining separate empirical studies into one über-analysis. It's assumed, however, that the studies are comparable. This is not the case with the empirical literature on open access and citations. Conducting a meta-analysis on this disparate collection of studies is like taking a Veg-O-Matic to a seven-course dinner. Not only does it homogenize the context (and limitations) of each study into a brown and unseemly mess, but it assumes that homogenization of disparate studies somehow results in a clearer picture of scientific truth.
Responses to this blog:
Swan, A., 11 Mar 2010: I am rather surprised that you feel entitled to challenge the development of my own expectations. Surely I am allowed to write about them as a matter of fact, since I had them? So, no rewriting of history there. Intelligent people don't need to be told what they should think about these things, or pointed to particular studies as the only ones that should matter. They can read all of the studies and reach their own conclusions, including about which ones have been carried out in the best way. The score sheet was the only sensible way of summarising the pages-long table. People often ask me how the studies tally up, so tally them up I did.
Wilson, T., 11 Mar 2010: Philip is right, the studies are so different and so many intervening variables have not been taken into consideration, that to conclude that OA leads to increased citation is simply, in the Scottish court judgement, not proven. Coming to a personal opinion on an issue is not the same as demonstrating, scientifically, the probable truth of a hypothesis.
Jim, Mar 11, 2010: In the hierarchy of evidence a systematic review containing one or more well designed RCTs (randomized controlled trials) is classed as the highest form of evidence. Currently we appear to have one RCT which didn't find a positive OA citation effect, and then a variety of other studies, some better designed than others, some which found an OA effect, and some which didn't. Talk of a meta-analysis is, in my opinion, premature, and until then I'd have to back the RCT.
Glass, G. V., Mar 12, 2010: having dedicated 35 years of my efforts to meta analysis and 20 to OA, I can't resist a couple of quick observations. Holding up one set of methods (be they RCT or whatever) as the gold standard is inconsistent with decades of empirical work in meta analysis that shows that perfect studies and less than perfect studies seldom show important differences in results. If the question at hand concerns experimental intervention, then random assignment to groups may well be inferior as a matching technique to even an ex post facto matching of groups. Randomization is not the royal road to equivalence of groups; it's the road to probability statements about differences. Claims about the superiority of certain methods are empirical claims. They are not a priori dicta about what evidence can and can not be looked at.
Wilson, D., Update on Meta-Analysis of Studies on Open Access Impact Advantage, Open Access Archivangelism blog, March 17, 2010: "Meta-analysis could be put to good use in this area. It won't resolve the issue of whether the studies that Davis thinks are flawed are in fact flawed. It could explore the consistency in effect across these studies and whether the effect varies by the method used. Both would add to the debate on this issue."
Hooker, B., advantage, schmantage, Open Reading Frame blog, 18 April 2010: "The FUD merchants want to claim that, if no citation advantage exists, there is no point to Open Access: that unless OA papers are currently garnering more citations than their TA equivalents, current levels of access must be adequate; or that if OA papers, which presumably are read more, are not cited more, then OA must be a repository for the second rate. Hence the controversy: it's an easy way to obscure the debate, sending up a cloud of statistical argument like a fleeing cuttlefish squirting ink."

Added 17 Feb 2010 Giglia, E. (2010)
Più citazioni in Open Access? Panorama della letteratura con uno studio sull'Impact Factor delle riviste Open Access
E-LIS, 21 Jan 2010, also in CIBER 1999-2009, 2009 (Ledizioni), pp. 125-145
From the English abstract: This work aims to frame the international debate on the citation advantage of articles published in Open Access, then to present and discuss overall data on the Impact Factor of open access journals, the result of an original study conducted in the Journal Citation Reports (Thomson Reuters). The basic idea is to test the performance of OA journals according to the traditional bibliometric indicator, the Impact Factor, in order to test the hypothesis that unrestricted access may involve a greater number of citations and, therefore, also a good impact factor. The results seem to confirm this: 38.62% of the Open Access journals included in the "Journal Citation Reports" are positioned in the first five percentiles when ranked by Impact Factor. Using the Immediacy Index, the percentage is 37.16%, while for the new 5-year Impact Factor (which, however, applies to only 356 of the 479 titles) the percentage rises to 40.05%.

Added 17 Feb 2010 Davis, P. (2009)
Studies on access: a review
arXiv:0912.3953v1 [cs.DL], 20 Dec 2009
Brief abstract: A review of the empirical literature on access to scholarly information. This review focuses on surveys of authors, article download and citation analysis.

Added 17 Feb 2010 Ibanez, A., Larranaga, P. and Bielza, C. (2009)
Predicting citation count of Bioinformatics papers within four years of publication
Bioinformatics, 25 (24), 3303-3309, 15 December 2009
info:pmid/19819886 | info:doi/10.1093/bioinformatics/btp585
From the abstract: "The possibility of a journal having a tool capable of predicting the citation count of an article within the first few years after publication would pave the way for new assessment systems. Results: This article presents a new approach based on building several prediction models for the Bioinformatics journal. These models predict the citation count of an article within 4 years after publication (global models). To build these models, tokens found in the abstracts of Bioinformatics papers have been used as predictive features, along with other features like the journal sections and 2-week post-publication periods." Comment: Bioinformatics is not an open access journal, so these results are not based on data for open access papers, but they may have parallels with methods for predicting citations and impact based on usage of OA papers (e.g. Brody et al., 2005). Data on which the results are based can be found at the authors' site. Without access to the full paper it is not clear what predictive features are being applied to achieve the claimed successful results: "In these new models, the average success rate for predictions using the naive Bayes and logistic regression supervised classification methods was 89.4% and 91.5%, respectively, within the nine sections and for 4-year time horizon."

Added 10 Dec 2009 Rand, D. G. and Pfeiffer, T. (2009)
Systematic Differences in Impact across Publication Tracks at PNAS
PLoS ONE, 4(12): e8092, December 1, 2009
Investigates citation counts for the three different publication tracks of the Proceedings of the National Academy of Sciences (PNAS). Open access is used as a control factor in the analysis; "To empirically investigate the impact of papers published via each track, we inspect 2695 papers published between June 1, 2004 and April 26, 2005, covering PNAS Volume 101 Issue 22 through Volume 102 Issue 17. For each paper, we examine Thomson Reuters Web of Science citation data as of October 2006 and May 2009, as well as page-view counts as of October 2006. We also note the track through which each paper was published, the topic classification of each paper, the date of publication, and whether each article was published as open access and/or as part of a special feature." In quantifying the size of the OA effect to control, the paper found that "similar to previous observations, Open Access papers receive approximately 25% more citations than non-Open Access papers (Median 2006 [2009] citations: Open access = 12.5 [38], non-Open access = 10 [30]; 75% percentile 2006 [2009] citations: Open access = 21 [61], non-Open access = 17 [50])."

Added 10 Dec 2009 Soong, S. (2009)
Measuring Citation Advantages of Open Accessibility
D-Lib Magazine, 15 (11/12), December 2009
Short study of a small collection of papers deposited after publication in the institutional repository of the Hong Kong University of Science and Technology (HKUST). "A total of 50 archived journal articles that already have 10 or more citation counts in Scopus were randomly selected for inclusion in this study." Claims to present an "easy-to-follow framework for citation impact analysis of open accessibility. This framework allows for direct measurement and comparison of citation rates before and after journal articles are made openly available." The method compares the citation performance of the same article over time pre- and post-open access rather than, as other studies of OA impact, comparing open access papers with non-open access papers from the same source.
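Soong's before/after framework can be sketched as a comparison of an article's citation rate per year before and after it was made OA. The publication year, deposit year and citation counts below are hypothetical.

```python
# Sketch of a before/after OA citation-rate comparison for one article.

def citation_rate(citations_by_year, start, end):
    """Mean citations per year over the inclusive interval [start, end]."""
    years = range(start, end + 1)
    return sum(citations_by_year.get(y, 0) for y in years) / len(years)

# Hypothetical article: published 2002, deposited in the repository in 2006.
citations_by_year = {2003: 2, 2004: 3, 2005: 2, 2006: 4, 2007: 7, 2008: 8}
pre = citation_rate(citations_by_year, 2003, 2005)
post = citation_rate(citations_by_year, 2006, 2008)
print(f"pre-OA: {pre:.2f} citations/yr, post-OA: {post:.2f} citations/yr")
```

A caveat of any before/after design is that citation counts tend to change with article age regardless of access, so a rise after deposit needs a control for that underlying trend before it can be attributed to OA.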

Added 10 Dec 2009 Poor, N. (2009)
Global Citation Patterns of Open Access Communication Studies Journals: Pushing Beyond the Social Science Citation Index
International Journal of Communication, Vol. 3, 2009
From the abstract: Connectivity and citations, as used by a large number of scholars in different fields, are a common measure of the health of a discipline. This paper shows the citation patterns for a multinational sample of open access journals in Communication Studies. Their citations are similar to those of the main communication journals, but with more international citations. Differences in the citation patterns are attributable to the international nature of the sampled journals, not to their open access status. From the conclusion: The citation pattern of these open access journals is the same as that for non-open access journals, which is how it should be if open access journals are going to be of the same quality as more established, non-open access journals (recall that open access does require peer-review). The journals in the sample are not in a separate citation space, and they take part in the larger conversation of the field. As such, this indicates, to a certain extent, the health of these journals (they are not isolates in the citing direction), which, in turn, is a decent indicator for the health of the field.

Added 17 Feb 2010 Akre, O., Barone-Adesi, F., Pettersson, A., Pearce, N., Merletti, F., and Richiardi, L. (2009)
Differences in citation rates by country of origin for papers published in top-ranked medical journals: do they reflect inequalities in access to publication?
Journal of Epidemiology and Community Health, 24 Nov 2009
info:pmid/19934169 | info:doi/10.1136/jech.2009.088690
Investigates the connection between citations and access, but not open access, by country and income with reference to the country of publication. From the abstract: "Methods: We obtained the number of citations and the corresponding author's country for 4724 papers published between 1998 and 2002 in the British Medical Journal, the Lancet, Journal of the American Medical Association, New England Journal of Medicine. Countries were grouped according to the World Bank classification and geographic location: low-middle income countries, European high-income countries, non-European high-income countries, UK and USA. Conclusions: Papers from different countries published in the same journal have different citation rates."

Added 10 Dec 2009 Mertens, S. (2009)
Open Access: Unlimited Web Based Literature Searching
Dtsch Arztebl Int., 106(43): October 23, 2009, 710-712
Reviews the findings of most of the principal papers found in this bibliography

Added 15 July 2009 Kousha, K. and Abdoli, M. (2009)
The citation impact of Open Access Agricultural Research: a comparison between OA and Non-OA publications (pdf 12pp)
World Library And Information Congress: 75th IFLA General Conference and Council, 23-27 August 2009, Milan, Italy. Also in Online Information Review, Vol. 34, No. 5, 2010, 772-785 http://dx.doi.org/10.1108/14684521011084618
Blogged summary, Open Access enhances accessibility and citation impact, International Association of Agricultural Information Specialists, 13 July 2009: "The results showed that there is an obvious citation advantage for self-archived agriculture articles as compared to non-OA articles." - "results indicate that self-archived research articles published in the non-OA agriculture journals could attract nearly two times more citations than their non-OA counterparts."

Added 15 Oct 2009 Lariviere, V. and Gingras, Y. (2009)
The impact factor's Matthew effect: a natural experiment in bibliometrics
arXiv.org, arXiv:0908.3177v1 [physics.soc-ph], 21 Aug 2009, also in Journal of the American Society For Information Science And Technology, 61 (2): 424-427, February 2010
Makes no mention of open access impact, but presents some interesting parallel results on journal impact factors, in this case that publication in higher impact journals can result in higher citations for a given paper.

Added 15 Oct 2009 Asif-ul Haque and Ginsparg, P. (2009)
Positional Effects on Citation and Readership in arXiv
arXiv.org, arXiv:0907.4740v1 [cs.DL], 27 Jul 2009
in Journal of the American Society for Information Science and Technology, Vol. 60 No. 11, 2203 - 2218, published online: 22 Jul 2009

Comment on this paper:
Davis, P., Downloads, Citations, and Positional Effects in the arXiv, the scholarly kitchen blog, Jul 29, 2009: "Expanding and confirming an earlier study (Dietrich 2007, see also Dietrich 2008) on the positional effects in the arXiv"
Brooks, T., Timing and location count when announcing particle physics results, symmetry breaking blog, August 7, 2009: (links this paper with his own co-authored paper, Gentil-Beccot, Mele and Brooks) "So, by studying the systems that HEP uses to communicate (arXiv, SPIRES, and journals) we see that physicists in HEP are quite savvy about the communication tools they use and the ways those tools work. Researchers understand that work on arXiv is unpublished initially, but will eventually be peer-reviewed, and are willing to cite it during the interim period. They understand that the versions on arXiv are generally updated to match the final journal article, and prefer to use the consistent interface and access provided by arXiv. They understand that the arXiv daily listings are important and heavily read, thus they are willing to go to some lengths to submit papers right at 4:00:01 to get them on the top of the lists."

Added 15 Oct 2009 Greyson, D., Morgan, S., Hanley, G. and Wahyuni, D. (2009)
Open access archiving and article citations within health services and policy research
E-LIS, 14 Jul 2009, in Journal of the Canadian Health Libraries Association (JCHLA) / Journal de l'Association des bibliothèques de la santé du Canada (JABSC), 2009, vol. 30, no. 2, 51-58
From the abstract: This paper contributes to a growing body of research exploring the “OA advantage” by employing an article-level analysis comparing citation rates for articles drawn from the same, purposively selected journals. We used a two-stage analytic approach designed to test whether OA is associated with (1) the likelihood that an article is cited at all and (2) the total number of citations that an article receives, conditional on being cited at least once. Adjusting for potential confounders (number of authors, time since publication, journal, and article subject), we found that OA archived articles were 60% more likely to be cited at least once and, once cited, were cited 29% more than non-OA articles.
See also this poster (1pp) with the same title, E-LIS, 14 Jul 2009, in Canadian Health Libraries Association/ Association des bibliothèques de la santé du Canada (CHLA/ ABSC) Conference 2009, Winnipeg, Manitoba (Canada), May 30 - June 3, 2009
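Stripped of the regression-based confounder adjustments, the two-stage approach Greyson et al. describe reduces to two statistics per group: the share of articles cited at all, and the mean citation count among those cited. The records below are invented for illustration.

```python
# Sketch of a two-stage OA citation analysis:
# stage 1: probability an article is cited at least once;
# stage 2: mean citations conditional on being cited.

def two_stage(records):
    """records: list of (is_oa, citation_count). Returns per-group statistics."""
    stats = {}
    for group in (True, False):
        counts = [c for oa, c in records if oa == group]
        cited = [c for c in counts if c > 0]
        stats[group] = {
            "cited_at_all": len(cited) / len(counts),
            "mean_if_cited": sum(cited) / len(cited) if cited else 0.0,
        }
    return stats

# Invented sample: (is_oa, citations) for eight articles.
records = [(True, 5), (True, 0), (True, 9), (True, 3),
           (False, 0), (False, 2), (False, 0), (False, 4)]
stats = two_stage(records)
print("OA:    ", stats[True])
print("non-OA:", stats[False])
```

Separating the two stages matters because OA could raise the chance of being cited at all without changing citation counts among cited papers, or vice versa; the study reports effects at both stages (60% and 29% respectively).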

Added 15 Oct 2009 Joint, N. (2009)
The Antaeus column: does the “open access” advantage exist? A librarian's perspective
Library Review, Vol. 58, No. 7, 2009, 477-481
From the summary: Findings – The paper finds that many of the original arguments for the benefits of open access have fallen by the wayside; but that, in spite of this, there is good evidence that an “open access advantage” does exist. The application of straightforward library statistical counting measures which are traditionally used to evaluate user benefits of mainstream services is just as effective an evaluation tool as more sophisticated citation analysis methods.

Added 15 July 2009 Gentil-Beccot, A., Mele, S., Brooks, T. (2009)
Citing and Reading Behaviours in High-Energy Physics. How a Community Stopped Worrying about Journals and Learned to Love Repositories
arXiv.org, arXiv:0906.5418v1 [cs.DL], v1, 30 Jun 2009
From the abstract: The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

Comment on this paper:
Hodgkinson, M., RE: Article on arXiv, liblicense mail list, 3 July 2009: "There is a blip upwards in citations of about 0.15 citations per article per month immediately upon publication, which does indicate some remaining effect of journal publication, but this increase is gone within 12 months."
Harnad, S., OA in High Energy Physics Arxiv Yields Five-Fold Citation Advantage, Open Access Archivangelism blog, July 12, 2009: "This is an important study, and most of its conclusions are valid (with caveats) ... From the fact that when there is a Green OA version available, users prefer to consult that Green OA version rather than the journal version, it definitely does not follow that journals are no longer necessary."

Added 15 July 2009 Lansingh, V. C. and Carter, M. J. (2009)
Does Open Access in Ophthalmology Affect How Articles are Subsequently Cited in Research? (abstract only, subscription required)
Ophthalmology, 116(8):1425-1431, August 2009, available online 22 June 2009
From the abstract: Examination of 480 articles in ophthalmology in the experimental protocol and 415 articles in the control protocol. ... Four subject areas were chosen to search the ophthalmology literature in the PubMed database ... Searching started in December of 2003 and worked back in time to the beginning of the year. The number of subsequent citations for equal numbers of both open access (OA) and closed access (CA) (by subscription) articles was quantified using the Scopus database and Google search engine. A control protocol was also carried out to ascertain that the sampling method was not systematically biased by matching 6 ophthalmology journals (3 OA, 3 CA) using their impact factors, and employing the same search methodology to sample OA and CA articles. The total number of citations was significantly higher for open access articles compared to closed access articles for Scopus. However, univariate general linear model (GLM) analysis showed that access was not a significant factor that explained the citation data. Author number, country/region of publication, subject area, language, and funding were the variables that had the most effect and were statistically significant. Control protocol results showed no significant difference between open and closed access articles in regard to number of citations found by Scopus ... Unlike other fields of science, open access thus far has not affected how ophthalmology articles are cited in the literature.

Comment on this paper:
Davis, P., Open Access Not the Focus in Ophthalmology, Scholarly Kitchen blog, Jul 14, 2009: "this article builds upon a growing wealth of evidence suggesting that the so-called "open access citation advantage" is merely a spurious relationship. It should serve as a gentle reminder that correlation does not equal causation."

Added 15 Oct 2009 Ostrowska, A. (2009)
Open Access Journals Quality – How to Measure It?
INFORUM 2009: 15th Conference on Professional Information Resources, Prague, May 27-29, 2009

Added 15 July 2009 Lin, S.-K. (2009)
Full Open Access Journals Have Increased Impact Factors (editorial)
Molecules, 2009, 14(6):2254-2255

Added 15 July 2009 Mukherjee, B. (2009)
The hyperlinking pattern of open-access journals in library and information science: A cited citing reference study
Library & Information Science Research, 31 (2), April 2009, 113-125
This paper appears to be another take on this study

Added 29 April 2009 Tiwari, A. (2009)
Citation Trend Line For PLoS Journals
Fisheye Perspective blog, April 25, 2009
A short illustrated blog post on predicting the impact of a new journal. The author, a bioscientist, evaluates two PLoS (OA) journals using Scopus Journal Analyzer. Using the service's Trend Line and % Not Cited parameters the author predicts that one, a new journal that doesn't yet have an official impact factor, will soon rival the other, which does: "I am sure it's impact factor (or quality or what ever you love) is going to be same or may be much more." Does not claim to be statistically sound.

Added 29 April 2009 Gargouri, Y. and Harnad, S. (2009)
Logistic regression of potential explanatory variables on citation counts
Preprint 11/04/2009
Logistic regression analysis on the correlation between citation counts (as dependent variable) and a set of potential correlator/predictor variables.
Result: Published journal papers that are self-archived in institutional repositories - in this study the repositories mandate deposit, obviating the self-selection bias postulated by some to be a factor in self-archiving - can achieve a citation advantage whether published in journals of high or low impact factor (IF): "Overall, OA is correlated with a significant citation advantage for all journal IF intervals".
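As an illustration of the kind of logistic regression used here, the sketch below fits a one-variable model (an OA flag predicting a binary "highly cited" outcome) by gradient descent. The data, the binary threshold, and the single predictor are all invented for illustration; the study itself used citation counts and a set of explanatory variables:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """One-feature logistic regression fitted by gradient descent.
    xs: feature values (e.g. 1 if OA, 0 if not); ys: binary outcomes."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Invented data: OA flag -> "highly cited" indicator.
xs = [1, 1, 1, 1, 0, 0, 0, 0]
ys = [1, 1, 1, 0, 1, 0, 0, 0]
w, b = fit_logistic(xs, ys)
```

A positive fitted coefficient `w` corresponds to OA raising the odds of the outcome, which is the shape of result the preprint reports.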

Added 29 April 2009 Watson, A. B. (2009)
Comparing citations and downloads for individual articles
Journal of Vision, April 3, 2009 Volume 9, Number 4, Editorial i, Pages 1-4
Measures the correlation between download and citation counts for articles in Journal of Vision: "Download statistics provide a useful indicator, two years in advance, of eventual citations."
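A download–citation relationship of this kind is typically summarised with a Pearson correlation coefficient. A minimal sketch, with invented counts:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented: early download counts vs later citation counts per article.
downloads = [120, 300, 80, 500, 210]
citations = [4, 11, 2, 18, 7]
r = pearson_r(downloads, citations)
```

A value of `r` near 1 indicates that early downloads track eventual citations closely, which is what makes downloads useful as a leading indicator.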

Added 29 April 2009 Bollen, J., Van de Sompel, H., Hagberg, A., Bettencourt, L., Chute, R., Rodriguez, M. A. and Balakireva, L. (2009)
Clickstream Data Yields High-Resolution Maps of Science
PLoS ONE, 4(3): e4803, March 11, 2009
See also Nature news article on this paper, 9 March 2009: "A striking difference in the usage maps is that journals in the humanities and social sciences figure much more prominently than in citation-based maps. The difference partly arises because Bollen's study covers a wider literature than the citation databases, which are biased towards natural sciences journals. "By including practitioners we capture a much wider sample of the scholarly community," adds Bollen. Usage maps are also more up to date than citation ones because the inherent delay in publication means it takes at least two years before a paper will start to gather citations in sufficient numbers to be meaningful. Anthony van Raan argues that this more current view may in fact represent today's "fashions", rather than trends that will endure."
These findings are not based on OA journals or papers, but highlight the emerging value of clicks, or hits, as possible contributory factors for online impact metrics.

Comment on this paper:
Davis, P., Usage Map of Science, the scholarly kitchen blog, Mar 16, 2009: "While the mapping of journal relationships using citations is several decades old, using clickstream data makes this article novel."

Added 15 July 2009 Bernius, S. and Hanauske, M. (2009)
Open Access to Scientific Literature - Increasing Citations as an Incentive for Authors to Make Their Publications Freely Accessible
Institute for Information Systems, Frankfurt University, publications 2009, in 42nd Hawaii International Conference on System Sciences (HICSS '09), 5-8 Jan. 2009, pp. 1-9 http://dx.doi.org/10.1109/HICSS.2009.335
A summary of the results also appears in
Bernius, S., Hanauske, M., König, W. and Dugall, B. (2009)
Open Access Models and their Implications for the Players on the Scientific Publishing Market (see section 1.2)
Economic Analysis and Policy Journal, Vol. 39, No. 1, March 2009

Added 29 April 2009 Åström, F. (2009)
Citation patterns in open access journals
OpenAccess.se and the National Library of Sweden, February 25, 2009.
"Fewer analyses have investigated whether OA and non-OA journals in the same research fields are citing the same literature; and to what extent this reflects whether it is the same kind (and thus comparable) research that is published in the two forms of scholarly publications. ... The citation structures in the journals were analysed through MDS maps building on co-citation analyses, as well as a more thorough comparison investigating overlaps of cited authors and journals between the different journals. ... The results of the analyses suggests that it is hard to draw any overall conclusions on the matter of whether research published in OA journals is likely to have a larger citation impact or not."
This conclusion is unsurprising since the study did not measure impact but mapped citation patterns between journals. It is suggested that these mappings could improve understanding when comparing the impact of OA and non-OA journals.

Added 29 April 2009 Gaulé, P. (2009)
Access to the scientific literature in India
CEMI Working Paper 2009-004, February 23, 2009, in Journal of the American Society for Information Science and Technology, Vol. 60, Issue 12, 2548-2553, published online: 8 Oct 2009
Abstract: This paper uses an evidence-based approach to assess the difficulties faced by developing country scientists in accessing the scientific literature. I compare backward citations patterns of Swiss and Indian scientists in a database of 43'150 scientific papers published by scientists from either country in 2007. Controlling for fields and quality with citing journal fixed effects, I find that Indian scientists (1) have shorter references lists (2) are more likely to cite articles from open access journals and (3) are less likely to cite articles from expensive journals. The magnitude of the effects is small which can be explained by informal file sharing practices among scientists.

Comment on this paper:
Davis, P., No Journal Access? Email the Author, Colleague, the scholarly kitchen blog, Nov 18, 2009: Davis considers how lack of access to science publications affects research in developing countries, focussing on this paper by Gaulé, and briefly considering two other papers (Frandsen, Evans and Reimer) on the topic.

Added 26 February 2009 Evans, J. A. and Reimer, J. (2009)
Open Access and Global Participation in Science (full text requires subscription; summary only)
Science, Vol. 323, No. 5917, 20 February 2009, 1025
From the paper: "The influence of OA is more modest than many have proposed, at c.8% for recently published research, but our work provides clear support for its ability to widen the global circle of those who can participate in science and benefit from it."
Listen to Science podcast interview with James Evans
See also articles on this paper:
Dolgin, E., Online access = more citations, The Scientist, 19th February 2009 (free registration required): "When the authors looked just at poorer countries, however, they found that the influence of open access was more than twice as strong. For example, in Bulgaria and Chile, researchers cited nearly 20% more open access articles, and in Turkey and Brazil, the number of citations rose by more than 25%. Free online availability "is not a huge driver of science in the first world, but it shapes parts of science in the rest of world," Evans told The Scientist."
Xie, Y., Open, electronic access to research crucial for global reach, ars technica, February 19, 2009

Comment on this paper:
Harnad, S., Open Access Benefits for the Developed and Developing World: The Harvards and the Have-Nots, Open Access Archivangelism blog, Feb 19, 2009: "Evans & Reimer's study (E & R) is particularly timely and useful. It shows that a large portion of the Open Access citation impact advantage comes from providing the developing world with access to the research produced by the developed world. Using a much bigger database, E & R refute (without citing!) a recent flawed study (Frandsen 2009) that reported that there was no such effect ... Last, there is the question of the effect of access embargoes. It is important to note that E & R's results are not based on immediate OA but on free access after an embargo of up to a year or more. Theirs is hence not an estimate of the increase in citation impact that results from immediate Open Access; it is just the increase that results from ending Embargoed Access."
See also this blog on the 'silly spin' of the NSF press release on the paper and on the 'welter of misunderstandings' of a follow-up blog based on the press release in the Chronicle of Higher Education.
Davis, P., Open Access and Global Participation in Science, the scholarly kitchen blog, Feb 19, 2009: "Advocates for open access will see this article as supporting their cause. But those who spend time reading the methodology will notice that message is not as clear as the article implies."
Eisen, M., Letter to the editor in Science, it is NOT junk blog, July 17, 2009: "the 8% statistic that Evans and Reimer highlight is misleading. The authors' supporting online material (figure S1C) clearly shows that the impact of free access on citations is heavily dependent on the age of the article at the time free access was provided. In particular, when articles were made freely available within 2 years of publication, their citations increased by almost 20%. This far more dramatic effect is the one scientists and journals should consider when deciding when to provide free access. If this decision is to be made purely on the basis of citation impact, the upward trend of the curve in figure S1C argues strongly in favor of minimal delays."
Four Letters on this paper are published in the Science issue of 17 July 2009, Vol 325, issue 5938, including this one by Eisen (and Salzberg), as well as others by Davis ('Increased Citations Not Guaranteed'), Burdett ('The Self-Selection Effect'), and Alberts ('The Sooner the Better'), with a response by the original authors.

Added 26 February 2009 Bollen, J., Van de Sompel, H., Hagberg, A. and Chute, R. (2009)
A principal component analysis of 39 scientific impact measures
arXiv.org, arXiv:0902.2183v1 [cs.CY], 12 Feb. 2009, in PLoS ONE 4(6): e6022, http://dx.doi.org/10.1371/journal.pone.0006022
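Principal component analysis reduces many correlated impact measures to a few underlying dimensions. The sketch below does this for just two invented measures, using the closed-form eigen decomposition of a 2x2 covariance matrix (the paper itself analysed 39 measures):

```python
import math

def first_principal_component(data):
    """First principal component of two correlated measures.
    data: list of (a, b) pairs, e.g. two impact scores per journal.
    Returns (largest eigenvalue, unnormalised eigenvector)."""
    n = len(data)
    ma = sum(a for a, _ in data) / n
    mb = sum(b for _, b in data) / n
    saa = sum((a - ma) ** 2 for a, _ in data) / n
    sbb = sum((b - mb) ** 2 for _, b in data) / n
    sab = sum((a - ma) * (b - mb) for a, b in data) / n
    # Largest eigenvalue of the covariance matrix [[saa, sab], [sab, sbb]]
    tr, det = saa + sbb, saa * sbb - sab ** 2
    lam = tr / 2 + math.sqrt((tr / 2) ** 2 - det)
    # Corresponding eigenvector (unnormalised): (sab, lam - saa)
    return lam, (sab, lam - saa)

# Invented data: (citation-based score, usage-based score) per journal.
scores = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.0)]
lam, vec = first_principal_component(scores)
```

When the two measures are strongly correlated, as here, the first component captures almost all of the variance; the paper's finding is essentially about how the 39 real measures group along such shared dimensions.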

Comment on this paper:
Davis, P., Scientific Impact Measures Compared, the scholarly kitchen blog, Feb 17, 2009: "While this manuscript represents phenomenal empirical work, 'scientific impact' on philosophical grounds will always remain a complex construct; and because of its complexity, it will resist a single measure. We may all agree for practical purposes that it be redefined with a new counting tool. But that new tool is simply a different view of an enormous and complex beast."

Added 29 April 2009 Castillo, M. (2009)
Citations and Open Access: Questionable Benefits
American Journal of Neuroradiology, February 2009
An editorial.
(Ed. Included for the record, but I cannot say what this author finds questionable about OA, as I cannot access any part of the paper, not even an abstract.)

Added 26 February 2009 Norris, M. (2009)
The citation advantage of open access articles
PhD thesis, Loughborough University Institutional Repository, 2009-01-15
Michael Norris has been named Highly Commended Award winner of the 2008 Emerald/EFMD Outstanding Doctoral Research Award in the Information Science category for this doctoral thesis.
Two published papers (JASIST, ElPub) are based on this work.

Added 15 July 2009 Frandsen, T. F. (2009)
The effects of open access on un-published documents: A case study of economics working papers
HAL: hprints-00352359, version 2, 12 January 2009, Journal of Informetrics (2009) in press

Added 13 January 2009 O'Leary, D. E. (2008)
The relationship between citations and number of downloads
Decision Support Systems, Vol. 45, No. 4, November 2008, 972-980, available online 11 April 2008 (full text requires subscription; abstract only)
Broadly agrees with earlier findings (e.g. Brody et al.) about the correlation - 'strong positive statistically significant relationship' - between downloads and citations for digital papers, notably for the most-downloaded, 'top' papers, in this case based on data for a single, focussed source, the journal Decision Support Systems.

Added 29 April 2009 Mukherjee, B. (2008)
Do open-access journals in library and information science have any scholarly impact? A bibliometric study of selected open-access journals using Google Scholar (full text requires subscription; abstract only)
Journal of the American Society for Information Science and Technology, Vol. 60, No. 3, March 2009, 581-594, published online: 16 Dec 2008
From the abstract: "Using 17 fully open-access journals published uninterruptedly during 2000 to 2004 in the field of library and information science, the present study investigates the impact of these open-access journals in terms of quantity of articles published, subject distribution of the articles, synchronous and diachronous impact factor, immediacy index, and journals' and authors' self-citation."
The paper does not appear to reveal any comparative findings (OA vs non-OA).

Added 24 November 2008 Tenopir, C. and King, D. W. (2008)
Electronic Journals and Changes in Scholarly Article Seeking and Reading Patterns
D-Lib Magazine, Vol. 14 No. 11/12, November/December 2008
From the abstract: "Reading patterns and citation patterns differ, as faculty read many more articles than they ultimately cite and read for many purposes in addition to research and writing. The number of articles read has steadily increased over the last three decades, so the actual numbers of articles found by browsing has not decreased much, even though the percentage of readings found by searching has increased. Readings from library-provided electronic journals has increased substantially, while readings of older articles have recently increased somewhat. Ironically, reading patterns have broadened with electronic journals at the same time citing patterns have narrowed."

Comment on this paper:
Harnad, S., Open Access Allows All the Cream to Rise to the Top, Open Access Archivangelism blog, 19 November 2008: "confirmation of the finding (of Kurtz and others [Evans, Lariviere et al.]) -- that as more articles become accessible, more articles are indeed accessed (and read), but fewer articles are cited (and those are cited more) -- is best explained by the increased selectivity made possible by that increased accessibility ... Open Access (OA) allows all the cream to rise to the top; accessibility is no longer a constraint on what to cite. (One of the reasons the top articles are more likely to be made OA is precisely that they are also more likely to be cited more if they are made OA!)"
Davis, P., Scientists Reading More, Citing Less, the scholarly kitchen blog, 24 November 2008: "What is clear from their data is that scientists are reading more articles from more journals. On face value, this would seem to imply that scientists would be citing a greater diversity of articles. On the other hand, Tenopir and King also report that scientists are using searching tools, citations, and people more often to help them decide what to read. This would imply that scientists are becoming more focused in their information seeking behavior, which would lead to a reduction of citation diversity."

Added 24 November 2008 Gaule, P. and Maystre, N. (2008)
Getting cited: does open access help?
Ecole Polytechnique Fédérale de Lausanne, CEMI-WORKINGPAPER-2008-007, November 2008
also available from RePEc http://ideas.repec.org/p/cmi/wpaper/cemi-workingpaper-2008-007.html
Explains the 'widely held belief' that free availability of scientific articles increases the number of citations they receive thus: "Since open access is relatively more attractive to authors of higher quality papers, regressing citations on open access and other controls yields upward-biased estimates." Findings are based on a sample of 4388 biology papers published between May 2004 and March 2006 by Proceedings of the National Academy of Sciences (PNAS). "Using an instrumental variable approach, we find no significant effect of open access. Instead, self-selection of higher quality articles into open access explains at least part of the observed open access citation advantage." Note that OA in PNAS is itself self-selected by virtue of the journal's charging structure (i.e. authors must pay a fee for the published article to be OA; articles whose authors do not pay remain non-OA).
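The instrumental-variable idea is to find a variable that shifts OA status but is otherwise unrelated to citations, so that self-selection is stripped out of the estimate. In the single-instrument case this reduces to the textbook Wald estimator, sketched below; all data and variable names are invented, and this is not the paper's actual specification:

```python
def iv_estimate(z, x, y):
    """Single-instrument IV (Wald) estimator: cov(z, y) / cov(z, x).
    z: instrument (e.g. an indicator that shifts OA uptake),
    x: treatment (OA flag), y: outcome (citation count)."""
    n = len(z)
    mz, mx, my = sum(z) / n, sum(x) / n, sum(y) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y))
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x))
    return cov_zy / cov_zx

# Invented data: instrument, OA status, citations per paper.
z = [1, 1, 1, 1, 0, 0, 0, 0]
x = [1, 1, 0, 1, 1, 0, 0, 0]
y = [5, 6, 4, 5, 7, 3, 4, 3]
beta_iv = iv_estimate(z, x, y)
```

Comparing this estimate against a naive regression of citations on OA status is how the authors separate a causal OA effect from the self-selection of stronger papers into OA.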

Comment on this paper:
Davis, P., Should You Pay To Get Cited?, the scholarly kitchen blog, 18 November 2008: "This study adds to a growing literature that casts doubt on early research suggesting that free access to the scientific literature leads to increased citations. Self-selection, whereby higher quality articles are made freely available, is beginning to seem a much more plausible explanation."

Added 26 February 2009 Frandsen, T. F. (2008)
Attracted to open access journals: a bibliometric author analysis in the field of biology
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00328270, version 1, 10 October 2008
also in Journal of Documentation, January 2009 http://www.emeraldinsight.com/Insight/viewContentItem.do?contentType=Article&contentId=1766883

Comment on this paper:
Davis, P., Open Access: No Benefit for Poor Scientists, the scholarly kitchen blog, Jan 14, 2009: "Open Access has a moral agenda: to increase the flow of scientific information to researchers in developing nations. Yet a new study suggests that authors in developing countries are no more likely to write papers for Open Access journals and are no more likely to cite Open Access articles." Appended to Davis' blog, Gaule, P., 14 Jan. 2009: "Ms Frandsen's conclusion that 'authors from developing countries do not cite open access more than authors from developed countries' is not based on solid evidence. While she reports the p-value and not the standard errors, it is clear from her regression results that she cannot statistically rule out the possibility that authors from developing countries may be more likely to cite open access journals."
Harnad, S., Comparing OA/non-OA in Developing Countries, Open Access Archivangelism blog, January 14, 2009: "The Frandsen study focused on OA journals, not on OA articles. It is problematic to compare OA and non-OA journals, because journals differ in quality and content, and OA journals tend to be newer and fewer than non-OA journals (and often not at the top of the quality hierarchy). In contrast, most studies that have compared OA and non-OA articles within the same journal and year have found a significant citation advantage for OA. It is highly unlikely that this is only a developed-world effect; indeed it is almost certain that a goodly portion of OA's enhanced access, usage and impact comes from developing-world users."
Chan, L., Comparing OA/non-OA in Developing Countries, Open Access Archivangelism blog, January 14, 2009: "The Frandsen study focuses on biology journals and I am not sure what percentage of them are available to DC researchers through HINARI/AGORA. This would explain why researchers in this area would not need to rely on OA materials as much. Citation behaviour is complex indeed and more studies on OA's impact in the developing world are clearly needed. Davis's eagerness to pronounce that there is "No Benefit for Poor Scientists" based on one study is highly premature."

Added 26 February 2009 Frandsen, T. F. (2008)
The integration of open access journals in the scholarly communication system: Three science fields
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00326285, version 1, 2 October 2008
also in Information Processing & Management, January 2009 http://dx.doi.org/10.1016/j.ipm.2008.06.001
From the abstract: "This study is an analysis of the citing behaviour in (open access) journals within three science fields: biology, mathematics, and pharmacy and pharmacology. The integration of OAJs in the scholarly communication system varies considerably across fields. The implications for bibliometric research are discussed."

Added 13 January 2009 De Groote, S. L (2008)
Citation patterns of online and print journals in the digital age
J. Med. Libr. Assoc., 2008 October; 96(4): 362-369
From the abstract: "Journals available in electronic format were cited more frequently in publications from the campus whose library had a small print collection, and the citation of journals available in both print and electronic formats generally increased over the years studied."

Added 10 November 2008 Kousha, K. (2008)
Characteristics of Open Access Web Citation Network: A Multidisciplinary Study
Proceedings of WIS 2008, Fourth International Conference on Webometrics, Informetrics and Scientometrics & Ninth COLLNET Meeting (Berlin, 28 July - 1 August 2008), edited by H. Kretschmer and F. Havemann, October 2008

Added 10 November 2008, updated 29 April 2009 Lariviere, V., Gingras, Y. and Archambault, E. (2008)
The decline in the concentration of citations, 1900-2007
arXiv.org, arXiv:0809.5250v1 [physics.soc-ph], 30 Sep 2008 and in Journal of the American Society for Information Science and Technology, Vol. 60, No. 4, April 2009, 858-862, published online: 29 Jan 2009
From the abstract: "This paper challenges recent research (Evans, 2008) reporting that the concentration of cited scientific literature increases with the online availability of articles and journals. ... contrary to what was reported by Evans, the dispersion of citations is actually increasing."

Comment on this paper:
Davis, P., Citation Controversy, the scholarly kitchen blog, 13 October 2008: "a new analysis taking aim at (Evans') diversity claim. ... Unlike Evan's article, this paper does not require knowledge of negative binomial regression, or any advanced statistics for that matter; and because of the simplicity and descriptive approach to their analysis, it is very convincing. Granted, Evans is using a different approach, looking at the effect of when journals became available online on citation behavior and whether commercial access or free access changes the outcome. For that reason, we should not discount the merit of Evan's paper. ... What makes this controversy interesting is that both studies make theoretical sense. Like many scientific controversies, the argument over citation diversity will move toward consensus and closure. For the meantime, the debate remains open."

Added 25 September 2008 Davis, P. M. (2008)
Author-choice open access publishing in the biological and medical literature: a citation analysis
arXiv.org, arXiv:0808.2428v1 [cs.DL], 18 Aug 2008, in Journal of the American Society for Information Science & Technology, Vol. 60, No. 1, January 2009, 3-8, published online: 25 Sep 2008
This study is a follow-up to the controlled trial of open access publishing published in the BMJ: "According to a study of 11 biological and medical journals that allow authors the choice of making their articles freely available from the publisher's website, few show any evidence of a citation advantage. For those that do, the effect appears to be diminishing over time. ... (the paper) analyzed over eleven thousand articles published in journals since 2003, sixteen hundred of these articles (15%) adopting the author-choice open access model."

Comment on this paper:
Harnad, S., Confirmation Bias and the Open Access Advantage: Some Methodological Suggestions for Davis's Citation Study, Open Access Archivangelism blog, August 25, 2008: "The outcome, confirming previous studies (on both paid and unpaid OA), is a significant OA citation advantage, but a small one (21%, 4% of it correlated with other article variables such as number of authors, references, and pages). The author infers that the size of the OA advantage in this biomedical sample has been shrinking annually from 2004-2007, but the data suggest the opposite." (This blog entry includes a line-by-line commentary on the full paper) For the author's replies to these points, see the responses to this blog.

Added 25 September 2008 Clauson, K. A., Veronin, M. A., Khanfar, N. M. and Lou, J. Q. (2008)
Open-access publishing for pharmacy-focused journals (full text requires subscription; summary only)
American Journal of Health-System Pharmacy, Vol. 65, No. 16, 1539-1544, August 15, 2008
From the conclusion: A very small number of pharmacy-focused journals adhere to the OA paradigm of access. However, journals that adopt some elements of the OA model, chiefly free accessibility, may be more likely to be cited than traditional journals. Pharmacy practitioners, educators, and researchers could benefit from the advantages that OA offers but should understand its financial disadvantages.
The same issue has an editorial by C. Richard Talley, Open-access publishing: why not? (accessible only to subscribers)

Comment on this editorial:
Suber, P., OA pharma journals cited more than non-OA pharma journals, Open Access News, 12 August 2008: "(Talley) acknowledges that OA to pharma journals "might indirectly improve public health", but tries to explain why AJHP is not OA. He starts with a potshot at the article by Clauson et al., which he just agreed to publish, by asserting flatly that "Davis et al. found that open-access publishing does not increase article citations." He doesn't attempt to reconcile the Clauson and Davis studies, and doesn't mention that the Davis study found no short-term citation increase while (link to this bibliography) dozens of previous studies have found long-term citation increases."

Added 10 November 2008 Henneken, E. A., Kurtz, M. J., Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2008)
Use of Astronomical Literature - A Report on Usage Patterns
arXiv.org, arXiv:0808.0103v1 [cs.DL], 1 Aug 2008
in Journal of Informetrics, Vol. 3, Issue 1, 1-90 (January 2009)

Added 4 August 2008 Davis, P.M., Lewenstein, B. V., Simon, D. H., Booth, J. G. and Connolly, M. J. L. (2008)
Open access publishing, article downloads, and citations: randomised controlled trial
BMJ, 2008;337:a568, published 31 July 2008

Comment on this paper:
Godlee, F., Open access to research (editorial), BMJ, 2008;337:a1051, 31 July 2008: "This week the BMJ publishes a paper that has nothing directly to do with medicine or health care. It does, however, have everything to do with access to research results, a topic that should interest authors and readers in any field. ... questions that reach to the very heart of the way in which scientists and clinicians communicate. ... questions also reach to the heart of how academics are acknowledged and rewarded."
Anderson, K. (2008), Open Access Doesn't Drive Citations, The Scholarly Kitchen blog, July 31, 2008: "First, a caveat. Phil Davis is a fellow blogger on this site. This study has important advantages over prior studies. It was randomized by the researchers, so authors or publishers didn't select which articles were made open access. It was done prospectively, so that the data were analyzed from point zero forward. These are crucial advantages in study design, in my opinion, and place this study head and shoulders above any prior study asserting a citation advantage. In fact, authors and editors often make studies they think are more significant free or push them online early. Retrospective, non-randomized studies fall prey to all sorts of problems because of these confounding effects."
Harnad, S. (2008), Davis et al's 1-year Study of Self-Selection Bias: No Self-Archiving Control, No OA Effect, No Conclusion, Open Access Archivangelism blog, July 31, 2008: "To show that the OA advantage is an artefact of self-selection bias (or of any other factor), you first have to produce the OA advantage and then show that it is eliminated by eliminating self-selection bias (or any other artefact). This is not what Davis et al. did. They simply showed that they could detect no OA advantage one year after publication in their sample. This is not surprising, since most other studies, some based on hundreds of thousands of articles, don't detect an OA advantage one year after publication either. It is too early. To draw any conclusions at all from such a 1-year study, the authors would have had to do a control condition, in which they managed to find a sufficient number of self-selected, self-archived OA articles (from the same journals, for the same year) that do show the OA advantage, whereas their randomized OA articles do not. In the absence of that control condition, the finding that no OA advantage is detected in the first year for this particular sample of 247 out of 1619 articles in 11 physiological journals is completely uninformative. The authors did find a download advantage within the first year, as other studies have found. This early download advantage for OA articles has also been found to be correlated with a citation advantage 18 months or more later. The authors try to argue that this correlation would not hold in their case, but they give no evidence (because they hurried to publish their study, originally intended to run four years, three years too early.)"
Eysenbach, G. (2008), Word is still out: Publication was premature, BMJ Rapid Responses, 1 August 2008: "While parts of the paper will be welcomed by most Open Access advocates as far as the access/usage data are concerned, showing (unsurprisingly) a significant increase in access and use of Open Access articles compared to non-OA articles, other parts of the paper are more controversial (to be diplomatic). Davis et al failed to show a citation advantage after 9-12 months, from which they conclude that "the citation advantage from open access reported widely in the literature may be an artifact of other causes." Jumping to these conclusions after only 9-12 months is quite irritating ... What surprised (and bothered) me especially is that Davis et al. cites my PNAS data published in PLoS Biology to justify his short observation period. While I indeed found already a statistically significant difference after only 4-10 months (1.2 citations in the nOA group versus 1.5 citations in the OA group), most citations appeared after 10-16 months (4.5 versus 6.4) and 16-22 months (8.9 vs 13.1). In summary, this would have been an important and much more credible paper if it would have been published in 2-3 years as opposed to a salami approach."
More Rapid Responses to this paper

Harnad, S., On Eggs and Citations, Open Access Archivangelism blog, August 29, 2008: "Failing to observe a platypus laying eggs is not a demonstration that the platypus does not lay eggs. ... Failing to observe a significant OA citation Advantage within a year of publication (or a year and a half -- or longer, as the case may be) with randomized OA does not demonstrate that the many studies that do observe a significant OA citation Advantage with nonrandomized OA are simply reporting self-selection artifacts (i.e., selective provision of OA for the more highly citable articles.) ... The many reports of the nonrandomized OA Citation Advantage are based on samples that were sufficiently large, and on a sufficiently long time-scale (almost never as short as a year) to detect a significant OA Citation Advantage. A failure to observe a significant effect with small, early samples, on short time-scales -- whether randomized or nonrandomized -- is simply that: a failure to observe a significant effect: Keep testing till the size and duration of your sample of randomized and nonrandomized OA is big enough to test your self-selection hypothesis (i.e., comparable with the other studies that have detected the effect)."

Added 24 November 2008 Levitt, J. M. and Thelwall, M. (2008)
Patterns of annual citation of highly cited articles and the prediction of their citation ranking: A comparison across subjects (full text requires subscription; abstract only)
Scientometrics, Vol. 77, No. 1 (2008) 41-46, published online: 24 July 2008
From the abstract: "For four of the six subjects, there is a correlation of over 0.42 between the percentage of early citations and total citation ranking but more highly ranked articles had a lower percentage of early citations. Surprisingly, for highly cited articles in all six subjects the prediction of citation ranking from the sum of citations during their first six years was less accurate than prediction using the sum of the citations for only the fifth and sixth year."
Open access is not a factor here: the highly cited subject articles investigated date from 1969-71. For open access papers, Brody et al. (2005) showed that impact can be predicted from much earlier data, i.e. download data for OA papers, before any citations accrue.

Added 28 July 2008 Evans, J. A. (2008)
Electronic Publication and the Narrowing of Science and Scholarship (full text requires subscription; abstract only)
Science, Vol. 321, No. 5887, 18 July 2008, 395-399
From the abstract: "Using a database of 34 million articles, their citations (1945 to 2005), and online availability (1998 to 2005), I show that as more journal issues came online, the articles referenced tended to be more recent, fewer journals and articles were cited, and more of those citations were to fewer journals and articles. The forced browsing of print archives may have stretched scientists and scholars to anchor findings deeply into past and present scholarship. Searching online is more efficient and following hyperlinks quickly puts researchers in touch with prevailing opinion, but this may accelerate consensus and narrow the range of findings and ideas built upon."
This work was funded by the National Science Foundation. See the NSF press release and video interview with James Evans
See also the news feature Great minds think (too much) alike, The Economist, July 17th 2008

Comment on this paper:
Davis, P. (2008), The Paradox of Online Journals, The Scholarly Kitchen blog, July 18, 2008: "This does not mean that total citations are dropping, only that fewer articles get cited more. While it is tempting to eulogize the end of scholarship, this article may signify that the dissemination of science is working more efficiently than ever. The institution of science values the progress of discovery followed by a consensus and closure of debate. That more of the literature is effectively being ignored may not necessarily signal a bad thing (although it may concern those who are not read)."
Hooker, Bill (2008), An Open Access partisan's view, Open Reading Frame blog, 19 July 2008: "Evans seems to me to gloss over the question of what proportion of the online archives are freely available, and what effect that has on the phenomenon he is attempting to model. ... the driving force in Evans' suggested "narrow[ing of] the range of findings and ideas built upon" is not online access per se but in fact commercial access, with its attendant question of who can afford to read what. Evans' own data indicate that if the online access in question is free of charge, the apparent narrowing effect is significantly reduced or even reversed."
For collected comments on this paper, see Electronic Publication and the Narrowing of Science and Scholarship. Really? A Blog Around the Clock, July 19, 2008
Suber, P. (2008), Online researchers have access to more articles but cite fewer, Open Access News, July 17, 2008: "Evans' results also appear to conflict with a recent study by Arthur Eger."
Harnad, S. (2008), Are Online and Free Online Access Broadening or Narrowing Research? Open Access Archivangelism blog, August 4, 2008: "In one of the few fields where [the narrowing citations effect found by Evans] can be and has been analyzed thoroughly, astrophysics, which effectively has 100% Open Access (OA) (free online access) already, Michael Kurtz too found that with free online access to everything, reference lists became (a little) shorter, not longer, i.e., people are citing (somewhat) fewer papers, not more, when everything is accessible to them. ... Are online and free online access broadening or narrowing research? They are broadening it by making all of it accessible to all researchers, focusing it on the best rather than merely the accessible, and accelerating it."
Gingras, Y., Lariviere, V. and Archambault, E. (2008), Literature Citations in the Internet Era (Letter), Science, Vol. 323, No. 5910, 2 January 2009, 36: "(Evans's) conclusions are not warranted by (the) data. ... In fact, Evans's conclusions only reflect a transient phenomenon related to recent access to online publications and to the fact that the method used does not take into account time delays between citation year and publication year."

Added 25 September 2008, updated 13 January 2009 Norris, M., Oppenheim, C., and Rowland, F.
The citation advantage of open-access articles (full text requires subscription; abstract only)
Journal of the American Society for Information Science and Technology, Vol. 59, No. 12, 2008, 1963-1972, published online: 9 July 2008
also available from Loughborough University Institutional Repository, 2009-01-12 http://hdl.handle.net/2134/4083
From the abstract: "Of a sample of 4,633 articles examined, 2,280 (49%) were OA and had a mean citation count of 9.04 whereas the mean for (toll access) TA articles was 5.76. There appears to be a clear citation advantage for those articles that are OA as opposed to those that are TA. This advantage, however, varies between disciplines, with sociology having the highest citation advantage, but the lowest number of OA articles, from the sample taken, and ecology having the highest individual citation count for OA articles, but the smallest citation advantage. Tests of correlation or association between OA status and a number of variables were generally found to be weak or inconsistent. The cause of this citation advantage has not been determined."
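As a rough illustration only (simple arithmetic on the two means quoted above, not the authors' methodology), the headline advantage can be expressed as a ratio:

```python
# Express the OA citation advantage reported by Norris et al. (2008)
# as a ratio of the two mean citation counts quoted in the abstract.
# Illustrative arithmetic only; the paper's own analysis is more detailed.
mean_oa = 9.04  # mean citation count, open-access articles
mean_ta = 5.76  # mean citation count, toll-access articles

advantage_ratio = mean_oa / mean_ta
print(f"OA/TA citation ratio: {advantage_ratio:.2f}")  # ~1.57
```

That is, on these raw means alone, the OA articles in the sample were cited roughly 1.6 times as often as the toll-access articles.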

Added 28 July 2008 Eger, A. (2008)
Database statistics applied to investigate the effects of electronic information services on publication of academic research - a comparative study covering Austria, Germany and Switzerland
GMS Medizin - Bibliothek - Information, June 26, 2008
Reports findings that increased usage of online full-text articles leads to increased publication, but says nothing about the effects of such access on citation practices.

Updated 25 September 2008 Norris, M., Oppenheim, C. and Rowland, F. (2008)
Open Access Citation Rates and Developing Countries
12th International Conference on Electronic Publishing (ElPub 2008), Toronto, June 25-27, 2008
"the admittedly small number of citations from authors in developing countries do indeed seem to show a higher proportion of citations being given to OA articles than is the case for citations from developed countries."

Added 25 September 2008 Sheikh Mohammad, S.
Research impact of open access contributions across disciplines
12th International Conference on Electronic Publishing (ElPub 2008), Toronto, June 25-27, 2008

Added 10 November 2008 Dietrich, J. P. (2008)
Disentangling visibility and self-promotion bias in the arXiv: astro-ph positional citation effect
arXiv.org, arXiv:0805.0307v2 [astro-ph], 25 Jun 2008, in Publications of the Astronomical Society of the Pacific, 120 (869): 801-804

Comment on this paper:
Harnad, S., Self-Promotion Bias in Arxiv Deposit Listings, Open Access Archivangelism blog, 5 August 2008: "The authors rightly point out that in a high-output field like astrophysics, visibility is an important factor in usage and citations, and authors need alerting and navigation aids based on importance, relevance and quality, rather than on random timing and author self-promotion biasses. I would add that in fields -- whether high- or low-output -- that, unlike astrophysics, are not yet OA, accessibility itself probably has much the same sort of effect on citations that visibility does in an OA field like astrophysics. (Even maximized visibility cannot make articles accessible to those who cannot afford access to the full-text.)"
Davis, P., The Importance of Being First, the scholarly kitchen blog, 7 August 2008: "an interesting paper indeed, and has its theoretical explanation in attention economics. Physicists, like everyone else, look to the order of one's paper as a heuristic for deciding what to read. This is no different than using other tools, like citations or references, as tools to guide scientists to what is important."
see also Suber, P., Visibility beyond open access, SPARC Open Access Newsletter, issue #87, July 2, 2005

Added 26 May 2008 Cheng, W. H. and Ren, S. L. (2008)
Evolution of open access publishing in Chinese scientific journals (full text requires subscription; abstract only)
Learned Publishing, Vol. 21, No. 2, April 2008, 140-152
From the abstract: "Citation indicators of OA journals were found to be higher than those of non-OA journals."

Added 28 July 2008 Harnad, S., Brody, T., Vallières, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Hajjem, C. and Hilf, E. R. (2008)
The Access/Impact Problem and the Green and Gold Roads to Open Access: An Update
Serials Review, Vol. 34, Issue 1, March 2008, 36-40, available online 6 March 2008
also available from ECS EPrints, 06 Jun 2008 http://eprints.ecs.soton.ac.uk/15852/
Update to the paper published in Serials Review, 30(4), 2004

Added 26 May 2008 Lokker, C., McKibbon, K. A., McKinlay, R.J., Wilczynski, N. L. and Haynes, R. B. (2008)
Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study
BMJ, 2008;336:655-657 (22 March), published 21 February 2008
"Conclusion: Citation counts can be reliably predicted at two years using data within three weeks of publication."

Comment on this paper:
Harnad, S. (2008), Predicting later citation counts from very early data, American Scientist Open Access Forum, 21 April 2008: "This finding reinforces the importance of taking into account as many predictor metrics as possible, though a number of the metrics do seem specific to clinical medical articles. ... We might perhaps make a distinction between static and dynamic metrics. This study was based largely on static metrics, in that they are fixed as of the day of publication. Dynamic metrics like early downloads (which have also been found to predict later citations) were not included, nor were early citation growth metrics (also predictive of later citations). ... To my mind, the article reinforces the importance of validating all these metrics, not just against one another, but against peer evaluations, in all fields, as in the RAE 2008 database"

Added 11 February 2008 Chu, H. and Krichel, T. (2008)
Downloads vs. Citations: Relationships, Contributing Factors and Beyond
E-LIS, 9 February 2008, in 11th Annual Meeting of the International Society for Scientometrics and Informetrics, Madrid, 25-27 June 2007
From the abstract: "In a nutshell, an infrastructure that encourages downloading at digital libraries would eventually lead to higher usage of their resources."

Added 26 May 2008 Turk, N. (2008)
Citation impact of Open Access journals (full text requires subscription; summary only)
New Library World, Vol. 109, No. 1/2, January/February 2008, 65-74
A review of the main research on the citation impact of Open Access journals, focused on LIS journals.

Added 26 May 2008 Hardisty, D. J. and Haaga, D. A. F. (2008)
Diffusion of Treatment Research: Does Open Access Matter? (pdf 39pp)
Center for the Decision Sciences, Columbia University, in Journal of Clinical Psychology, Vol. 64(7), 1-19 (2008)
From the abstract: "In a pair of studies, mental health professionals were given either no citation, a normal citation, a linked citation, or a free access citation and were asked to find and read the cited article. After one week, participants read a vignette on the same topic as the article and gave recommendations for an intervention. In both studies, those given the free access citation were more likely to read the article, yet only in one study did free access increase the likelihood of making intervention recommendations consistent with the article."

Added 11 February 2008 Kousha, K. and Thelwall, M. (2007)
The Web impact of open access social science research (full-text requires subscription; otherwise abstract only)
Library & Information Science Research, Volume 29, Issue 4, December 2007, 495-507, available online 15 October 2007
preprint http://www.scit.wlv.ac.uk/~cm1993/papers/OpenAccessSocialSciencePreprint.doc (.doc 12pp)
From the abstract: "The results suggest that new types of citation information and informal scholarly indicators could be extracted from the Web for the social sciences."

Added 17 December 2007 Dietrich, J. P. (2007)
The Importance of Being First: Position Dependent Citation Rates on arXiv:astro-ph
arXiv.org, arXiv:0712.1037v1 [astro-ph], 6 December 2007, in Publications of the Astronomical Society of the Pacific, 120 (864): 224-228, February 2008
From the abstract: "We study the dependence of citation counts of e-prints published on the arXiv:astro-ph server on their position in the daily astro-ph listing. ... cannot exclude that increased visibility at the top of the daily listings contributes to higher citation counts as well."

Comment on this paper:
Kurtz, M. J. (2007), Systematics in Citation Statistics: Implications for the "OA Advantage", American Scientist Open Access Forum, 14 December 2007: "shows that systematic increases in citation rates on the order of the 2 to 1 "OA advantage" can be obtained by other, systematic means, such as author bias, as pointed out by myself, Moed and others. Note that OA plays no role in the Dietrich paper, all the articles are from arXiv. "

Added 13 September 2007 Kurtz, M. J. and Henneken, E. A. (2007)
Open Access does not increase citations for research articles from The Astrophysical Journal
arXiv.org, arXiv:0709.0896v1 [cs.DL], 6 September 2007
Abstract: We demonstrate conclusively that there is no "Open Access Advantage" for papers from the Astrophysical Journal. The two to one citation advantage enjoyed by papers deposited in the arXiv e-print server is due entirely to the nature and timing of the deposited papers. This may have implications for other disciplines.

Comment on this paper:
Harnad, S. (2007), Where There's No Access Problem There's No Open Access Advantage, Open Access Archivangelism blog, September 7, 2007: "K & H suggest: "[T]here is no 'Open Access Advantage' for papers from the Astrophysical Journal" because "in a well funded field like astrophysics essentially everyone who is in a position to write research articles has full access to the literature." This seems like a perfectly reasonable explanation for K&H's findings. Where there is no access problem, OA cannot be the cause of whatever higher citation count is observed for self-archived articles. K&H conclude that "[t]his may have implications for other disciplines." It should be evident, however, that the degree to which this has implications for other disciplines depends largely on the degree to which it is true in other disciplines that "essentially everyone who is in a position to write research articles has full access to the literature."

Added 22 August 2007 Sotudeh, H. and Horri, A. (2007)
The citation performance of open access journals: A disciplinary investigation of citation distribution models (full-text subscribers only; no abstract)
Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2145-2156, published online August 17, 2007
From the conclusion: "To sum up, the similarity of the science system across OAJ and NOAJ boundaries has been confirmed. We see this as further evidence of OA's widespread recognition by scientific communities. However, because the magnitudes of the exponents found in this study are lower than what was previously observed for the whole system, OA may currently perform at a slightly lower level. According to the models used in this study, the citation distributions between fields are strongly disproportionate in Life Sciences and Engineering and Material Sciences, favoring larger fields in the former, but smaller fields in the latter. However, the distributions tend to be rather linear in the Natural Sciences."

Added 13 September 2007 Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A. (2007)
Incentivizing the Open Access Research Web: Publication-Archiving, Data-Archiving and Scientometrics
CTWatch Quarterly, Vol. 3, No. 3, August 2007

Added 22 August 2007 Lin, S.-K. (2007)
Editorial: Non-Open Access and Its Adverse Impact on Molecules
Molecules, 12, 1436-1437, 16 July 2007
The point of this short editorial is clear: the difference between the OA and non-OA content in the journal Molecules is reflected in higher citations for the former. The context could be clearer, however. The OA/non-OA history of the journal, especially prior to the period under review (2005-6), is not elaborated, and familiarity with the journal is assumed.

Added 22 August 2007 Taylor, D. (2007)
Looking for a Link: Comparing Faculty Citations Pre and Post Big Deals
Electronic Journal of Academic and Special Librarianship, v.8 no.1 (Spring 2007)
Note. The Big Deal is an arrangement in which a library or consortium of libraries subscribes to a larger package of a publisher's journals than it would have taken if subscribing to journals individually. Big Deals are claimed to improve access for an institution's users. "Pre Big Deal, the percentage of citations to journals that are part of Big Deals but were previously not subscribed to was an average of 2.6%. Post Big Deal this increased to an average of 6.1%." There is no analysis or comment on how this result might be affected if open access were taken into account.

Added 22 May 2007 Craig, I. D., Plume, A. M., McVeigh, M. E., Pringle, J. and Amin, M. (2007)
Do Open Access Articles Have Greater Citation Impact? A critical review of the literature
Publishing Research Consortium, undated (announced 17 May 2007), Journal of Informetrics, 1 (3): 239-248, July 2007

Comment on this paper:
Suber, P. (2007), Publishers doubt the OA impact advantage, Open Access News, 18 May 2007: "we shouldn't be surprised to see that good relevant literature that is easier to find and retrieve is cited more often than good relevant literature that is harder to find and retrieve. Or, if a careful study concluded that this view is false, then one might expect it to be more careful in summarizing the reasons why."
Harnad, S. (2007), Craig et al.'s review of the OA citation advantage, Open Access Archivangelism blog, 26 May 2007: "I also agree that not one of the studies done so far is without some methodological flaw that could be corrected. But it is also highly probable that the results of the methodologically flawless versions of all those studies will be much the same as the results of the current studies. That's what happens when you have a robust major effect, detected by virtually every study, and only ad hoc methodological cavils and special pleading to rebut each of them with. But I am sure those methodological flaws will not be corrected by these authors, because -- OJ Simpson's "Dream Team" of Defense Attorneys comes to mind -- Craig et al's only interest is evidently in finding flaws and alternative explanations, not in finding out the truth -- if it goes against their client's interests...
Iain D. Craig: Wiley-Blackwell
Andrew M. Plume, Mayur Amin: Elsevier
Marie E. McVeigh, James Pringle: Thomson Scientific"

Added 10 May 2007 Tonta, Y., Ünal, Y. and Al, U. (2007)
The Research Impact of Open Access Journal Articles
E-LIS, 30 April 2007, also in Proceedings ELPUB 2007, the 11th International Conference on Electronic Publishing, Vienna, 13-15 June 2007

Comment on this paper:
Harnad, S. (2007), OA citation impact study: No conclusions possible, American Scientist Open Access Forum, 1 May 2007: "No comparison was made with non-OA journals in the same fields. Hence it is impossible to say whether any of these differences have anything to do with OA. Fields no doubt differ in their average number of citations. Journals no doubt differ too, in subject matter, quality, and citation impact. And it is not clear whether the OA journals in each field are the top, medium or bottom journals, relative to the non-OA journals. No conclusions at all can be drawn from this study. The authors are encouraged to do the necessary controls."

Added 26 May 2008 Sharma, H. P. (2007)
Download plus citation counts - a useful indicator to measure research impact (correspondence, pdf 1pp)
Current Science, 92 (7): 873-873, April 10, 2007

Added 10 May 2007 Piwowar, H. A., Day, R. S. and Fridsma, D. B. (2007)
Sharing Detailed Research Data Is Associated with Increased Citation Rate
PLoS ONE, March 21, 2007
Principal Findings: "We examined the citation history of 85 cancer microarray clinical trial publications with respect to the availability of their data. The 48% of trials with publicly available microarray data received 85% of the aggregate citations. Publicly available data was significantly (p = 0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin using linear regression."

Comment on this paper:
Suber, P. (2007), Open data and citation impact, Open Access News, 28 March 2007: " Many studies have shown a correlation between OA articles and citation impact. I believe this is the first study to document a similar correlation between OA data and citation impact. Spread the word to colleagues who are still hoarding data, waiting too long before releasing it, or unable to see any gain for themselves in data sharing."

Added 8 March 2007 Bergstrom, T. C. and Lavaty, R. (2007)
How often do economists self-archive?
eScholarship Repository, University of California, February 8, 2007

Comment on this paper:
Harnad, S. (2007), How often do economists self-archive? (fwd), American Scientist Open Access Forum, 9 February 2007: "An important paper"

Added 26 May 2008 Chapman, S., Nguyen, T. N. and White, C. (2007)
Press-released papers are more downloaded and cited (full text requires subscription; extract only)
Tobacco Control, 16 (1): 71-71, February 2007

Added 22 January 2007 Harnad, S. and Hajjem, C. (2007)
The Open Access Citation Advantage: Quality Advantage Or Quality Bias?
Author blog, Open Access Archivangelism, 21 January 2007
Does the OA Advantage (OAA) occur because authors are more likely to self-selectively self-archive articles that are more likely to be cited (self-selection "Quality Bias": QB), or because articles that are self-archived are more likely to be cited ("Quality Advantage": QA)? Preliminary evidence based on over 100,000 articles from multiple fields, comparing self-selected self-archiving with mandated self-archiving to estimate the contributions of QB and QA to the OAA shows: "Both factors contribute, and the contribution of QA is greater." Includes comment on Moed, H. (2006), The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section.

Comment on this paper:
Laloe, F. and Harnad, S., Re: Quality Bias vs Quality Advantage, SIGMETRICS, see installments of this discussion on 20, 21 and 22 February 2007

Added 17 January 2007 Harnad, S. (2007)
Citation Advantage For OA Self-Archiving Is Independent of Journal Impact Factor, Article Age, and Number of Co-Authors
Author blog, Open Access Archivangelism, 17 January 2007
Further comment on Eysenbach, G. (2006), Citation Advantage of Open Access Articles: "The OA-self-archiving advantage remains a robust, independent factor."

Added 17 January 2007 Brody, T. (2007)
Evaluating Research Impact through Open Access to Scholarly Communication
PhD, Electronics and Computer Science, University of Southampton, May 2006, in ECS EPrints, 14 January 2007

Added 8 March 2007 McDonald, J. D. (2007)
Understanding Online Journal Usage: A Statistical Analysis of Citation and Use
Journal of the American Society for Information Science & Technology, 58(1): 39-50, January 1, 2007, also in Caltech Library System Papers and Publications, 18 May 2006

Added 28 July 2008 Knowlton, S. A. (2007)
Continuing use of print-only information by researchers
J Med Libr Assoc., 95(1): 83-88, January 2007
"to study the question, "Are researchers still accessing and using material issued only in print?," a group of journals was selected, and the impact factor of each was tracked over the period 1993-2003.
Conclusion: the online status of a journal is not sufficient to override all other considerations by researchers when they choose which material to cite."

Added 26 May 2008 Walters, G. D. (2006)
Predicting subsequent citations to articles published in twelve crime-psychology journals: Author impact versus journal impact (full text requires subscription; abstract only)
Scientometrics, 69 (3): 499-510, December 2006
"These results suggest that author impact may be a more powerful predictor of citations received by a journal article than the periodical in which the article appears."

Added 23 November 2006 Harnad, S. (2006)
The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?
Author blog, Open Access Archivangelism, 20 November 2006

Added 19 November 2006 Moed, H. F. (2006)
The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section
ArXiv, Computer Science, cs.DL/0611060, 14 November 2006, in Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2047-2054, published online August 30, 2007 http://dx.doi.org/10.1002/asi.20663 (subscriber access only to full text)
"This article statistically analyses how the citation impact of articles deposited in the Condensed Matter section of the preprint server ArXiv, and subsequently published in a scientific journal, compares to that of articles in the same journal that were not deposited in that archive. Its principal aim is to further illustrate and roughly estimate the effect of two factors, 'early view' and 'quality bias', upon differences in citation impact between these two sets of papers ... The analysis provided evidence of a strong quality bias and early view effect. Correcting for these effects, there is in a sample of 6 condensed matter physics journals studied in detail, no sign of a general 'open access advantage' of papers deposited in ArXiv. The study does provide evidence that ArXiv accelerates citation, due to the fact that ArXiv makes papers available earlier rather than that it makes papers freely available."

Comment on this paper:
Harnad, S. (2007), The Open Access Citation Advantage: Quality Advantage Or Quality Bias?, Open Access Archivangelism, 21 January 2007
Moed, H. (2006) reply on OA to sigmetrics, SIGMETRICS list server, 8 December 2006: On Quality bias. (Harnad) wrote: "The fact that highly-cited articles (Kurtz) and articles by highly-cited authors (Moed) are more likely to be Arxived certainly does not settle the question of cause and effect: It is just as likely that better articles benefit more from Arxiving (QA) as that better authors/articles tend to Arxive/be-Arxived more (QB)". Citation rates may be influenced both by the 'quality' of the papers and by the access modality (deposited versus non-deposited). This is why I estimated author prominence on the basis of the citation impact of their non-archived articles only. But even then I found evidence that prominent, influential authors (in the above sense) are overrepresented in papers deposited in ArXiv. But I did more than that. I calculated Arxiv Citation Impact Differentials (CID) at the level of individual authors. Next, I calculated the median CID over authors publishing in a journal. How then do you explain my empirical finding that for some authors the CID is positive, for others it is negative, while the median CID over authors does not significantly differ from zero (according to a Sign test) for all journals studied in detail except Physical Review B, for which it is only 5 per cent? If there is a genuine 'OA advantage' at stake, why then does it for instance not lead to a significantly positive median CID over authors? Therefore, my conclusion is that, controlling for quality bias and early view effect, in the sample of 6 journals analysed in detail in my study, there is no sign of a general 'open access advantage' of papers deposited in ArXiv's Condensed Matter Section. ... I hope that more case studies will be carried out in the near future, applying the methodologies I proposed in my paper.
Harnad, S. (2006) The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?, Open Access Archivangelism, 20 November 2006

Added 13 September 2007 Bollen, J. and Van de Sompel, H. (2006)
Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics
arXiv.org, arXiv:cs/0610154v2 [cs.DL], 26 October 2006, in Journal of the American Society for Information Science and Technology, 59 (1): 136-149, January 1, 2008

Updated 23 November 2006 Mayr, P. (2006)
Constructing experimental indicators for Open Access documents
E-LIS, 05 October 2006, in Research Evaluation, special issue on 'Web indicators for Innovation Systems', Vol. 15, No. 2, 1 August 2006, 127-132
Author preprint, http://www.ib.hu-berlin.de/~mayr/arbeiten/mayr_RE06.pdf (pdf 9pp)

Added 19 November 2006 Henneken, E. A., Kurtz, M. J., Warner, S., Ginsparg, P., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2006)
E-prints and Journal Articles in Astronomy: a Productive Co-existence
ArXiv, Computer Science, cs.DL/0609126, 22 September 2006, in Learned Publishing, Vol. 20, No. 1, January 2007, 16-22

Comment and discussion:
Harnad, S. (2006) The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?, Open Access Archivangelism, 20 November 2006
Harnad, S. (2006) The Special Case of Astronomy, Open Access Archivangelism, 14 October 2006

Added 28 July 2008 Jacsó, P. (2006)
Open Access to Scholarly Full Text Documents (pdf 8pp)
Online Information Review, 30(5) 2006, 587-594

Added 19 November 2006 Zhang, Y. (2006)
The Effect of Open Access on Citation Impact: A Comparison Study Based on Web Citation Analysis (abstract only)
Libri, September 2006 (Full text for subscribers)

Added 09 March 2010 Kurtz, M. and Brody, T. (2006)
The impact loss to authors and research
e-Prints Soton, 12 July 2006, in Jacobs, N. (ed.), Open Access: Key strategic, technical and economic aspects (Oxford, UK: Chandos Publishing)

Added 03 August 2006 Metcalfe, T. S. (2006)
The Citation Impact of Digital Preprint Archives for Solar Physics Papers
Solar Physics, Vol. 239, No. 1-2, December 2006, pp. 549-553
also in ArXiv, Astrophysics, astro-ph/0607079, 5 July 2006 http://arxiv.org/abs/astro-ph/0607079
"Most astronomers now use the arXiv.org server (astro-ph) to distribute preprints, but the solar physics community has an independent archive hosted at Montana State University. For several samples of solar physics papers published in 2003, I quantify the boost in citation rates for preprints posted to each of these servers. I show that papers on the MSU archive typically have citation rates 1.7 times higher than the average of similar papers that are not posted as preprints, while those posted to astro-ph get 2.6 times the average. A comparable boost is found for papers published in conference proceedings, suggesting that the higher citation rates are not the result of self-selection of above-average papers."

Added 03 August 2006 Henneken, E. A., Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Thompson, D., and Murray, S. S. (2006)
Effect of E-printing on Citation Rates in Astronomy and Physics
Journal of Electronic Publishing, Vol. 9, No. 2, Summer 2006, also in ArXiv, Computer Science, cs.DL/0604061, v2, 5 June 2006 http://arxiv.org/abs/cs/0604061
"It has been observed that papers that initially appear as arXiv e-prints get cited more than papers that do not. Using the citation statistics from the NASA-Smithsonian Astrophysics Data System, we confirm the findings from other studies, we examine the average citation rate to e-printed papers in the Astrophysical Journal, and we show that for a number of major astronomy and physics journals the most important papers are submitted to the arXiv e-print repository first."

Comment and discussion:
Harnad, S. (2006) The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?, Open Access Archivangelism, 20 November 2006
Harnad, S. (2006) The Special Case of Astronomy, Open Access Archivangelism, 14 October 2006

Added 03 August 2006 Kousha, K. and Thelwall, M. (2006)
Google Scholar Citations and Google Web/URL Citations: A Multi-Discipline Exploratory Analysis
E-LIS, 05 June 2006, also in Proceedings International Workshop on Webometrics, Informetrics and Scientometrics & Seventh COLLNET Meeting, Nancy (France), May 2006
"we built a sample of 1,650 articles from 108 Open Access (OA) journals published in 2001 in four science and four social science disciplines. We recorded the number of citations to the sample articles using several methods based upon the ISI Web of Science, Google Scholar and the Google search engine (Web/URL citations). For each discipline, we found significant correlations between ISI citations and both Google Scholar and Google Web/URL citations; with similar results when using total or average citations, and when comparing within and across (most) journals."
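The rank correlations such studies report between citation sources can be illustrated as follows. The numbers are invented and the helper functions are ad hoc, not the study's actual methodology, but they show how a Spearman correlation between, say, ISI and Google Scholar citation counts for the same articles is computed.

```python
# Illustrative sketch: Spearman rank correlation between two citation
# counts for the same set of articles. Numbers are invented.
from statistics import mean

def rank(xs):
    """Ranks starting at 1, with average ranks for ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        r = (i + j) / 2 + 1          # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = r
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical citation counts for 8 articles from two sources
isi = [30, 12, 7, 0, 45, 3, 19, 8]
scholar = [41, 15, 11, 1, 60, 2, 25, 9]
print(round(spearman(isi, scholar), 3))
```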

Added 16 May 2006 Eysenbach, G. (2006)
Citation Advantage of Open Access Articles
PLoS Biology, Volume 4, Issue 5, May 2006
Further evidence for the OA citation advantage, although quite critical of other studies with which its findings broadly agree. This example is based on a small, single journal sample (PNAS: Proceedings of the National Academy of Sciences). Since PNAS offers authors the choice of paying to provide open access to published papers and/or freely self-archiving, a 'Secondary analysis' considers the relative impact of each type of OA, although the number of papers involved is really too small to give this result the weight of the broader findings. The paper is accompanied by two editorials, one in the publishing journal, the other a self-published editorial by the author:
MacCallum, C. J. and Parthasarathy, H. (2006) Editorial: Citation Advantage of Open Access Articles, PLoS Biology, Volume 4, Issue 5, May 2006
Eysenbach, G. (2006) The Open Access Advantage, Journal of Medical Internet Research, 2006;8(2):e8

Comment on this paper:
Harnad, S. on Chronicle of Higher Education News Blog on this paper, 15 May 2006: "The Eysenbach study is certainly not "the first to compare open-access and non-open-access papers from the same journal". See [this bibliography]"
Harnad, S. eLetter, PLoS, Pipe-Dreams and Peccadillos, on the PLoS editorial, 16 May 2006: "There can be disagreement about what evidence one counts as "solid," but there can be little dispute that prior evidence derived from substantially larger and broader-based samples showing substantially the same outcome can hardly be described as "surprisingly hard to find". The only new knowledge from this small, journal-specific sample was (1) the welcome finding of how early the OA advantage can manifest itself, plus (2) some less clear findings about differences between first- and last-author OA practices, plus (3) a controversial finding that will most definitely need to be replicated on far larger samples in order to be credible: "The analysis revealed that self-archived articles are also cited less often than OA articles from the same journal.""
Author's response eLetter, PLoS, 17 May 2006: "None of the previous papers in [this bibliography] employed a similar methodology, working with data from a "gold-OA" journal. ... Regarding "larger samples" I think rigor and quality (leading to internal validity) is more important than quantity (or sample size). Going through the laborious effort to extract article and author characteristics for a limited number of articles (n=1492) in order to control for these confounders provides scientifically stronger evidence than doing a crude, unadjusted analysis of a huge number of online accessible vs non-online accessible articles, leaving open many alternative explanations. ... contrary to what Harnad said, this study is NOT at all "showing substantially the same outcome". On the contrary, the effect of green-OA -- once controlled for confounders -- was much less than what others have claimed in previous papers."
Harnad, S. blog, Confirming the Within-Journal OA Impact Advantage, 18 May 2006: "with the large, consistent within-journal OA/NOA differences found across all journals, all disciplines and all years in samples four orders of magnitude larger than Eysenbach's, it is not at all clear that controls for those "multiple confounders" are necessary in order to demonstrate the reality, magnitude and universality of the OA advantage. That does not mean the controls are not useful, just that they are not yet telling us much that we don't already know. ... the true measure of the SOA (Self-Archived OA) advantage today (at its 15% spontaneous baseline) is surely not to be found in PNAS but in the statistically far more numerous, hence far more representative full-spectrum of journals that do not yet offer POA (Payed OA). (I would be delighted if those journals took the Eysenbach findings as a reason for offering a POA option! But not at the expense of authors drawing the absurd conclusion -- not at all entailed by Eysenbach's PNAS-specific results -- that in the journals they currently publish in, SOA alone would not confer citation advantages at least as big as the ones we have been reporting.)"
Author's response WebCite blog, The OA debate between an "archivangelist" and an OA researcher, 24 May 2006: ""Confounded" associations between two variables which falsely suggest causality can ONLY be ruled out if one controls for the confounder no matter how "strong, consistent" the effect appears ... Our "events" (citations) are determined by many different "causes", of which "access" is only one variable - many other variables, including confounders, have to be taken into account."
Harnad, S. blog, The Epidemiology of OA, 26 May 2006: considers each of Eysenbach's list of confounders, which ones might have an effect and which ones will be tested. "Stay tuned..."
Suber, P. SPARC Open Access Newsletter, issue #98, Gunther Eysenbach confirms the OA impact advantage, 2 June 2006: "There's some controversy about whether some earlier results, especially by Tim Brody, Chawki Hajjem, and Stevan Harnad, are the same or only similar to some of Eysenbach's results. But no one doubts that Eysenbach has new and valid results, or that he has persuasively advanced the case that OA helps authors and journals build their citation impact. While there have been many previous studies of the OA impact advantage, none has made the splash that Eysenbach's has."
Davis, P. Citation advantage of Open Access articles likely explained by quality differential and media effects, PLoS Biology, Responses, 16 January 2007: "The fact that OA articles were more likely to be featured on the front cover of PNAS and covered by the media suggests that other causal explanations may explain the OA advantage. Open Access may be a result - not a cause of - a quality differential which is amplified by the media. While Eysenbach's attempt to control other explanatory variables was excellent, what is needed are true randomized controlled studies of OA publishing."
Harnad, S. blog, Citation Advantage For OA Self-Archiving Is Independent of Journal Impact Factor, Article Age, and Number of Co-Authors, 17 January 2007: "Chawki Hajjem has now done a multiple regression analysis jointly testing (1) article age, (2) journal impact factor, (3) number of authors, and (4) OA self-archiving as separate factors for 442,750 articles in 576 (biomedical) journals across 11 years, and has shown that each of the four factors contributes an independent, statistically significant increment to the citation counts. The OA-self-archiving advantage remains a robust, independent factor."
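The kind of multiple-regression check described in that exchange can be sketched on synthetic data. Everything below is illustrative: the coefficients, distributions and variable names are assumptions, not Hajjem's dataset or code. The point is only that an OA dummy can be estimated as an independent increment alongside article age, journal impact factor and number of co-authors.

```python
# Illustrative sketch of the kind of multiple regression described
# above: citations ~ age + impact_factor + n_authors + OA.
# Synthetic data with an OA increment of 4 citations built in.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(1, 12, n)            # years since publication
jif = rng.gamma(2.0, 1.5, n)            # journal impact factor
n_auth = rng.integers(1, 10, n)         # number of co-authors
oa = rng.random(n) < 0.15               # ~15% self-archived
cites = (1.5 * age + 2.0 * jif + 0.3 * n_auth + 4.0 * oa
         + rng.normal(0, 3, n))         # noisy citation counts

# Ordinary least squares via the normal equations (lstsq)
X = np.column_stack([np.ones(n), age, jif, n_auth, oa.astype(float)])
beta, *_ = np.linalg.lstsq(X, cites, rcond=None)
print(dict(zip(["const", "age", "jif", "n_authors", "oa"],
               beta.round(2))))
```

Because the OA effect was simulated as independent of the other factors, its fitted coefficient stays close to the built-in increment even with age, impact factor and author count in the model, which is the sense in which a regression "controls for" confounders.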

Added 15 March 2006 Davis, P. M. and Fromerth, M. J. (2006)
Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles? (pdf 12pp)
draft manuscript, ArXiv.org, cs.DL/0603056, 14 March 2006, Scientometrics, Vol. 71, No. 2 (May 2007)

Comment on this paper:
Authors Self-Selection bias "While our study confirms the same citation advantage reported by others, it does not attribute Open Access as the cause of more citations, but to Self-Selection. Open Access therefore may be a result, not a cause, of authors promoting higher-quality work."
Liblicense, 14 March 2006

Harnad, S. OAA a causal factor "I think your results are very interesting, but I don't think they have shown that the OA citation advantage (OAA) is all or mostly a self-selection Quality Bias (QB) correlate, rather than being causal. It is still quite plausible that the OAA is a genuine causal factor, but that it has a bigger effect on the high quality/citation end."
SIGMETRICS, 14 March 2006
Qualifying statements: Authors "I am however, troubled by individuals who make universal and unqualified statements like, "Open Access increases citations by 50-250%!"" Harnad, S. "I am such an individual, and I hold by that statement" Authors "The more precise answer is much more subtle, but I understand that a statement like, "open access may provide some citation benefit, but only for prestigious authors who publish in prestigious journals and whose article is already highly-cited", doesn't sound as convincing to administrators and policy makers." Harnad, S. "I don't think it is a correct statement of the thrust of the current body of findings on the OAA. It is merely your interpretation of the result of your own study in 4 maths journals!"
SIGMETRICS, 15 March 2006

Causation or association?: Antelman, K. "Data I collected for philosophy, political science, engineering and mathematics do not support this hypothesis that OA causes more citations for better articles only ... at that time I had not looked at the distribution of OA and non-OA articles by citations. Graphs of those results are posted at http://www.lib.ncsu.edu/staff/kantelman/OA_by_citations.xls. These data show OA citation advantage across all articles with more than zero citations." Authors "The data that Kristin illustrates do not show causation, only association." Harnad, S. "The data Phil illustrates likewise do not show causation, only association. ... More fine-tuned causal tests are needed to decide."
Liblicense, from 20 March 2006

Added 16 March 2006 Harnad, S. (2006)
OA Impact Advantage = EA + (AA) + (QB) + QA + (CA) + UA
Author eprint, 14 March 2006, ECS EPrints repository, School of Electronics and Computer Science, University of Southampton

Added 03 August 2006 Mueller, P. S., Murali, N. S., Cha, S. S., Erwin, P. J. and Ghosh, A. K. (2006)
The effect of online status on the impact factors of general internal medicine journals
Netherlands Journal of Medicine, 64 (2): 39-44, February 2006
"becoming available online as FUTON (full text on the Net) is associated with a significant increase in journal impact factor."

Added 30 December 2005 Hajjem, C., Harnad, S. and Gingras, Y. (2005)
Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases Research Citation Impact (pdf 8pp)
IEEE Data Engineering Bulletin, Vol. 28 No. 4, December 2005
also Author eprint, 16 December 2005 http://eprints.ecs.soton.ac.uk/11688/
"In 2001, Lawrence found that articles in computer science that were openly accessible (OA) on the Web were cited substantially more than those that were not. We have since replicated this effect in physics. To further test its cross-disciplinary generality, we used 1,307,038 articles published across 12 years (1992-2003) in 10 disciplines (Biology, Psychology, Sociology, Health, Political Science, Economics, Education, Law, Business, Management). The overall percentage of OA (relative to total OA + NOA) articles varies from 5%-16% (depending on discipline, year and country) and is slowly climbing annually. Comparing OA and NOA articles in the same journal/year, OA articles have consistently more citations, the advantage varying from 25%-250% by discipline and year."

Added 30 December 2005 Hajjem, C., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2005)
Open Access to Research Increases Citation Impact (.doc 12pp)
Author eprint, 16 December 2005, Technical Report, Institut des sciences cognitives, Université du Québec à Montréal

Added 30 December 2005 Sahu, D.K., Gogtay, N.J. and Bavdekar, S.B. (2005)
Effect of open access on citation rates for a small biomedical journal
Author eprint, December 1, 2005, in Fifth International Congress on Peer Review and Biomedical Publication, Chicago, September 16-18, 2005
"We assessed the influence of OA on citation rates for a small, multi-disciplinary journal which adopted OA without article submission or article access fee. DESIGN The full text of articles published since 1990 were made available online in 2001. Citations for these articles as retrieved using Web of Science, SCOPUS, and Google Scholar were divided into two groups - the pre-OA period (1990-2000) and the post-OA period (2001-2004). CONCLUSIONS Open access was associated with an increase in the number of citations received by the articles. It also decreased the lag time between publication and the first citation. For smaller biomedical journals, OA could be one of the means for improving visibility and thus citation rates."

Added 27 September 2005 Zhao, D. (2005)
Challenges of scholarly publications on the Web to the evaluation of science -- A comparison of author visibility on the Web and in print journals (abstract only)
Information Processing and Management, 41:6, 1403-1418, December 2005
Compares author visibility between the Web and print journals as revealed from citation analysis based on a search for the term "XML" or "eXtensible Markup Language" using NEC Research Institute's CiteSeer, the entire ISI Science Citation Index (SCI) database, and journals indexed and classified in SCI as representing computer science research. The main finding: "The author ranking by number of citations that resulted from CiteSeer data is highly correlated with that obtained from SCI." That is, the study compares not OA vs. non-OA impact but Web vs. print-journal visibility, and finds that authors, notably the top authors, are self-archiving and publishing papers in both places.

Added 11 February 2008 Coats, A. J. S. (2005)
Top of the charts: download versus citations in the International Journal of Cardiology (full-text requires subscription; otherwise abstract only)
International Journal of Cardiology, Volume 105, Issue 2, 2 November 2005, 123-125, available online 7 October 2005
From the abstract: "We have recorded the 10 top cited articles over a 12-month period and compared them to the 10 most popular articles being downloaded over the same time period. The citation-based listing included basic and applied, observational and interventional original research reports. For downloaded articles, which have shown a dramatic increase for the International Journal of Cardiology from 48,000 in 2002 to 120,000 in 2003 to 200,000 in 2004, the most popular articles over the same period are very different and are dominated by up-to-date reviews of either cutting-edge topics (such as the potential of stem cells) or of the management of rare or unusual conditions. There is no overlap between the two lists despite covering exactly the same 12-month period and using measures of peer esteem. Perhaps the time has come to look at the usage of articles rather than, or in addition to, their referencing."

Added 13 July 2005 Adams, J. (2005)
Early citation counts correlate with accumulated impact (abstract only)
Scientometrics, 63 (3): 567-581, June 2005
Working towards earlier prediction of impact. This paper is not OA and has just appeared but was written before Brody et al. (2005) revealed a correlation to predict impact from even earlier data, i.e. download data for OA papers, before any citations.

Added 26 September 2005 Moed, H. F. (2005)
Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal (abstract only)
Journal of the American Society for Information Science and Technology, 56(10): 1088-1097, published online 31 May 2005
"Statistical relationships between downloads from ScienceDirect of documents in Elsevier's electronic journal Tetrahedron Letters and citations to these documents recorded in journals processed by the Institute for Scientific Information (ISI) for the Science Citation Index (SCI) are examined. ... Findings suggest that initial downloads and citations relate to distinct phases in the process of collecting and processing relevant scientific information that eventually leads to the publication of a journal article." Does not investigate open access sources. Notes the need for caution in drawing conclusions on the frequency of paper downloads from formal citation patterns, and vice versa.

Updated 05 October 2005 Vaughan, L. and Shaw, D. (2005)
Web citation data for impact assessment: A comparison of four science disciplines (abstract only)
Journal of the American Society for Information Science and Technology, Vol. 56, No. 10, 1075 - 1087, published online 27 May 2005
appears to be an expansion of Can Web Citations be a Measure of Impact? An Investigation of Journals in the Life Sciences (abstract only)
ASIST 2004: Proceedings of the 67th ASIS&T Annual Meeting, Vol. 41 (Medford, USA: Information Today), pp. 516-526

Brody, T., Harnad, S. and Carr, L. (2005)
Earlier Web Usage Statistics as Predictors of Later Citation Impact
Author eprint, 18 May 2005, University of Southampton, School of Electronics and Computer Science, Journal of the American Society for Information Science and Technology, Volume 57, Issue 8, 2006, 1060-1072 (abstract)

Added 19 May 2005 Wren, J. D. (2005)
Open access and openly accessible: a study of scientific publications shared via the internet
BMJ, 330:1128, 12 April 2005

BMJ Rapid Responses to this article:
Carr, L., 15 April 2005 Open Access Misdefined "by misdefining Open Access this paper somehow ignores both the phenomenon that is being measured (Open Access-ibility) and the significant research that has risen around it."
Chan, C-h. and Ng, D. K., 13 May 2005 Technological problems in this study: PDF "The selection of only PDF reprint in this study made this correlation (between the time since initial publication and the probability of availability of reprint in non journal website) a little bit artificial."
All Rapid Responses

Wren's article also prompted this editorial
Suber, P. (2005)
Open access, impact, and demand
BMJ, 330:1097-1098, 14 May 2005

Added 13 April 2005 Belew, R. (2005)
Scientific impact quantity and quality: Analysis of two sources of bibliographic data (pdf 12pp)
Arxiv.org, cs.IR/0504036, 11 April 2005

Added 28 July 2008 De Groote, S. L., Shultz, M. and Doranski, M. (2005)
Online journals' impact on the citation patterns of medical faculty
J Med Libr Assoc., 93 (2): 223-228, April 2005
From the conclusion: "It is possible that electronic access to information (i.e., online databases) has had a positive impact on the number of articles faculty will cite. Results of this study suggest, at this point, that faculty are still accessing the print-only collection, at least for research purposes, and are therefore not sacrificing quality for convenience."

Added 03 August 2006 Metcalfe, T. S. (2005)
The Rise and Citation Impact of astro-ph in Major Journals
ArXiv, Astrophysics, astro-ph/0503519, 23 March 2005
"I describe a simple method to determine the adoption rate and citation impact of astro-ph over time for any journal using NASA's Astrophysics Data System (ADS). I use the ADS to document the rise in the adoption of astro-ph for three major astronomy journals, and to conduct a broad survey of the citation impact of astro-ph in 13 different journals. I find that the factor of two boost in citations for astro-ph papers is a common feature across most of the major astronomy journals."

Updated 13 April 2005 Ongoing studies Hajjem, C. (2004-05)
Cover page for the range of studies highlighted below, Laboratoire de recherche en Sciences Cognitives, UQAM. (Text in French but graphs "self-explanatory"; see this comment for elaboration)

Updated 26 September 2005 Bollen, J., Van de Sompel, H., Smith, J. and Luce, R. (2005)
Toward alternative metrics of journal impact: A comparison of download and citation data (pdf 34pp)
Arxiv.org, cs.DL/0503007, 03 March 2005, in Information Processing and Management, 41(6): 1419-1440, December 2005

Added 5 January 2005 Ongoing study Brody, T., et al.
Citation Impact of Open Access Articles vs. Articles Available Only Through Subscription ("Toll-Access")
with downloadable graphs of '% Articles OA' and '% OA Advantage' by discipline and sub-discipline

Updated 31 January 2005 Schwarz, G. and Kennicutt Jr., R. C. (2004)
Demographic and Citation Trends in Astrophysical Journal Papers and Preprints (pdf 14pp)
Arxiv.org, astro-ph/0411275, 10 November 2004, Bulletin of the American Astronomical Society, Vol. 36, 1654-1663
See also a note from AAS Pub Board meeting, Tucson, November 3-4 2003
"Greg Schwarz (from the ApJ editorial office) reported some work he's doing tracking citation rates of papers published in the ApJ based on whether they were posted on astro-ph or not: ApJ papers that were also on astro-ph have a citation rate that is _twice_ that of papers not on the preprint server"
http://listserv.nd.edu/cgi-bin/wa?A2=ind0311&L=pamnet&D=1&O=D&P=1632

Added 03 August 2006 Havemann, F. (2004)
Eprints in der wissenschaftlichen Kommunikation (Eprints in scientific communication)
Author eprint, 26 October 2004, presented at the Institute of Library Science, Humboldt University, Berlin, June 1, 2004
"the use of eprints can significantly accelerate the scientific communication. This was demonstrated by me with a small sample of articles in theoretical High Energy Physics published 1998 and 1999 in Physical Review D. Typically the eprints in this sample are available eight months before the printed issue is published. Three quarters of them are cited in eprints authored by other researchers before the journal issue appears (among them all highly cited eprints)."

Brody, T. (2004)
Citation Analysis in the Open Access World
Author eprint, October 4, 2004, in Interactive Media International

Added 9 November 2004 McVeigh, M. E. (2004)
Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns
Thomson Scientific, October 2004

Added 29 September 2004 Antelman, K. (2004)
Do Open-Access Articles Have a Greater Research Impact?
College and Research Libraries, 65(5):372-382, September 2004
also Author eprint, E-LIS, 29 September 2004, http://eprints.rclis.org/archive/00002309/

Review of this article:
Lewis, S. P. (2006) Open Access Articles Have a Greater Research Impact Than Articles Not Freely Available, Evidence Based Library and Information Practice, 2006, 1:3

Added 17 May 2010 Comment on this paper:
Davis, P., Do Open-Access Articles Really have a Greater Research Impact? Letter to the editor, College & Research Libraries, Vol. 67, No. 2, March 2006: "The study of citation behavior is complex and involves multiple confounding, and interacting variables. Methodologically, it is very difficult to distinguish whether Open Access is an explanatory cause of increased access, or whether it is merely an artifact of other causal explanations such as article duplication or self-promotion. Do Open Access articles really have a greater research impact, as Antelman suggests? Yes, but Open Access may not be the cause."
Author's response: "While I intentionally phrased my conclusion as an association, rather than a causation (open-access articles have a greater research impact than articles that are not freely available), there clearly is an implied causation and I should have been more explicit that the data do not support that. The article was very much a product of its time, however, when there was little solid data that there even was an association between open access and increased citations. ... Since I did the study in C&RL, I have collected additional data that indicate that quality bias is real and significant, at least in the social sciences."
Davis, P.: In 2005, Wren conducted a massive automated study of the availability of author reprints on the public web. He reported two main conclusions: that articles available freely online yielded more citations; and that there was a high degree of association between high-prestige journals and frequency of author reprints. Journals with high Impact Factors (New England Journal of Medicine, Nature, Science, and Cell) were associated with a higher degree of author republishing than lower-impact journals. Wren went further to discuss possible causes of this difference and briefly discusses a "trophy effect", the desire for researchers to display their accomplishments, which would explain why high impact publications are more common online. This is consistent with Antelman's findings, that the greatest impact of open access is with the most-cited articles.
Antelman, K.: Wren's study needs to be looked at carefully, however, because he did not look at the source of the open access copies he found, so the extent of "trophy effect" self-archiving cannot be assessed from his data. I also collected some data on the source of open access copies from three of the high-impact journals he looked at (NEJM, Science and Nature) and, while many articles from those journals are freely available online, many or most are not posted by the authors themselves, in particular NEJM, where only 12% of the open access articles were posted by authors or their institutions.

Updated 5 January 2005 Harnad, S., Brody, T., Vallieres, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Stamerjohanns, H. and Hilf, E. (2004)
The Access/Impact Problem and the Green and Gold Roads to Open Access
Author eprint, 15 September 2004, in Serials Review, Vol. 30, No. 4, 310-314 (free access to published version during 2005)
Shorter version: The green and the gold roads to Open Access
Nature, Web Focus: access to the literature, May 17, 2004

Updated 26 September 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S. (2004b)
The Effect of Use and Access on Citations
Author eprint, September 2004, in Information Processing and Management, 41 (6): 1395-1402, December 2005

Comment and discussion:
Moed, H. (2006) reply on OA to sigmetrics, SIGMETRICS list server, 8 December 2006
Harnad, S. (2006) The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?, Open Access Archivangelism, 20 November 2006
Harnad, S. (2006) The Special Case of Astronomy, Open Access Archivangelism, 14 October 2006

Perneger, T. V. (2004)
Relation between online "hit counts" and subsequent citations: prospective study of research papers in the BMJ
BMJ, 329:546-547, 4 September 2004

BMJ Rapid Responses to this article:
Harnad, S. and Brody, T., 6 September 2004 Prior evidence that downloads predict citations "confirms what Tim Brody's online usage/citation correlator has been demonstrating for several years now across a number of areas in physics and mathematics.
All Rapid Responses

Prakasan, E. R. and Kalyane, V. L. (2004)
Citation analysis of LANL High-Energy Physics E-Prints through Science Citation Index (1991-2002)
Author eprint, E-LIS, 26 August 2004

Added 13 April 2005 Murali, N. S., Murali, H. R., Auethavekiat, P., Erwin, P. J., Mandrekar, J. N., Manek, N. J. and Ghosh, A. K. (2004)
Impact of FUTON and NAA Bias on Visibility of Research
Mayo Clinic Proceedings, Vol. 79, No. 8, 1001-1006, August 2004
Notes and comment: FUTON = full text on the Net; NAA = no abstract available
This is not an article on how Open Access increases impact but on how *Online* Access increases impact. The effects are related, but one is a licensing effect, not an OA effect.

Added 10 May 2007 Davis, P. M. (2004)
For Electronic Journals, Total Downloads Can Predict Number of Users
portal: Libraries and the Academy, Vol. 4, No. 3, July 2004, 379-392

Harnad, S. and Brody, T. (2004a)
Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals
D-Lib Magazine, Vol. 10 No. 6, June 2004
Replicates the Lawrence effect -- OA increases impact -- in physics.

Pringle, J. (2004)
Do Open Access Journals have Impact?
Nature, Web Focus: access to the literature, May 7, 2004

Testa, J. and McVeigh, M. E. (2004)
The Impact of Open Access Journals: A Citation Study from Thomson ISI (pdf 17pp)
Author eprint, 14 April 2004

Kurtz, M. J. (2004)
Restrictive access policies cut readership of electronic research journal articles by a factor of two (pdf 2pp)
Harvard-Smithsonian Center for Astrophysics, Cambridge, MA
Poster presentation at National Policies on Open Access (OA) Provision for University Research Output: an International meeting, Southampton, 19 February 2004

Brody, T., Stamerjohanns, H., Harnad, S., Gingras, Y. and Oppenheim, C. (2004)
The Effect of Open Access on Citation Impact (pdf 1pp)
Poster presentation at National Policies on Open Access (OA) Provision for University Research Output: an International meeting, Southampton, 19 February 2004

Updated 5 January 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M. and Murray, S. S. (2004a)
Worldwide Use and Impact of the NASA Astrophysics Data System Digital Library
Author eprint, January 28, 2004, in Journal of the American Society for Information Science and Technology, Vol. 56, No. 1, 36-45, published online 20 September 2004

Hitchcock, S., Brody, T., Gutteridge, C., Carr, L. and Harnad, S. (2003b)
The Impact of OAI-based Search on Access to Research Journal Papers
Author eprint, 15 September 2003, in Serials, Vol. 16, No. 3, November 2003, 255-260

Hitchcock, S., Woukeu, A., Brody, T., Carr, L., Hall, W. and Harnad, S. (2003a)
Evaluating Citebase, an open access Web-based citation-ranked search and impact discovery service
Technical Report ECSTR-IAM03-005, School of Electronics and Computer Science, University of Southampton, July 2003

Added 28 October 2004 Bollen, J., Vemulapalli, S. S., Xu, W. and Luce, R. (2003)
Usage Analysis for the Identification of Research Trends in Digital Libraries
D-Lib Magazine, Vol. 9, No. 5, May 2003

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M., Murray, S. S., Martimbeau, N. and Elwell, B. (2003b)
The NASA Astrophysics Data System: Sociology, Bibliometrics, and Impact
Author eprint, March 2003, Journal of the American Society for Information Science and Technology, submitted for publication

Updated 23 February 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S., Martimbeau, N. and Elwell, B. (2003a)
The Bibliometric Properties of Article Readership Information
Author eprint, March 2003, in Journal of the American Society for Information Science and Technology, 56 (2): 111-128, January 15, 2005

Added 17 December 2007 Drenth, J. P. H. (2003)
More reprint requests, more citations? (subscriber access to full text)
Scientometrics, Vol. 56, No. 2, February 2003, 283-286, revised version published online August 2006
From the abstract: "This study aims to correlate the number of reprint requests from a 10-year-sample of articles with the number of citations. ... Articles that received most reprint requests are cited more often."

Darmoni, S. J., et al. (2002)
Reading factor: a new bibliometric criterion for managing digital libraries
Journal of the Medical Library Association, Vol. 90, No. 3, July 2002

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D. M., Bohlen, E. H. and Murray, S. S. (2002)
The NASA Astrophysics Data System: Obsolescence of Reads and Cites (pdf 8pp)
Library and Information Services in Astronomy IV, edited by B. Corbin, E. Bryson, and M. Wolf, July 2002

Bollen, J. and Luce, R. (2002)
Evaluation of Digital Library Impact and User Communities by Analysis of Usage Patterns
D-Lib Magazine, Vol. 8, No. 6, June 2002

Lawrence, S. (2001)
Free online availability substantially increases a paper's impact
Nature, 31 May 2001
see also Online or invisible, an extended version of the Nature article self-archived by the author
This paper reported the first major findings on the impact effect documented in this bibliography, and it remains the most cited paper in the bibliography. Note that its results concern online access rather than open access: at that time the focus was on the transition from print to electronic publication, and Lawrence quantified the improved access that resulted, based on a sample of c.120k computer science papers from 1,494 'venues'.

Added 04 October 2005 Anderson, K., Sack, J., Krauss, L. and O'Keefe, L. (2001)
Publishing Online-Only Peer-Reviewed Biomedical Literature: Three Years of Citation, Author Perception, and Usage Experience
Journal of Electronic Publishing, Vol. 6, No. 3, March 2001
One of the first studies of the citation effect of online against offline publication, rather than of open access against non-OA. Provides data for one journal and a small number of articles over a three-year period. This paper was added to the bibliography following this correspondence:

Banks, P. "does not find the same citation advantage for online publications claimed by Harnad and his colleagues." Harnad, S. "Please weigh findings against the preponderance of evidence."
American Scientist Open Access Forum, Open access to research worth £1.5bn a year, 28 September 2005

Updated 04 October 2005 Odlyzko, A. M. (2000)
The rapid evolution of scholarly communication
PEAK 2000: Economics and Usage of Digital Library Collections conference, Ann Arbor, MI, March 2000.
Also in Learned Publishing, 15(1), 7-19, January 2002. Author eprint http://www.dtc.umn.edu/~odlyzko/doc/rapid.evolution.pdf
Notes the growing usage of information in electronic form (cf. print forms) and of journal papers from non-journal sites (e.g. eprints), and presents evidence that usage increases when access is more convenient

Youngen, G. K. (1998)
Citation Patterns to Electronic Preprints in the Astronomy and Astrophysics Literature
Library and Information Services in Astronomy III, ASP Conference Series, Vol. 153, 1998
see also
Citation Patterns to Traditional and Electronic Preprints in the Published Literature
College & Research Libraries, September 1998

Youngen, G. (1998)
Citation Patterns Of The Physics Preprint Literature With Special Emphasis On The Preprints Available Electronically
Author eprint, UIUC Physics and Astronomy library, c. 5 November 1998, presented at ACRL/STS on 29 June 1997

Web tools for measuring impact

Citebase Search "Search and citation analysis tool for the free, online research literature" http://citebase.eprints.org/ User service, free
see
Added 28 July 2008 Jacsó, P. (2004) CiteBase Search, Online, Sep/Oct 2004, 57-58
Brody, T. (2003) Citebase Search: Autonomous Citation Database for e-print Archives, sinn03 conference on Worldwide Coherent Workforce, Satisfied Users - New Services For Scientific Information, Oldenburg, Germany, September 2003
Hitchcock, S., et al. (2003a) Evaluating Citebase, an open access Web-based citation-ranked search and impact discovery service
Correlation Generator http://citebase.eprints.org/analysis/correlation.php
Generates a graph (or table) of the correlation between citation impact and usage impact from the Citebase database
see Brody, T. and Harnad, S. 2005 (in prep.)
Citeseer "Scientific literature digital library" http://citeseer.ist.psu.edu/ User service, free
Updated 26 May 2008 Now available as CiteSeerx, or "Next Generation CiteSeer" http://citeseerx.ist.psu.edu
see
Jacsó, P., (2005) CiteSeer, Thomson Gale, November 2005
Lawrence, S., Giles, C. L., Bollacker, K. (1999), Digital Libraries and Autonomous Citation Indexing, IEEE Computer, Vol. 32, No. 6, 67-71, 1999

Elsevier Scopus Bibliographic database covering 13,450 peer-reviewed titles http://www.scopus.com/ User service
see
Added 28 July 2008 Jacsó, P. (2007) Scopus (2008 Winter Release), Gale, Reference Reviews, Péter's Digital Reference Shelf, November 2007
Added 15 March 2006 Burnham, J. F. (2006) Scopus database: a review, Biomedical Digital Libraries, 3:1, 8 March 2006
Added 15 March 2006 Dess, H. M. (2006) Scopus, Issues in Science and Technology Librarianship, Winter 2006
Added 15 May 2006 Quint, B. (2006) Elsevier’s Scopus Introduces Citation Tracker: Challenge to Thomson ISI’s Web of Science?, Newsbreaks, January 23, 2006
Added 13 September 2007 Goodman, D. and Deis, L. (2006) Update on Scopus, The Charleston Advisor, Vol. 7, No. 3, January 2006, 42-43
Jacsó, P. (2004) Scopus, Thomson Gale, September 2004
see also Comparative reviews

Google Scholar Find articles from academic publishers, preprint repositories and universities, as well as scholarly articles across the web (presents citations as separate results) http://scholar.google.com/ User service, free
see

Added 28 July 2008
Harzing, A.-W. and van der Wal, R. (2008) Google Scholar as a new source for citation analysis, Ethics in Science and Environmental Politics, Vol. 8, No. 1, June 03, 2008, 61-73
Added 28 July 2008 Jacsó, P. (2008) The pros and cons of computing the h-index using Google Scholar. Online Information Review, 32(3) 2008, 437-452
Added 26 May 2008 Meier, J. J. and Conkling, T. W. (2008) Google Scholar's Coverage of the Engineering Literature: An Empirical Study, Journal of Academic Librarianship, Vol. 34, No. 3, May 2008, 196-201 (full text requires subscription; abstract only)
Added 28 July 2008 Jacsó, P. (2008) Google Scholar, Online, Mar/Apr 2008, 53-54
Added 13 September 2007 Harzing, A.-W. (2007) Reflections on Google Scholar, Harzing.com, fifth version, 6 September 2007
about the citation analysis software Publish or Perish and its relation with Google Scholar
Added 13 September 2007 Quint, B. (2007) Changes at Google Scholar: A Conversation With Anurag Acharya, NewsBreaks, August 27, 2007
rare public interview with the low-profile 'designer and missionary' behind Google Scholar
Added 22 August 2007 Mayr, P. and Walter, A.-K. (2007) An exploratory study of Google Scholar, arXiv.org > cs > arXiv:0707.3575v1 [cs.DL], July 24, 2007, in Online Information Review, Vol. 31, No. 6 (2007), 814-830. Author preprint also available from http://www.ib.hu-berlin.de/~mayr/arbeiten/OIR-Mayr-Walter-2007.pdf
Added 10 May 2007 Robinson, M. L. and Wusteman, J. (2007) Putting Google Scholar to the test: a preliminary study, author eprint, also in Program: Electronic Library and Information Systems, Vol. 41, Issue 1, February 2007, 71-80
Added 15 May 2006 Sadeh, T. (2006) Google Scholar Versus Metasearch Systems, HEP Libraries Webzine, issue 12, March 2006
"thoughtful and informative ... altogether the best overview of Google Scholar, other large federated search systems such as Scirus, and library-based metasearch tools I've seen." Reviewed by Tennant, R., Current Cites, January 2006 issue
Added 15 March 2006 Burright, M. (2006) Google Scholar -- Science & Technology, Issues in Science and Technology Librarianship, Winter 2006
Added 28 February 2006 Noruzi, A. (2005) Google Scholar: the new generation of citation indexes (pdf 11pp), E-LIS, 11 February 2006, in LIBRI 55(4): 170-180
Jacsó, P., (2005) Google Scholar and The Scientist, commenting on his interview in Perkel, J., The Future of Citation Analysis (abstract only), The Scientist, October 24, 2005
Jacsó, P. (2005) Google Scholar (Redux), Thomson Gale, June 2005
Myhill, M. (2005) Google Scholar, Charleston Advisor, Vol. 6, No. 4, April 2005
Added 22 August 2007 Giustini, D. and Barsky, E. (2005) A look at Google Scholar, PubMed, and Scirus: comparisons and recommendations, Journal of the Canadian Health Libraries Association/Journal de l'Association des bibliothèques de la santé du Canada (JCHLA / JABSC) 26: 85-89 (2005) (pdf 5pp)
Jacsó, P. (2004) Google Scholar Beta, Thomson Gale, December 2004
see also Comparative reviews

ISI Web of Science Cited reference searching of 8,700 high impact research journals http://www.isinet.com/products/citation/wos/ User service
see
Added 28 July 2008 Jacsó, P. (2007) Web of Science, Gale, Reference Reviews, Péter's Digital Reference Shelf, January 2007
Jacsó, P. (2004) Web of Science Citation Indexes, Thomson Gale, August 2004
see also Comparative reviews

Added 11 December 2006 Rexa.Info Covers the computer science research literature. Rexa is "a sibling to CiteSeer, Google Scholar, Academic.live.com, the ACM Portal. Its chief enhancement is that Rexa knows about more first-class, de-duplicated, cross-referenced object types: not only papers and their citation links, but also people, grants, topics" http://rexa.info/ User service, free (login required)

Added 13 April 2006 Windows Live Search Academic Beta version. Indexes content related to computer science, physics, electrical engineering, and related subject areas, with more than 6 million records from approximately 4300 journals, 2000 conferences and ArXiv.org. In collaboration with Citeseer http://academic.live.com/ User service, free
see
Added 26 May 2008 Nadella, S. (2008) Book search winding down, Live Search, The official blog of the Live Search team at Microsoft, May 23, 2008
"Live Search Books and Live Search Academic projects ... will be taken down next week. Books and scholarly publications will continue to be integrated into our Search results, but not through separate indexes."
Added 28 July 2008 Jacsó, P. (2008) Live Search Academic, Gale, Reference Reviews, Péter's Digital Reference Shelf, April 2008
Added 28 July 2008 Jacsó, P. (2006) Windows Live Academic, Online, Sep/Oct 2006, 59-60
Added 15 May 2006 Quint, B. (2006) Windows Live Academic Search: The Details, Newsbreaks, April 17, 2006
Added 13 April 2006 Sherman, C. (2006) Microsoft Launches Windows Live Academic Search, SearchEngineWatch.com, April 12, 2006

Added 15 May 2006 Citations in Economics Not intended for direct user access; instead made available to RePEc services such as Socionet, EconPapers and IDEAS. Uses Citeseer software http://citec.repec.org/ Data service, free
Ranks working paper series and journals in Economics http://citec.repec.org/s/
see
Barrueco Cruz, J. M. and Krichel, T. (2004) Building an autonomous citation index for grey literature: the economics working papers case (pdf 12pp), E-LIS, 01 February 2005, also in Proceedings GL6: Sixth International Conference on Grey Literature, New York, December 2004

CrossRef Forward linking service allows CrossRef member publishers to display cited-by links in their primary content. Data service
CrossRef and Atypon announce forward linking service (press release) June 8, 2004
Institute of Physics becomes first journals publisher to implement 'cited-by' links using CrossRef's Forward Linking service: Time travel with IOP journals (IOP press release) 14 March, 2005

Forthcoming ISI Web Citation Index User service
see
Added 15 May 2006 Martello, A. (2006) Selection of Content for the Web Citation Index: Institutional Repositories and Subject-Specific Archives, Thomson.com, undated
Pringle, J. (2005) Partnering helps institutional repositories thrive, KnowledgeLink Newsletter, February 2005
Citeseer's replacement? List server mailing, 18 March 2004
Quint, B. (2004) Thomson ISI to Track Web-Based Scholarship with NEC’s CiteSeer, Information Today Newsbreaks, March 1, 2004

Comparative reviews

Added 13 January 2009 Norris, M., Oppenheim, C. and Rowland, F. (2008)
Finding open access articles using Google, Google Scholar, OAIster and OpenDOAR
Online Information Review, Vol. 32, No. 6, 2008, 709-715
also available from Loughborough University Institutional Repository, 2009-01-12 http://hdl.handle.net/2134/4084
From the abstract: "Google, Google Scholar, OAIster and OpenDOAR were used to try to locate OA versions of peer reviewed journal articles drawn from three subjects (ecology, economics, and sociology). The paper shows the relative effectiveness of the search tools in these three subjects. The results indicate that those wanting to find OA articles in these subjects, for the moment at least, should use the general search engines Google and Google Scholar first rather than OpenDOAR or OAIster."

Added 28 July 2008 Jacsó, P. (2008)
The Plausibility of Computing the H-index of Scholarly Productivity and Impact Using Reference Enhanced Databases
Online Information Review, 32(2) 2008, 266-283
"aims to provide a general overview of the three largest, cited-reference-enhanced, multidisciplinary databases (Google Scholar, Scopus, and Web of Science) for determining the h-index. The practical aspects of determining the h-index also need scrutiny, because some content and software characteristics of reference-enhanced databases can strongly influence the h-index values."

Added 26 May 2008 Meho, L. I. and Rogers, Y. (2008) Citation Counting, Citation Ranking, and h-Index of Human-Computer Interaction Researchers: A Comparison between Scopus and Web of Science, E-LIS, 10 March 2008, in Journal of the American Society for Information Science and Technology, 59 (11): 1711-1726, September 2008

Added 26 May 2008 Kloda, L. A. (2007) Use Google Scholar, Scopus and Web of Science for Comprehensive Citation Tracking, Evidence Based Library and Information Practice, 2(3): 87-90, 2007, also in E-LIS, 21 September 2007 http://eprints.rclis.org/archive/00011437/

Added 26 May 2008 Schroeder, R. (2007) Pointing Users Toward Citation Searching: Using Google Scholar and Web of Science, portal: Libraries and the Academy, Vol. 7, No. 2, April 2007, 243-248 (full text requires subscription)

Added 13 September 2007 Goodman, D. and Deis, L. (2007) Update on Scopus and Web of Science, The Charleston Advisor, Vol. 8, No. 3, January 2007, 15-18

Added 10 May 2007 Meho, L. I. and Yang, K. (2006) A New Era in Citation and Bibliometric Analyses: Web of Science, Scopus, and Google Scholar, arXiv.org, Computer Science, cs/0612132, 23 Dec 2006, published as Impact of data sources on citation counts and rankings of LIS faculty: Web of science versus scopus and google scholar, in Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2105-2125

Added 11 December 2006 Fingerman, S. (2006) Web of Science and Scopus: Current Features and Capabilities, Issues in Science and Technology Librarianship, Fall, 2006

Added 17 January 2007 Neuhaus, C. and Daniel, H.-D. (2006) Data sources for performing citation analysis: An overview, ETH E-Collection, June 30, 2006, Journal of Documentation, accepted for publication
Reports the limitations of Thomson Scientific’s citation indexes and reviews the characteristics of the citation-enhanced databases Chemical Abstracts, Google Scholar and Scopus.

Comment on this paper:
Pikas, C. K. "I think this article will be very helpful, but the extension that seems necessary right now is to CrossRef data. Many publishers such as the Optical Society of America via Optics Infobase provide forward and backward citations using CrossRef. When trying to *approach* comprehensiveness, I felt I had to look there as well as Scopus, Web of Science, Google Scholar, and CA. In fact, I found many citations there that were unique -- but this is not scientific, merely anecdotal." Sigmetrics listserv, 14 Dec 2006

Bakkalbasi, N., Bauer, K., Glover, J. and Wang, L. (2006) Three options for citation tracking: Google Scholar, Scopus and Web of Science, Biomedical Digital Libraries, June 29, 2006

Added 8 March 2007 Bosman, J., van Mourik, I., Rasch, M., Sieverts, E. and Verhoeff, H. (2006) Scopus reviewed and compared, Igitur repository, Utrecht University, June 2006
The coverage and functionality of the citation database Scopus, including comparisons with Web of Science and Google Scholar

Wenzel, E. (2006) Google Scholar beta, ZDNet, May 2, 2006
Brief comparison of Google Scholar and Microsoft Live Academic Search

Bailey, C. W. Jr (2006) A Simple Search Hit Comparison for Google Scholar, OAIster, and Windows Live Academic Search, Digital Koans, author blog, April 13, 2006
A simple but revealing experiment: "It should be clear that a sample of one search term is a very crude measure".

Pauly, D. and Stergiou, K. I. (2005) Equivalence of results from two citation analyses: Thomson ISI’s Citation Index and Google’s Scholar service (pdf 3pp), Ethics in Science and Environmental Politics, 22 December 2005, 33-35

Comment on this paper:
""outperform" means many things. OK, in this study the citation counts were close but the searchability of material on WOS is much stronger." ResourceShelf, February 25, 2006

Added 23 November 2006 Jacso, P. (2005) Comparison and analysis of the citedness scores in Web of Science and Google Scholar (pdf 10pp), Digital Libraries: Implementing Strategies and Sharing Experiences, Lecture Notes In Computer Science, 3815: 360-369, 2005, Proceedings of the 8th International Conference on Asian Digital Libraries, ICADL 2005, Bangkok, Thailand, December 12-15, 2005

Roth, D. L. (2005) The emergence of competitors to the Science Citation Index and the Web of Science (pdf 6pp), Current Science Online, Vol. 89, No. 9, 10 November 2005

Jacsó, P. (2005) As we may search – Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases (pdf 11pp), Current Science Online, Vol. 89, No. 9, 10 November 2005

Bauer, K. and Bakkalbasi, N. (2005) An Examination of Citation Counts in a New Scholarly Communication Environment, D-Lib Magazine, 11(9), September 2005.
Compares citation counts provided by Web of Science, Scopus, and Google Scholar.

Comment on this paper:
Stegmann, J. Clearer picture? "the authors of this interesting paper should, perhaps, take into account the ISI Proceedings database because the conference papers indexed therein are included together with their references. ... This would give a clearer picture of what one gets from free services like Google Scholar and from products which have to be licensed." Sigmetrics listserv, 16 September 2005
Author response: Focused and cross-disciplinary "We decided to concentrate on three multi-disciplinary databases only, as a way of focusing our work. We wanted to lay the foundation for a study that will compare subject areas: hence our decision to look at databases that cross many subject areas." Sigmetrics listserv, 16 September 2005

LaGuardia, C. (2005) Scopus vs. Web of Science, Library Journal, 130(1); 40, 42, January 15, 2005

Deis, L. F. and Goodman, D. (2005) Web of Science (2004 version) and Scopus, Charleston Advisor, Vol. 6, No. 3, January 2005

Background

The financial imperative: correlating research access, impact and assessment

There is another dimension to the open access advantage. If open access increases impact, then it will also increase research income and funding. It has been shown in the UK that there is a correlation between research assessment ratings and citation counts, and higher ratings mean more money for the higher-rated research groups. Of course, if all papers were made open access by their authors, the relative effect would disappear. First-mover advantage, anyone?

"Research impact translates into money: employment, salary, tenure money, as well as research-funding money: (1) RAE rank correlates with substantial top-sliced funding, (2) it also correlates highly (0.91) with citation counts, and (3) self-archiving increases citation counts by 50-250+%. Do you really think that any researcher who is *aware* of those three correlations is being rational if he doesn't self-archive?" Stevan Harnad

Added 6 September 2010 Li, J., Sanderson, M., Willett, P., Norris, M. and Oppenheim, C. (2010)
Ranking of Library and Information Science Researchers: Comparison of Data Sources for Correlating Citation Data and Expert Judgments
Journal of Informetrics, 16 Jun 2010
Open access provides scope for new citation-based metrics, but these would have to be tested and validated against current, preferred methods of assessment. This paper is not focussed on open access, but it shows how such testing and validation might be performed.
From the Abstract: This paper studies the correlations between peer review and citation indicators when evaluating research quality in library and information science (LIS). Forty two LIS experts provided judgments on a five-point scale of the quality of research published by 101 scholars; the median rankings resulting from these judgments were then correlated with h-, g- and H-index values computed using three different sources of citation data: Web of Science (WoS), Scopus and Google Scholar (GS).
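
The rank correlations such studies report (expert judgments against h-, g- or H-index values) can be sketched as follows. This is an illustrative Spearman rank-correlation implementation, assuming no tied ranks, with invented data rather than figures from the paper:

```python
def spearman(xs, ys):
    """Spearman rank correlation between two samples (assumes no ties)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical example: expert-judgment ranks vs. h-index values
# for five scholars
print(spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]))  # 0.8
```

A coefficient near 1 would indicate that the citation-based ranking broadly reproduces the expert ranking, which is the kind of validation the annotation above describes.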

Added 06 September 2010 Aguillo, I. F., Ortega, J. L., Fernández, M. and Utrilla, A. M. (2010)
Indicators for a webometric ranking of open access repositories
Scientometrics, Vol. 82, No. 3, March 2010, 477-486, published online: 6 February 2010

Added 09 Mar 2010 Harnad, S., Carr, L., Swan, A., Sale, A. and Bosc, H. (2009)
Maximizing and Measuring Research Impact Through University and Research-Funder Open-Access Self-Archiving Mandates
ECS EPrints, 08 Dec 2009, in Wissenschaftsmanagement, 15 (4), 36-41

Added 09 Mar 2010 Aguillo, I. (2009)
Measuring the institution's footprint in the web
Library Hi Tech, Vol. 27, No. 4, 2009, 540-556
DOI: 10.1108/073788309

Added 09 June 2010 Allen, L., Jones, C., Dolby, K., Lynn, D. and Walport, M. (2009)
Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs
PLoS ONE, 4(6): e5910, June 18, 2009 doi:10.1371/journal.pone.0005910

Added 09 Mar 2010 Corbyn, Z. (2009)
Hefce backs off citations in favour of peer review in REF
Times Higher Education, 18 June 2009

Comment on this paper:
Readers' comments attached to the report:
Hull, R. "I would now like to know exactly which stupid, thoughtless person first proposed the hair-brained idea to use citations??"
Harnad, S. http://cogprints.org/1683/
Hefce backs off citations in favour of peer review in REF, ASIS&T Sigmetrics listserv, Jun 24, 2009:
Bensman, S. J. "that is pretty much how Garfield recommended citations should be used and how they are used in US evaluations. You dont use citations by themselves but to balance your subjective judgments."
Harnad, S."Gene is of course right that citations alone are not and never were enough for research evaluation; they not only need to be "balanced" against subjective (peer expert) evaluations, but they need to be formally validated against them, discipline by discipline."

Added 26 February 2009, updated 29 April 2009 Houghton, J., Rasmussen, B., Sheehan, P., Oppenheim, C., Morris, A., Creaser, C., Greenwood, H., Summers, M. and Gourlay, A. (2009)
Economic implications of alternative scholarly publishing models: Exploring the costs and benefits
JISC, 27 January 2009

Comment on this paper:
Jacobs, N., Economic case for open access, JISC Information Environment Team blog, 27 Jan. 2009: "The findings suggest that there are both considerable cost savings to be made by the HE sector by moving to open access, and significant benefits to the UK economy to be gained by doing so. Both the potential cost savings and the benefits run into hundreds of millions of pounds. ... a complex piece of work, with potentially large implications"
Publishers' joint statement, 13 Feb. 2009: "claims that if adopted universally an exclusively open access business model would generate large savings in the system costs for scholarly communication in the UK in our view remain unproven. We expect that there will be a more detailed critique* in due course"
*STM, PA & ALPSP respond to Houghton JISC Report, 7 April 2009, JISC quote/comment response, undated but first blogged with selected extracts 25 April 2009

Added 10 November 2008 Oppenheim, C. (2008)
Out with the old and in with the new: The RAE, bibliometrics and the new REF (first page pdf; full text requires subscription)
Journal of Librarianship and Information Science, 40 (3): 147-149, September 2008

Added 24 November 2008 Cho, S.-R. (2008)
New evaluation indexes for articles and authors' academic achievements based on Open Access Resources (full text requires subscription; abstract only)
Scientometrics, Vol. 77, No. 1 (2008) 91-112, published online: 24 July 2008

Added 10 November 2008 Oppenheim, C. and Summers, M. A. C. (2008)
Citation counts and the Research Assessment Exercise, part VI: Unit of assessment 67 (music)
Information Research, 13 (2), paper 342, June 2008

Added 28 July 2008 Adler, R., Ewing, J. (Chair) and Taylor, P. (2008)
Citation Statistics (pdf 26pp)
Joint Committee on Quantitative Assessment of Research, International Mathematical Union, IMU-ICIAM-IMS, 11 June 2008

Comment on this paper:
Citation statistics, American Scientist Open Access Forum
Armbruster, C., 11 June 2008: "after reading the report, I would caution against dismissing it. Science and scientists should be concerned about the politicisation of metrics. Politicisation comes from governments and research funders but is also going on inside academic institutions. Moreover, in a general sense the citation and usage metrics currently available are not 'fit for purpose'. Worse still, politicisation carries with it the significant risk of arresting the development of tools for metric research evaluation. ... All we have at the moment are some 'quick fix metrics'. And these are increasingly used to make and legitimate all kinds of decisions. It is thus welcome that mathematicians and statisticians scrutinise current practices and show up the lack of validity and reliability of many measures, technical faults as well as the misguided judgements of peers, university management, funding agencies and government."
Oppenheim, C., 12 June 2008: "it fails to address the fundamental issue, which is: citation and other metrics correlate superbly with subjective peer review. Both methods have their faults, but they are clearly measuring the same (or closely related) things. Ergo, if you have evaluate research in some way, there is no reason NOT to use them!  It also keeps referring to examples from the field of maths, which is a very strange subject citation-wise."
Harnad, S., Citation Statistics: International Mathematical Union Report, Open Access Archivangelism blog, June 15, 2008: "what all this valuable, valid cautionary discussion overlooks is not only the possibility but the empirically demonstrated fact that there exist metrics that are highly correlated with human expert rankings. It follows that to the degree that such metrics account for the same variance, they can substitute for the human rankings. The substitution is desirable, because expert rankings are extremely costly in terms of expert time and resources. Moreover, a metric that can be shown to be highly correlated with an already validated predictor variable (such as expert rankings) thereby itself becomes a validated predictor variable. And this is why the answer to the basic question of whether the RAE's decision to convert to metrics was a sound one is: Yes."
Bensman, S. J., IMU Critique of Citation Analysis, SIGMETRICS, 27 June 2008: "I checked the distribution of mathematics journals by impact factor in the 2007 SCI JCR. It was as I suspected. The range of impact factors was only from 0.108 to 2.739--extraordinarily low and tight--and the top journals on the impact factor had no review articles. This is suggestive of an extremely random citation pattern with no development of consensual paradigms. Therefore, math acts like a humanities in terms of its literature use, and citation analysis is probably not applicable to this discipline. If citation analysis is used, it has to be backed by other measures."
Singleton, A., Book Review, Learned Publishing, Vol. 21, No. 4, October 2008, 329-331: "For anyone experienced in these matters, or even in scholarly publishing or library and information science (LIS) generally, it consists mainly of statements of the fairly obvious - and a lot of repetition. ... Overall this is a curate's egg of a report. I think we deserve better. Since it comes in the name of three prestigious institutions, one might have expected the review to be more thorough and, perhaps, the text to be better edited, but for policymakers the examples and simplicity of statements will, I think, make it a useful document."

Added 28 July 2008 Harnad, S. (2008)
Validating research performance metrics against peer rankings
Ethics in Science and Environmental Politics, Vol. 8, No. 1, June 03, 2008, 103-107

Added 28 July 2008 Taraborelli, D. (2008)
Soft peer review. Social software and distributed scientific evaluation (pdf 12pp)
In Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP 08), Carry-Le-Rouet, France, May 20-23, 2008
From the abstract: "I analyze the contribution that social bookmarking systems can provide to the problem of usage-based metrics for scientific evaluation. I suggest that collaboratively aggregated metadata may help fill the gap between traditional citation-based criteria and raw usage factors."

Added 13 January 2009 Pringle, J. (2008)
Trends in the use of ISI citation databases for evaluation
Learned Publishing, Vol. 21, No. 2, April 2008, 85-91
Abstract: "This paper explores the factors shaping the current uses of the ISI citation databases in evaluation both of journals and of individual scholars and their institutions. Given the intense focus on outcomes evaluation, in a context of increasing 'democratization' of metrics in today's digital world, it is easy to lose focus on the appropriate ways to use these resources, and misuse can result."

Added 11 February 2008 Armbruster, C. (2008)
Access, Usage and Citation Metrics: What Function for Digital Libraries and Repositories in Research Evaluation?
Social Science Research Network, February 05, 2008
From the abstract: "This systematic appraisal of the future role of digital libraries and repositories for metric research evaluation proceeds by investigating the practical inadequacies of current metric evaluation before defining the scope for libraries and repositories as new players. Services reviewed include: Leiden Ranking, Webometrics Ranking of World Universities, COUNTER, MESUR, Harzing POP, CiteSeer, Citebase, RePEc LogEc and CitEc, Scopus, Web of Science and Google Scholar."

Added 26 May 2008 Surya, M., D'Este, P. and Neely, A. (2008)
Citation Counts: Are They Good Predictors of RAE Scores? A bibliometric analysis of RAE 2001
Cranfield QUEprints, 31.01.2008

Comment on this paper:
RAE/REF research, JISC-REPOSITORIES, 15 February 2008
Oppenheim, C. "I've had a first look at it; it uses methods not previously employed in such studies and without a full explanation of how the research was conducted. That's not to say it is invalid, but it is a lost opportunity, having collected so much data, not to have followed standard methods or to explain things more fully."
Johnson, I. M. "shows an inadequate understanding of the contextual issues. Moreover, most of the discussion in section 5 appears to have little or no basis in the preceding data."
Harnad, S. "This pilot study has some methodological weaknesses. The remedy for the ostensible problems encountered in this study is for the panel rankings in the parallel metric/panel RAE 2008 to be analysed in a full-scale multiple regression study using as rich and diverse as possible a spectrum of predictive metrics (not just citation counts!), discipline by discipline."

Added 10 May 2007, Updated 17 July 2009 Harnad, S. (2007)
Open Access Scientometrics and the UK Research Assessment Exercise
ArXiv, Computer Science, cs.IR/0703131, 26 March 2007. Preprint of invited keynote address to 11th Annual Meeting of the International Society for Scientometrics and Informetrics, Madrid, 25-27 June 2007
also in ECS EPrints, 29 March 2007 http://eprints.ecs.soton.ac.uk/13804/
Latest version ECS EPrints, 27 Feb 2009 http://eprints.ecs.soton.ac.uk/17142/, in Scientometrics, 79 (1), 2009, 147-156, published online: 13 November 2008, http://dx.doi.org/10.1007/s11192-009-0409-z

Added 19 November 2006 Steele, C., Butler, L. and Kingsley, D. (2006)
The publishing imperative: the pervasive influence of publication metrics
ANU Institutional Repository, 30 October 2006, also in Learned Publishing, 19(4): 277-290, October 2006

Added 19 November 2006 Houghton J. and Sheehan, P. (2006)
The Economic Impact of Enhanced Access to Research Findings
Centre for Strategic Economic Studies. Victoria University. July 2006
See also
Houghton, J., Steele, C. and Sheehan, P. (2006)
Research Communication Costs In Australia: Emerging Opportunities And Benefits
Department of Education, Science and Training (DEST), Australia, September 2006

Comment on The Economic Impact of Enhanced Access to Research Findings:
Harnad, S. Maximising the Return on Research "These estimates agree substantially with prior estimates that have been made (e.g., for the UK, Canada and Australia)." American-Scientist-Open-Access-Forum, 9 August 2006

Added 13 September 2007 Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006)
The Open Research Web: A Preview of the Optimal and the Inevitable
ECS EPrints, 02 May 2006, in Open Access: Key Strategic, Technical and Economic Aspects, Jacobs, N., Ed., chapter 21 (Oxford: Chandos Publishing)

Added 26 September 2005 Harnad, S. (2005)
Maximising the Return on UK's Public Investment in Research
Author eprint, September 14, 2005
Attempts to monetise 'lost' impact: "The online-age practice of self-archiving has been shown to increase citation impact by a dramatic 50-250%, but so far only 15% of researchers are doing it spontaneously. Citation impact is rewarded by universities (through promotions and salary increases) and by research-funders like RCUK (through grant funding and renewal) at a conservative estimate of £46 per citation. ... As a proportion of the RCUK’s yearly £3.5bn research expenditure (yielding 130,000 articles x 5.6 = 761,600 citations), our conservative estimate would be 50% x 85% x £3.5bn = £1.5bn worth of loss in potential research impact (323,680 potential citations lost)."
See also
Australia is not maximising the return on its research investment (ETD2005, Sydney) for the same estimate applied to the potential lost return ($425M) there.
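The quoted estimate is simple arithmetic and can be checked directly. A minimal sketch that just reproduces the figures quoted above (50% OA citation advantage, 85% non-OA share, £3.5bn annual spend, 761,600 citations); the variable names are mine:

```python
oa_advantage = 0.50         # lower bound of the quoted 50-250% citation advantage
non_oa_share = 0.85         # fraction of output not yet self-archived
annual_spend = 3.5e9        # RCUK yearly research expenditure, GBP
annual_citations = 761_600  # citation yield quoted in the paper

# Lost impact = advantage that the non-OA share of output fails to collect
lost_value = oa_advantage * non_oa_share * annual_spend
lost_citations = oa_advantage * non_oa_share * annual_citations

print(f"£{lost_value / 1e9:.2f}bn potential impact lost")   # ≈ £1.49bn
print(f"{lost_citations:,.0f} potential citations lost")    # 323,680
```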

Comment on this paper:
Rowland, I. "If I am the one millionth author (or the 10,000th research group or the 100th nation) to publish open access, that comparative advantage must quickly decline, approaching zero as the last few laggards pile in". Author response: "Ian Rowland is exactly right that the OA impact advantage (currently 50-250%) will shrink as we approach 100% OA. Right now we are at 15% OA, and the advantage is in part -- no one can say how large a part -- a *competitive* advantage of the minority 15% OA -- the head-start vanguard -- over the laggard 85% non-OA majority. ... The OA impact advantage consists of at least the following 6 factors ..."
American Scientist Open Access Forum, OA advantage = EA + (AA) + (QB) + QA + (CA) + UA, 15 September 2005

Waters, D. "Mr. Harnad may well be on to an important subject and line of argument in suggesting that citations are a kind of return on investment. However, close inspection of the concepts and logic of his argument suggests that he is quite a bit further from proving his case than he seems to have convinced himself that he is." Author response: "No proof here: Just conservative estimates."
American Scientist Open Access Forum, Open access to research worth 1.5bn a year, 27 September 2005

Day, M. (2004)
Institutional repositories and research assessment (pdf 29pp)
Author eprint (v. 0.1), 2 December 2004

Harnad, S. (2003)
Maximizing university research impact through self-archiving
Jekyll.com, No. 7, December 2003

Harnad, S. (2003)
Enhance UK research impact and assessment by making the RAE webmetric
Author eprint, in Times Higher Education Supplement, 6 June 2003, p. 16

Harnad, S., Carr, L., Brody, T. and Oppenheim, C. (2003)
Mandated online RAE CVs linked to university eprint archives: Enhancing UK research impact and assessment
Ariadne, issue 35, April 2003

Smith, A. and Eysenck, M. (2002)
The correlation between RAE ratings and citation counts in psychology (pdf 12pp)
Technical Report, Psychology, Royal Holloway College, University of London, June 2002

Holmes, A. and Oppenheim, C. (2001)
Use of citation analysis to predict the outcome of the 2001 Research Assessment Exercise for Unit of Assessment (UoA) 61: Library and Information Management
Information Research, Vol. 6, No. 2, January 2001

Harnad, S. (2001)
Research Access, Impact and Assessment (longer version)
Author eprint, in Times Higher Education Supplement, No. 1487, p. 16, 2001

Garfield, E. (1988)
Can Researchers Bank on Citation Analysis? (pdf 10pp)
Current Comments, No. 44, October 31, 1988
Garfield comments on studies that attempt to quantify the reward system of science in terms of monetary returns to author salaries from article publication and citations, reprinting one of those studies (attached, pp 3-10):
Diamond, Jr., A. M. (1986)
What is a Citation Worth?
Journal of Human Resources, 21: 200-215, 1986

Citation analysis, indexes and impact factors

Notes. Important work in this area builds on Eugene Garfield's pioneering work in the 1950s. Although it has produced some remarkably successful tools for measuring the impact of the scholarly literature, the area is not without controversy. This short list presents a cross-section underlining these issues, with a view to understanding how such long-established approaches might adapt to online data, and how possible shortcomings might be overcome.

Added 6 September 2010 Priem, J. and Hemminger, B. (2010)
Scientometrics 2.0: Toward new metrics of scholarly impact on the social Web
First Monday, 15 (7), Jul 2010

Added 6 September 2010 Herb, U., Kranz, E., Leidinger, T. and Mittelsdorf, B. (2010)
How to assess the impact of an electronic document? And what does impact mean anyway?: Reliable usage statistics in heterogeneous repository communities
E-LIS, 10 Jun 2010. In OCLC Systems & Services, Vol. 26, No. 2, 2010, 133-145
From the Abstract: Purpose - Usually the impact of research and researchers is quantified by using citation data: either by journal-centered citation data as in the case of the journal impact factor (JIF) or by author-centered citation data as in the case of the Hirsch- or h-index. This paper aims to discuss a range of impact measures, especially usage-based metrics, and to report the results of two surveys. Originality/value - This paper delineates current discussions about citation-based and usage-based metrics. Based on the results of the surveys, it depicts which functionalities could enhance repositories, what features are required by scientists and information professionals, and whether usage-based services are considered valuable. These results also outline some elements of future repository research.
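The author-centred h-index mentioned here has a one-line definition: an author has index h if h of their papers have at least h citations each. A minimal sketch (the citation counts are invented):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank       # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical author with six papers:
print(h_index([25, 8, 5, 3, 3, 1]))  # → 3
```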

Added 6 September 2010 Repanovici, A. (2010)
Measuring the visibility of the University's scientific production using Google Scholar, "Publish or Perish" software and Scientometrics
76th IFLA General Conference and Assembly, Gothenburg, Sweden, 07 Jun 2010
From the Abstract: The first Romanian institutional repository was implemented at Transilvania University of Brasov. As part of the undertaken research, the visibility and the impact of the university's scientific production was measured using the scientific methods of scientometry, as a fundamental instrument for determining the international value of an university as well as for the statistical evaluation of scientific research results. The results showed that an open access institutional repository would significantly add to the visibility of the university's scientific production. In this article we define the scientific production and productivity and present the main indicators for the measurement of the scientific activity. Google Scholar was used as a scientometric database which can be consulted free of charge on the Internet and which indexes academic papers from institutional repositories, identifying also the referenced citations. The free Publish or Perish software can be used as an analysis instrument for the impact of the research, by analysing the citations through the h-index. We present the methodology and the results of an exploratory study made at the Transilvania University of Brasov regarding the h-index of the academic staff.

Added 9 June 2010 Neff, B. and Olden, J. (2010)
Not So Fast: Inflation in Impact Factors Contributes to Apparent Improvements in Journal Quality
BioScience, 60 (6), 455-9, June 2010
From the Abstract: Here we propose that impact factors may be subject to inflation analogous to changes in monetary prices in economics. The possibility of inflation came to light as a result of the observation that papers published today tend to cite more papers than those published a decade ago. We analyzed citation data from 75,312 papers from 70 ecological journals published during 1998-2007. We found that papers published in 2007 cited an average of seven more papers than those published a decade earlier. This increase accounts for about 80% of the observed impact factor inflation rate of 0.23. In examining the 70 journals we found that nearly 50% showed increases in their impact factors, but at rates lower than the background inflation rate. Therefore, although those journals appear to be increasing in quality as measured by the impact factor, they are actually failing to keep pace with inflation.

Comment on this paper:
Davis, P., Impact Factor Inflation: When an Increase is Actually a Decrease, Scholarly Kitchen, Jul 12, 2010: Assesses the contribution of this paper to measuring impact factor inflation.
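Neff and Olden's monetary analogy invites a "real terms" correction to raw impact factor changes. A minimal sketch, assuming a simple additive deflation against the reported background rate of 0.23 per year; the function and the hypothetical journal's figures are my illustration, not the paper's method:

```python
def real_if_change(if_old, if_new, years, inflation_per_year):
    """Change in a journal's impact factor after removing background
    inflation (additive, per the ~0.23/yr rate reported for ecology)."""
    expected = if_old + inflation_per_year * years
    return if_new - expected

# A hypothetical journal whose IF rose from 2.0 to 3.5 over a decade:
change = real_if_change(2.0, 3.5, 10, 0.23)
print(round(change, 2))  # -0.8: the rise failed to keep pace with inflation
```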

Added 6 September 2010 Repanovici, A. (2010)
Measuring the visibility of the universities scientific production using scientometric methods
6th WSEAS/IASME International Conference on Educational Technologies (EDUTE '10), 03 May 2010
Abstract: Paper presents scientometry as a science and a fundamental instrument for determining the international value of an university as well as for the statistical evaluation of scientific research results. The impact of the research measurable through scientometric indicators is analyzed. Promoting the scientific production of universities through institutional digital repositories deals with the concept of scientific production of the university and the development of scientific research in information society. These concepts are approached through the prism of marketing methods and techniques. The digital repository is analyzed as a PRODUCT, destined for promoting, archiving and preserving scientific production.
The record and abstract page for the paper does not currently link to the full text.

Added 6 September 2010 Wardle, D. (2010)
Do Faculty of 1000 (F1000) ratings of ecological publications serve as reasonable predictors of their future impact?
Ideas in Ecology and Evolution, 3, 2010, 11-15
Commentary article.
From the Abstract: There is an increasing demand for an effective means of post-publication evaluation of ecological work that avoids pitfalls associated with using the impact factor of the journal in which the work was published. One approach that has been gaining momentum is the Faculty of 1000 (hereafter F1000) evaluation procedure, in which panel members identify what they believe to be the most important recent publications they have read. Here I focused on 1530 publications from 7 major ecological journals that appeared in 2005, and compared the F1000 rating of each publication with the frequency with which it was subsequently cited. ... Possible reasons for the F1000 process failing to identify high impact publications may include uneven coverage by F1000 of different ecological topics, cronyism, and geographical bias favoring North American publications. As long as the F1000 process cannot identify those publications that subsequently have the greatest impact, it cannot be reliably used as a means of post-publication evaluation of the ecological literature.

Comment on this paper:
Davis, P., Post-Publication Review: Does It Add Anything New and Useful? Scholarly Kitchen, Jul 14, 2010: Questions the value of post-publication review with reference to Wardle's paper and evidence from two other recent articles.

Added 9 June 2010 Bornmann, L. and Daniel, H.-D. (2010)
The citation speed index: A useful bibliometric indicator to add to the h index
Authors' server, undated but notice posted to Sigmetrics listserv 26 March 2010, in Journal of Informetrics, accepted for publication
This topic would appear to be a natural complement to OA citation effects, but the paper does not mention any.

Added 9 June 2010 Horwood, L. and Robertson, S. (2010)
Role of bibliometrics in scholarly communication
VALA2010 15th Biennial Conference and Exhibition, Melbourne, 9 Feb 2010

Added 9 Mar 2010 Bar-Ilan, J. (2009)
A Closer Look at the Sources of Informetric Research
CYBERmetrics, 13 (1), 23 Dec 2009
From the introduction: The Web has existed for twenty years only, yet the large majority of the data sources for informetric research are available through the Web. ISI's Web of Science was launched in 1997 ... In November 2004 two additional major citation databases appeared on the Web: Elsevier's Scopus and Google Scholar ... and there are easily accessible and often open-source software tools that enable to collect and analyze large quantities of data even on a personal computer. It has become easy to conduct "desktop or poor-man's bibliometrics". The data for informetric research have never been perfect, but now that informetric analysis can be conducted with much greater ease than before, it is even more important to understand the limitations and problems of data sources and methods and to assess the validity of the results. In the following sections I discuss some limitations of the existing sources.

Added 9 June 2010 Gonzalez-Pereira, B., Guerrero-Bote, V., Moya-Anegon, F. (2009)
The SJR indicator: A new indicator of journals' scientific prestige
ArXiv, arXiv:0912.4141v1 [cs.DL], 21 Dec 2009

Added 9 Mar 2010 Ball, K. (2009)
The Indexing of Scholarly Open Access Business Journals
Electronic Journal of Academic and Special Librarianship, 10 (3), Winter 2009
"this study focusses on the business and management field and assess the extent to which scholarly open access journals in this discipline are currently being indexed by both commercial and non-commercial indexing services. Of the commercial indexing services, Ebsco's Business Source Complete covers by far the largest number of open access journals. For business researchers working in an academic environment, Business Source Complete, with its more sophisticated searching and browsing capabilities and deeper historical coverage, is probably the best one-stop option for retrieving scholarly materials from both the subscription-based and OA literature. However, from a simple quantity perspective, OA business journals are being most extensively indexed by OA indexing services, in particular, Google Scholar and Open J-Gate."

Added 9 Mar 2010 Neylon, C. and Wu, S. (2009)
Article-Level Metrics and the Evolution of Scientific Impact
PLoS Biology, 7 (11), 17 Nov 2009

Added 9 Mar 2010 Moed, H. (2009)
Measuring contextual citation impact of scientific journals
ArXiv, arXiv:0911.2632v1 [cs.DL], 13 Nov 2009. Also in Journal of Informetrics (to appear)
About journal impact, and not directly about open access. From the abstract: "This paper explores a new indicator of journal citation impact, denoted as source normalized impact per paper (SNIP). It measures a journal's contextual citation impact, taking into account characteristics of its properly defined subject field, especially the frequency at which authors cite other papers in their reference lists, the rapidity of maturing of citation impact, and the extent to which a database used for the assessment covers the field's literature. It aims to allow direct comparison of sources in different subject fields."
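The core idea can be caricatured in a few lines: a citation from a field with long reference lists is worth less than one from a sparsely citing field. This toy sketch is my illustration of that normalization, not Moed's published SNIP formula:

```python
def snip_like(cites_per_paper, field_refs_per_paper, db_refs_per_paper):
    """Toy source-normalization: deflate a journal's citations per paper
    by how heavily its field cites relative to the whole database.
    (Illustrative only -- not the exact SNIP definition.)"""
    citation_potential = field_refs_per_paper / db_refs_per_paper
    return cites_per_paper / citation_potential

# Two hypothetical journals with identical raw impact per paper, 2.0:
print(snip_like(2.0, 50, 25))  # → 1.0  (high-citing field, deflated)
print(snip_like(2.0, 10, 25))  # → 5.0  (low-citing field, inflated)
```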

Added 15 Oct 2009 Armbruster, C. (2009)
Whose Metrics? On Building Citation, Usage and Access Metrics as Information Service for Scholars
SSRN Social Science Research Network, 31 Aug 2009
Services mentioned: Journal impact factor, journal usage factor, GoPubMed, SSRN CiteReader, RePEc LogEc, RePEc CitEc, SPIRES, Harzing POP, Webometrics, ISI Web of Knowledge, Scopus, Google Scholar, Citebase, CiteSeer X, CERIF

Added 15 Oct 2009 Stock, W. (2009)
The Inflation of Impact Factors of Scientific Journals
ChemPhysChem, 10 (13), 17 Aug 2009, 2193-6

Added 9 Mar 2010 Patterson, M. (2009)
PLoS Journals - measuring impact where it matters
PLoS blog, 2009-07-13
On why PLoS will no longer highlight the journal impact factor. Instead it will present a range of metrics focussed on the published paper, including individual citation counts from various sources, blog and bookmark counts, links and searches, as illustrated by Peter Binfield in the PLoS ONE community blog (March 31, 2009): "rather than updating the PLoS Journal sites with the new numbers, we've decided to stop promoting journal impact factors on our sites all together. It's time to move on, and focus efforts on more sophisticated, flexible and meaningful measures."

Added 9 Mar 2010 Canos Cerda, J. H., Campos, M. L. and Nieto, E. M. (2009)
What's Wrong with Citation Counts?
D-Lib Magazine, Vol. 15 No. 3/4, March/April 2009
From the abstract: "We argue that a new approach based on the collection of citation data at the time the papers are created can overcome current limitations, and we propose a new framework in which the research community is the owner of a Global Citation Registry characterized by high quality citation data handled automatically."

Added 26 February 2009 Cross, J. (2009)
Impact factors - the basics
The E-Resources Management Handbook (2006 - present), UKSG, this chapter published online: 03 February 2009

Added 10 November 2008 Leydesdorff, L. (2008)
How are new citation-based journal indicators adding to the bibliometric toolbox?
Author preprint, undated, (announced 31 Oct 2008), in Journal of the American Society for Information Science and Technology, Vol. 60, No. 7, 2009, 1327-1336, published online: 2 Feb 2009 http://dx.doi.org/10.1002/asi.21024
From the abstract: "The launching of Scopus and Google Scholar, and methodological developments in social-network analysis have made many more indicators for evaluating journals available than the traditional impact factor, cited half-life, and immediacy index of the ISI. In this study, these new indicators are compared with one another and with the older ones."

Added 9 June 2010 Final Impact: What Factors Really Matter? (VIDEO) (2008)
Scholarly Communication Program, Columbia University, October 30, 2008
Panelists: Marian Hollingsworth, Thomson Reuters; Jevin West, Eigenfactor.org; and Johan Bollen, Los Alamos National Laboratory

Added 10 November 2008 Radicchi, F., Fortunato, S. and Castellano, C. (2008)
Universality of citation distributions: towards an objective measure of scientific impact
arXiv.org, arXiv:0806.0974v2 [physics.soc-ph], 5 Jun 2008 (v1), last revised 27 Oct 2008
in Proceedings of the National Academy of Sciences of The United States of America, 105 (45): 17268-17272, Nov. 11 2008

Comment on this paper:
Davis, P., Universal Citations, the Scholarly Kitchen blog, Nov 3, 2008: "differences between disciplines can be quickly remediated by a simple, intuitive calculation: divide the number of citations to a paper by the average number of citations to all papers in its discipline for that year. The effect is stunning, and seems to hold irrespective of the publication year studied. Looking at the effect of this normalization is like looking at ducks lining up in a row."
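The rescaling Davis describes is easy to state in code: divide each paper's citation count by the mean count for its discipline and year. A minimal sketch on invented data (field names and counts are hypothetical):

```python
from collections import defaultdict

def normalize_citations(papers):
    """Rescale each paper's citations by the mean citation count of
    its (field, year) group, so counts are comparable across fields."""
    groups = defaultdict(list)
    for p in papers:
        groups[(p["field"], p["year"])].append(p["cites"])
    means = {k: sum(v) / len(v) for k, v in groups.items()}
    return [p["cites"] / means[(p["field"], p["year"])] for p in papers]

# Invented example: a maths paper with 10 cites in a low-citing field
# outranks a biology paper with 40 cites in a high-citing one.
papers = [
    {"field": "math", "year": 2005, "cites": 10},
    {"field": "math", "year": 2005, "cites": 2},
    {"field": "bio",  "year": 2005, "cites": 40},
    {"field": "bio",  "year": 2005, "cites": 60},
]
print(normalize_citations(papers))
```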

Added 10 November 2008 Banks, M. A. and Dellavalle, R. (2008)
Emerging Alternatives to the Impact Factor
E-LIS, 05 September 2008, also in OCLC Systems & Services, 24(3)

Added 10 November 2008 Brumback, R. A. (2008)
Worshiping false idols: the impact factor dilemma
J. Child Neurol., Vol. 23, No. 4, April 2008, 365-367
"the opacity in Thomson Scientific's refusal to reveal the details of their calculations only serves to increase suspicion about possible data manipulations. ... Now would seem to be the appropriate time for the academic community to demand valid metrics to assess published scientific material"

Comment on this paper:
Pringle, J., Correcting the Record, J. Child Neurol., Vol. 23, No. 9, September 2008, 1092: "reiterates statements made by Michael Rossner about a supposed discrepancy in our database relating to the impact factor calculations for the Journal of Experimental Medicine. The Rossner editorial, though republished several times, each time repeats the same discredited assertions. I refer your readers to the corrections published by Thomson Scientific"
Brumback, R. A., Response to Correspondence, J. Child Neurol., Vol. 23, No. 9, September 2008, 1092-1094: "it is disappointing that Pringle chose to use typical faulty reasoning by attacking my citing of the article by Rossner et al (which was just 1 of the total 32 references) rather than addressing the real issues raised in my editorial. ... Concerns about the journal impact factor values are not new and have been voiced for more than a decade (but mostly to deaf ears)8-38 ... Although it would be preferable to have journal indicators produced by an independent not-for-profit organization, the Thomson Scientific journal impact factor could be acceptable if Thomson Scientific were carefully to address each specific concern that has been raised and then provide a truly open transparent reporting system with resultant nonmanipulable values."

Added 26 May 2008 Althouse, B. M., West, J. D., Bergstrom, T. C. and Bergstrom, C. T. (2008)
Differences in Impact Factor Across Fields and Over Time
eScholarship Repository, California Digital library, Department of Economics, University of California Santa Barbara, Departmental Working Papers, paper 2008-4-23, April 23, 2008. In Journal of the American Society for Information Science and Technology, Vol. 60, No. 1, 27-34, published online: 21 Aug 2008

Comment on this paper:
Davis, P., Impact Factor Inflation: When an Increase is Actually a Decrease, Scholarly Kitchen, Jul 12, 2010: Assesses the contribution of this paper to measuring impact factor inflation.

Added 11 February 2008 Kosmopoulos, C. and Pumain, D. (2007)
Citation, Citation, Citation: Bibliometrics, the web and the Social Sciences and Humanities
Cybergeo, Science et Toile, article 411, published online 17 December 2007, modified 18 January 2008
From the abstract: "The paper reviews the main (bibliometric) data bases and indicators in use. It demonstrates that these instruments give a biased information about the scientific output of research in Social Sciences and Humanities."

Comment on this paper:
Krichel, T. (2007), bibliometrics and open access solutions, Budapest Open Access Initiative: BOAI Forum Archive, 25 December 2007: "I am somewhat saddened to read that this survey does not discuss CitEc, which I think is the largest open-access citation index in the social sciences."

Added 28 July 2008 Rossner, M., Van Epps, H. and Hill, E. (2007)
Show me the data (editorial)
The Journal of Cell Biology, Vol. 179, No. 6, 1091-1092, published online December 17, 2007
"Just as scientists would not accept the findings in a scientific paper without seeing the primary data, so should they not rely on Thomson Scientific's impact factor, which is based on hidden data. As more publication and citation data become available to the public through services like PubMed, PubMed Central, and Google Scholar®, we hope that people will begin to develop their own metrics for assessing scientific quality rather than rely on an ill-defined and manifestly unscientific number."

Comment on this paper:
Pendlebury, D. A., Thomson Scientific Corrects Inaccuracies In Editorial, Thomson Reuters Citation Impact Forum, undated: "Rossner, Van Epps, and Hill (argue) that Thomson Scientific's impact factor measure for the evaluation of journals should not be trusted since an article data set purchased from Thomson by The Rockefeller University Press did not exactly replicate the Journal Citation Reports data for its own -- and selected other -- journals. When these data were questioned by The Rockefeller University Press, Thomson staff explained precisely the content of the data, as well as its derivation and use. Unfortunately for the readers of the Rossner editorial, the authors misunderstood much and as a result, misled readers about several matters, not only regarding the data but what Thomson representatives did and said from June to September 2007 in many email exchanges."

Added 22 August 2007 Citrome, L. (2007)
Impact Factor? Shmimpact Factor! The Journal Impact Factor, Modern Day Literature Searching, and the Publication Process
Psychiatry, 4(5):54-57, 2007

Added 17 January 2007 Bornmann, L. and Daniel, H.-D. (2007)
What do citation counts measure? A review of studies on citing behavior
Author eprint, undated, Journal of Documentation, accepted for publication

Added 17 January 2007 Meho, L. I. (2006)
The Rise and Rise of Citation Analysis
Author eprint, dLIST, 31 December 2006, Physics World, January 2007
"Provides a historical background of citation analysis, impact factor, new citation data sources (e.g., Google Scholar, Scopus, NASA's Astrophysics Data System Abstract Service, MathSciNet, ScienceDirect, SciFinder Scholar, Scitation/SPIN, and SPIRES-HEP), as well as h-index, g-index, and a-index."

Added 19 November 2006 Electronic Publishing Services and Oppenheim, C. (2006)
UK scholarly journals: 2006 baseline report: An evidence-based analysis of data concerning scholarly journal publishing, see Area 4: Citations, impact factors and their role
Research Information Network, Research Councils UK and the Department of Trade & Industry, October 3, 2006

Comment:
Harnad, S. (2006) Critique of EPS/RIN/RCUK/DTI "Evidence-Based Analysis of Data Concerning Scholarly Journal Publishing", Open Access Archivangelism, October 9, 2006

Added 3 May 2007 Ewing, J. (2006)
Measuring Journals
Notices of the AMS, Vol. 53, No. 9, October 2006, 1049-1053
"in many respects usage statistics are even more flawed than the impact factor, and once again, the essential problem is that there are no explicit principles governing their interpretation. ... while usage statistics are only slightly useful, their misuse can be enormously damaging."

Comment:
Velterop, J. RE: UKSG Usage Factor Research - an Update, liblicense, March 9, 2007: "Ewing further says that "Distrust of 'subjective' scholarly judgment is a modern disease -- one that is profoundly anti-intellectual." I would add that blind trust in 'objective' measurements is equally profoundly anti-intellectual."
Davis, P. RE: UKSG Usage Factor Research - an Update, liblicense, March 10, 2007: "Like citations, usage statistics do not give us an absolute notion of value of journals or articles, yet they do provide us with a measure of utility, and for the sciences, utility is a very powerful measure for how ideas get transmitted through communities and are incorporated into current research. Unlike citations, usage statistics give us a sense of the community of readers (which include authors) and not just the author community. Article downloads provide a robust estimate of the size of user communities, and are also predictive of future citations. In fact, a single week of article downloads from BMJ can predict citations five years later."

Added 8 March 2007 Garfield, E. (2006)
Commentary: Fifty years of citation indexing
International Journal of Epidemiology, 2006 35(5):1127-1128, published online September 19, 2006

Added 03 August 2006 PLoS Medicine Editors (2006)
The Impact Factor Game: It is time to find a better way to assess the scientific literature
PLoS Medicine, Vol. 3, No. 6, June 2006

Added 15 May 2006 Altbach, P. G. (2006)
The Tyranny of Citations
Inside Higher Ed, May 8, 2006

Added 28 February 2006 Noruzi, A. (2006)
The Web Impact Factor: a critical review (pdf, 10pp)
E-LIS, February 9, 2006, in The Electronic Library, 24 (2006)
"Web Impact Factor (WIF) is a quantitative tool for evaluating and ranking web sites ... search engines provide similar possibilities for the investigation of links between web sites/pages to those provided by the academic journals citation databases from the Institute of Scientific Information (ISI). But the content of the Web is not of the same nature and quality as the databases maintained by the ISI."

Added 28 February 2006 Bollen, J., Rodriguez, M. A. and Van de Sompel, H. (2006)
Journal Status (pdf, 16pp)
Arxiv, 9 January 2006
"By merely counting the amount of citations and disregarding the prestige of the citing journals, the ISI IF is a metric of popularity, not of prestige. We demonstrate how a weighted version of the popular PageRank algorithm can be used to obtain a metric that reflects prestige. ... Furthermore, we introduce the Y-factor which is a simple combination of both the ISI IF and the weighted PageRank, and find that the resulting journal rankings correspond well to a general understanding of journal status."

Added 03 August 2006 Moed, H.F. (2005)
Citation analysis of scientific journals and journal impact measures
Current Science, 89 (12): 1990-1996, December 25, 2005

Added 28 February 2006 Dong, P., Loh, M. and Mondry, A. (2005)
The "impact factor" revisited
Biomedical Digital Libraries, December 2005
This is a review, so its findings are not new, but it is perhaps the first such paper to reflect on the effect of free and online availability on journal impact factors, among other IF-related issues.

Added 30 December 2005 Hardy, R., Oppenheim, C., Brody, T. and Hitchcock, S. (2005)
Open Access Citation Information (.doc, 105pp)
Author eprint, November 11, 2005, JISC Committee for the Information Environment (JCIE) Scholarly Communication Group, September 2005
Describes a proposal to increase the exposure of open access materials and their references to indexing services, and to motivate new services by reducing setup costs.

Added 8 March 2007 Perkel, J. M. (2005)
The Future of Citation Analysis
The Scientist, Vol. 19, No. 20, October 24, 2005
"The challenge is to track a work's impact when published in nontraditional forms"

Added 30 December 2005 Monastersky, R. (2005)
Impact Factors Run Into Competition
Chronicle of Higher Education, October 14, 2005

Comment on this article:
Harnad, S. IFs: solution is obvious "Although Richard Monastersky describes a real problem -- the abuse of journal impact factors -- its solution is so obvious -- (a) wealth of powerful new resources are on the way for measuring and analyzing research usage and impact online" American Scientist Open Access Forum, 10 October 2005
Bensman, S. J. Good copy, bad science "I found his article to be unfair, since he concentrated on the shenanigans that are being played with impact factor and supposed errors of ISI in constructing impact factor. This makes for good copy but bad science." Sigmetrics listserv, 18 November 2005
Leydesdorff, L. Discipline-specific impact factor "Monastersky's article lists a number of problems with the ISI-impact factor. However, he fails to mention that the average impact factors vary among fields of science. For example, impact factors in toxicology are considerably lower than in immunology. ... A fix to these problems might be a discipline-specific impact factor. ... Using ISI's Journal Citation Reports, I created the raw materials to make maps of the citation neighborhoods of all the journals." Sigmetrics listserv, 16 September 2005

Added 28 February 2006 Garfield, E. (2005)
The Agony and the Ecstasy - The History and Meaning of the Journal Impact Factor (pdf, 22pp)
International Congress on Peer Review and Biomedical Publication, Chicago, September 16, 2005
Garfield's typically dry, data-filled but essential take on JIFs.
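The journal impact factor whose history Garfield recounts is the standard two-year measure: citations received in a given year to a journal's items from the two preceding years, divided by the number of citable items the journal published in those two years. A minimal sketch, with invented figures:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year JIF: citations in year Y to items published in Y-1 and Y-2,
    divided by citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# A journal publishing 120 citable items in 2003 and 130 in 2004, whose
# 2003-04 items attract 500 citations during 2005 (invented figures):
jif_2005 = impact_factor(500, 120 + 130)
print(round(jif_2005, 2))  # 2.0
```

Several of the abuses discussed in this section stem from the numerator and denominator being counted differently (e.g. which item types count as "citable"), which this simple ratio hides.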

Publishers promote impact factors of OA journals
BioMed Central "Open access journals get impressive impact factors" 23 June 2005
Public Library of Science "The first impact factor for PLoS Biology - 13.9" 27 June 2005
See also this discussion of these announcements on SPARC Open Access Forum, prompted by Elsevier's response from Tony McSean, followed by David Goodman, Charles Bailey, (both 8 July) and Matthew Cockerill (10 July), or see this summary of the discussion: "BMC’s Impact Factors: Elsevier’s Take and Reactions to It", Digital Koans (Charles Bailey's Weblog), 11 July 2005, including Peter Suber's conclusion: "It’s important to distinguish the citation impact of an individual article from a journal impact factor. The BMC-Elsevier debate is about the latter. But OA is more likely to rise and fall according to the former."

Abbasi, K. (2004)
Let's dump impact factors
BMJ, Vol. 329, 16 October 2004
BMJ Rapid Responses to this editorial; also see this list response

Baudoin, L., Haeffner-Cavaillon, N., Pinhas, N., Mouchet, S. and Kordon, C. (2004)
Bibliometric indicators: realities, myth and prospective (abstract only, full paper in French)
Med Sci (Paris), 20 (10):909-15, October 2004

Jacsó, P. (2004)
The Future of Citation Indexing - Interview with Dr. Eugene Garfield (pdf 3pp)
Author eprint, in Online, January 2004

Cockerill, M. J. (2004)
Delayed impact: ISI's citation tracking choices are keeping scientists in the dark
BMC Bioinformatics 2004, 5:93, 12 July 2004

Added 26 September 2005 Shin E. J. (2003)
Do Impact Factors change with a change of medium? A comparison of Impact Factors when publication is by paper and through parallel publishing (abstract only)
Journal of Information Science, 29 (6): 527-533, 2003
"it is found that Impact Factors of (journals from the period) 2000 and 2001 were significantly higher than those of 1994 and 1995 in the journals published by parallel publishing (combination journals–simultaneous publication of paper and electronic journals). In particular, the Impact Factors of the combination journals increased after the journals transformed their available media from paper journals to combination ones."

Walter, G., Bloch, S., Hunt, G. and Fisher, K. (2003)
Counting on citations: a flawed way to measure quality
MJA, 2003, 178 (6): 280-281

Borgman, C. L. and Furner, J. (2002)
Scholarly Communication and Bibliometrics (pdf 45pp)
Author eprint, in Annual Review of Information Science and Technology, Vol. 36, edited by B. Cronin, 2002

Guédon, J.-C. (2001)
In Oldenburg’s Long Shadow: Librarians, Research Scientists, Publishers, and the Control of Scientific Publishing
Creating the Digital Future, Proceedings of the 138th Annual Meeting, Association of Research Libraries, Toronto, Ontario, May 23-25, 2001

Garfield, E. (1999)
Journal impact factor: a brief review
CMAJ, 161 (8), October 19, 1999

Wouters, P. (1999)
The Citation Culture (pdf 290pp)
PhD Thesis, University of Amsterdam, 1999

Garfield, E. (1998)
The use of journal impact factors and citation analysis in the evaluation of science
Author eprint, presented at the 41st Annual Meeting of the Council of Biology Editors, Salt Lake City, UT, May 4, 1998

Seglen, P. O. (1997)
Why the impact factor of journals should not be used for evaluating research
BMJ, 314:497, 15 February 1997

Garfield, E. (1973)
Citation Frequency as a Measure of Research Activity and Performance (pdf 3pp)
Author eprint, in Essays of an Information Scientist, 1: 406-408, 1962-73, Current Contents, 5, January 31, 1973

Garfield, E. (1955)
Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas
Author eprint, in Science, Vol. 122, No. 3159, pp. 108-111, July 15, 1955

Open access

Notes. In printed form, little of the published research literature was free. With more material beginning to appear on the Web from the mid-1990s, more became freely available. Open access is in a sense a formalisation of that process, a recognition that all published, refereed scholarly papers could and should be freely accessible in some form to everyone online without compromising the quality and integrity of the literature. That is the goal. This simple idea, especially when focussed on this very specific literature, seems to have been quite difficult to grasp for many bound by the old, pre-online ways of thinking. Despite the often antithetical tone of the debate, progress has been rapid since the landmark of the Budapest Open Access Initiative in February 2002, even impinging on prospective government policies by 2003 (e.g. Martin Sabo's Public Access to Science Act; UK House committee releases its report on open access; Major development in providing OA to taxpayer-funded research). It has all been brilliantly logged by Peter Suber in Open Access News (http://www.earlham.edu/~peters/fos/fosblog.html), but for a very quick overview the following papers are sufficient.

Suber, P. (updated)
Open Access Overview

Swan, A. (2007)
Open Access and the Progress of Science
American Scientist, April-June 2007
Swan justified open access in support of her 'progress' article in a list discussion. See blogged extracts from that discussion.

Swan, A. (2006)
Open Access: Why should we have it?
presented at "Zichtbaar onderzoek. Kan Open Archives daarbij helpen?" / Visible research. Can OAI help? Organised by AWI (Flemish Ministry for Economy, Enterprise, Science, Innovation and Foreign Trade) and VOWB (Flemish Organisation of Scientific Research Libraries), May 2006

Swan, A. (2005)
Open Access
JISC, Briefing Paper, 1 April 2005

Suber, P. (2004)
A Primer on Open Access to Science and Scholarship
Author eprint, in Against the Grain, Vol. 16, No. 3, June 2004

Harnad, S. (2004)
The Green Road to Open Access: A Leveraged Transition
American Scientist Forum, January 07, 2004

Suber, P. (2003)
Removing the Barriers to Research: An Introduction to Open Access for Librarians
Author eprint, in College & Research Libraries News, 64, February, 92-94, 113


The OpCit Project
This page produced and maintained by the Open Citation project. Contact us