2017 impact factors and the ongoing debate
As the 2017 journal impact factors are released, Harpal Minhas, head of new business development in our publishing directorate, takes a look at the ongoing discussions surrounding their value as a metric.
Clarivate has just released the 2017 journal impact factors (JIFs). Many publishers meet the annual release of these metrics with eager anticipation. We are among them, because we want to see how these metrics reflect the reach and alignment of our products with the needs of our communities, and how we compare with similar publishers.
The JIF was originally conceived by Thomson ISI (now Clarivate Analytics) as a tool to allow librarians to identify journals for purchase.1 Today, many consider it to be a measure of the quality, impact and popularity of a journal based on citations of papers published within that journal.
However, JIFs come with a set of issues that are currently the focus of significant debate and concern around the world, including:
- The JIF is an average, and citation counts are highly skewed: a few papers receive a great many citations, while most receive few or none.1
- JIFs are field-specific and different fields garner different levels of citation.
- JIFs are retrospective and offer no insight into a journal's current performance.
- In spite of their origin, JIFs have become, to some extent, surrogates for the evaluation of research, and by extension researchers: they are used to assess individual performance and to inform institutional and governmental decisions on income and funding.
- All metrics can be gamed and manipulated. JIFs are no exception.
- The data used to calculate JIFs are not publicly available, so the metric lacks transparency.
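The first issue above, the mismatch between an average and a skewed citation distribution, is easy to see numerically. The JIF for a given year is the number of citations that year to items the journal published in the previous two years, divided by the number of citable items published in those two years. The sketch below uses invented citation counts purely for illustration:

```python
# Sketch of the two-year journal impact factor calculation, and why
# an average can mislead when citation counts are skewed.
# The citation counts below are hypothetical, for illustration only.

from statistics import median

# Citations received in 2017 by each citable item the journal
# published in 2015 and 2016 (invented data; note the one outlier).
citations_2017 = [0, 0, 0, 1, 1, 2, 2, 3, 5, 86]

# JIF(2017) = citations in 2017 to 2015-16 items
#             / number of citable 2015-16 items
jif = sum(citations_2017) / len(citations_2017)

below_average = sum(c < jif for c in citations_2017)

print(f"2017 JIF:                 {jif:.1f}")                       # 10.0
print(f"Median citations:         {median(citations_2017)}")        # 1.5
print(f"Papers cited below the JIF: {below_average} of {len(citations_2017)}")
```

Here a single heavily cited paper lifts the JIF to 10.0 even though nine of the ten papers are cited fewer than ten times, which is exactly why citing a journal's JIF says little about any individual paper in it.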
As a consequence of these issues, there are initiatives with significant momentum campaigning for the complete removal of JIFs and any journal-based metrics from the scientific sector, including:
- DORA – the San Francisco Declaration on Research Assessment. The goal for DORA signatories is for scientific output and research to be measured accurately and evaluated wisely, based on appropriate scientific assessment criteria; they do not believe JIFs are appropriate metrics for this.2 The Wellcome Trust, the Gates Foundation, several scientific societies, a number of commercial and non-commercial publishers, and many other institutions also discourage the use of JIFs, and the newly formed UK funding agency UKRI (UK Research & Innovation) may sign up to the DORA declaration.3
- SSPA – the Scientific Societies Publishers Association is an initiative focused on building awareness of, and support for, the publication of scientific research by scientist-run scientific societies.4 The SSPA discourages the use of JIFs.
That said, JIFs are still widely used. Many initiatives are working to use metrics in more sophisticated ways and are developing broader measures (for example Altmetric), but none has yet surpassed the JIF. Other JIF-like metrics have also been developed, such as the Relative Citation Ratio (RCR)5 by the National Institutes of Health and CiteScore6 by Elsevier (both launched in 2016); neither appears to have gained significant traction or usage so far. In 2017, an editorial in Nature Biomedical Engineering proposed the Impact Quotient, which focuses on the top 1% of published papers. This is a simpler metric, but inevitably favours prestigious, high-impact-factor journals.7
Our own research has shown that there are significant differences of opinion amongst researchers, research institutions, funders and policy makers on the value of JIFs, particularly in geographical terms. In broad terms, among scientists and funders in countries that are growing their research outputs, expertise and infrastructure (for example India and China), the JIF remains popular, as they believe it provides them with a simple measure of the quality and impact of a journal.
As a society publisher, our primary focus is on quality of content and service, built on the excellent expertise our Associate Editors and Editorial Boards bring to our processes. We believe we have a responsibility to serve all our communities across the world, and in the absence of other metrics with the same appeal to our author communities, we will, for the time being, continue to publish and communicate JIFs for our journals. However, we will also communicate other relevant metrics, such as those provided by Altmetric and download figures, as appropriate.
This does not mean we are ignoring the issues surrounding the use of the JIF as a metric. We have implemented Altmetric on our journals as another measure of impact, and we will shortly be working with Kudos to help authors increase the visibility of their research. We have made a submission to the UK funding agency on the balance between quantitative and non-quantitative metrics in the Research Excellence Framework (REF). There is a wider discussion about the impact of research on society and the economy, for example in the UK Research Excellence Framework and in the development of the next EU Framework Programme, Horizon Europe. At the Royal Society of Chemistry, we have been at the forefront of articulating the impact of chemistry, from its value to the UK economy to its central role in tackling the global societal challenges of our time.
We acknowledge that JIFs are certainly not a perfect tool, and we may modify our approach to them in the future based on the needs of our community. Even Clarivate themselves are exploring this issue further and looking at wider metrics. We also acknowledge that impact factors are not a proxy for expertise or for the proper and appropriate assessment of individuals or institutions, and should not be used in this way. Until alternative metrics for measuring the impact of journals exist, we will continue to use those currently available to us, with the necessary caveats.
References
1. Eugene Garfield (Chairman Emeritus, Thomson ISI), The Agony and the Ecstasy — The History and Meaning of the Journal Impact Factor, International Congress on Peer Review and Biomedical Publication, Chicago, 16 September 2005.
2. https://sfdora.org/read/
3. Sir Mark Walport, speaking at a Higher Education Forum event in London, 20 February 2018.
4. https://byscientistsforscience.org/
5. PLoS Biol., 2016, 14(9), e1002541, DOI: 10.1371/journal.pbio.1002541.
6. CiteScore: a new metric to help you track journal performance and make decisions, Hans Zijlstra and Rachel McCullough, Elsevier
7. https://scholarlykitchen.sspnet.org/2017/07/31/one-percent-club/