Dementia journal metrics: Useful or just wrong?
Does it matter where you publish your work? The value of our publications is often evaluated differently depending upon where they are published. If you’re applying for promotion, competing for funding, or trying to secure tenure, then it’s likely that the journals you’ve published in will influence your chances of success. It’s also possible that where you publish your work will influence how much attention it gets from the wider scientific or clinical community and how many times it’s cited by others. Many of us target the ‘top’ journals for these reasons.
Set against this is the concern that metrics designed to capture journal quality are simplistic and may distort or even damage science. It’s clear that the journals that people publish in are often used as a shortcut to evaluate the quality of their work by people who haven’t actually read the articles. So should we take journal metrics into account or not? Can we learn anything useful from these metrics, or should we just ignore them, keep calm and carry on? In this blog I take a look at the latest journal metrics for journals directly related to dementia and consider their potential value.
Let’s start with the Journal Impact Factor (JIF). I don’t want to dwell on the technical details of the metric here, but it’s based upon the “yearly mean number of citations of articles published in the last two years”. If we rank journals directly related to Alzheimer’s disease and/or dementia according to their 2020 JIF (from Clarivate.com), and ignore relevant though more general journals in medicine and science, we get the following:
Alzheimer’s & Dementia gets a gold medal, Alzheimer’s Research & Therapy gets silver, and then Journal of Prevention of Alzheimer’s Disease (JPAD) and Journal of Alzheimer’s Disease get a joint bronze. There are lots of limitations to Journal Impact Factors, and the Journal Citation Indicator (JCI) has been proposed as a more complicated but potentially better metric: “The Journal Citation Indicator (JCI) is the average Category Normalized Citation Impact (CNCI) of citable items (articles & reviews) published by a journal over a recent three year period. The average JCI in a category is 1.” Here are the same journals ranked according to their 2020 JCI:
Again Alzheimer’s & Dementia is in first place, and Alzheimer’s Research & Therapy is in second place. It’s less obvious which of the remaining journals is ‘best’, though. One remaining problem is that these journals include the odd blockbuster paper that is cited hundreds or thousands of times and skews these metrics. If you want to get your work cited, rather than just impress your colleagues, then it might be more useful to know how often articles are typically cited in each journal. Fortunately, data is also available for the median number of times original research articles are cited:
The pattern is similar, but you’ll notice that the differences between the journals are far less extreme. A similar pattern is seen when journals are ranked according to the median number of citations for review articles:
You’ll note that reviews tend to be more highly cited in general than original research articles. The type and topic of the article that you publish may make more difference to how many times it’s cited than where it’s published.
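The skewing effect of a blockbuster paper, and why the median gives a better sense of what a typical article receives, can be illustrated with a toy example (the citation counts here are entirely made up, not real journal data):

```python
from statistics import mean, median

# Hypothetical citations received this year by ten articles a journal
# published in the previous two years; the last is a "blockbuster".
citations = [2, 3, 3, 4, 5, 5, 6, 7, 8, 950]

jif_like = mean(citations)   # the JIF is essentially a mean like this
typical = median(citations)  # what a typical article actually receives

print(f"Mean (JIF-like):  {jif_like:.1f}")  # 99.3
print(f"Median citations: {typical:.1f}")   # 5.0
```

One outlier drags the mean up by an order of magnitude while barely affecting the median, which is why a journal’s headline impact factor can be a poor guide to how often your own paper is likely to be cited there.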
Another limitation to be aware of is that official ranking metrics do not cover all journals. For example, JIF and JCI are produced on the basis of journals indexed in Clarivate’s Web of Science. The Alzheimer’s Association publishes two further journals which do not have an official JIF or JCI: (1) Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring; and (2) Alzheimer’s & Dementia: Translational Research & Clinical Interventions. Alternative ranking metrics can therefore help us evaluate these journals. One such alternative is the SCImago Journal Rank (SJR) indicator, which is “a measure of the scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where the citations come from. A journal’s SJR is a numeric value indicating the average number of weighted citations received during a selected year per document published in that journal during the previous three years.” Again, a similar pattern is seen when the journals are ranked, though the two additional Alzheimer’s Association journals are ranked fairly highly:
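The idea behind a prestige-weighted metric like the SJR can be sketched in a deliberately simplified way. The journal names, prestige weights, and citation counts below are all hypothetical, and the real SJR is computed iteratively over the whole Scopus citation network (PageRank-style), not from fixed weights like these:

```python
# Hypothetical prestige weight of each citing journal.
citing_prestige = {"JournalA": 2.5, "JournalB": 1.0, "JournalC": 0.4}

# Hypothetical number of citations our journal received from each source.
citations_from = {"JournalA": 10, "JournalB": 40, "JournalC": 50}

docs_published = 100  # documents published in the previous three years

# A citation from a prestigious journal counts for more than one from a
# low-prestige journal.
weighted = sum(citing_prestige[j] * n for j, n in citations_from.items())
sjr_like = weighted / docs_published

print(f"Weighted citations: {weighted:.1f}")  # 85.0
print(f"SJR-like score:     {sjr_like:.2f}")  # 0.85
```

The point is simply that under SJR-style metrics, 50 citations from a low-prestige journal can count for less than 10 from a high-prestige one.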
So what does all this mean? Alzheimer’s & Dementia is clearly the highest-ranked dementia journal at the moment, and Alzheimer’s Research & Therapy is second. Other prestigious alternatives include journals in general medicine (New England Journal of Medicine, Lancet, JAMA, Annals of Internal Medicine, and BMJ), multidisciplinary science (Nature and Science), and a range of specialist journals (e.g. Lancet Neurology, JAMA Psychiatry, and Nature Genetics). If you can get your research into these journals then it is likely to help your CV and raise the profile of your work. It may also help a little in attracting citations, but don’t assume that publishing in these journals guarantees your work will be highly cited.
So how much attention should you pay to all of this? Perhaps the system is broken and you should ignore these potentially misleading and harmful metrics? That’s your choice. At a minimum, several other considerations should also be factored in when choosing a journal, such as open access options, target audience, the nature of the peer review process, endorsement of publishing best practice guidelines, author rights, and the underlying business model. That said, if you take journal metrics with a large pinch of salt, they may be useful. If nothing else, they’re a rule of thumb for how your work will tend to be perceived by others, particularly senior academics making career-changing decisions quickly with limited information.