This book from Springer offers a very good overview, and the conclusion presents much of it in accessible terms:
"The Impact Factor has dominated research evaluation far too long due to
its availability and simplicity, and the h-index has been popular because of a
similar reason: the promise to enable the ranking of scientists using only one
number. For policy-makers—and, unfortunately, for researchers as well—it
is much easier to count papers than to read them. Similarly, the fact that these
indicators are readily available on the web interfaces of the Web of Science
and Scopus add legitimacy to them in the eyes of the research community."
and it closes with the worst side effect of the Impact Factor:
"[The] scientific community has been, since the beginning of the twentieth century,
independent when it comes to research evaluation, which was performed
through peer-review by colleagues who understood the content of the
research. We are entering a system where numbers compiled by private
firms are increasingly replacing this judgment. And that is the worst side
effect of them all: the dispossession of researchers from their own evaluation
methods which, in turn, lessens the independence of the scientific community. [my emphasis]"
Reference: the Open Access version, pp. 150-151, of the book chapter "The Use of Bibliometrics for Assessing Research: Possibilities, Limitations and Adverse Effects" by Stefanie Haustein and Vincent Larivière, in the book "Incentives and Performance" (Springer, 2015).
Attitudes are gradually changing, and Michael Eisen reports on his blog a dramatic shift in attitudes toward the use of preprints in biology. He writes:
I honestly don’t know how this happened. Pre-prints are close to invisible in biology (we didn’t really have a viable pre-print server until a year or so ago) and other recent efforts to promote pre-print usage in biology have been poorly received. There is lots of evidence from social media that most members of the community fall somewhere in the skeptical to hostile range when discussing pre-prints. Some of it is selection bias – people hostile to pre-prints weren’t likely to agree to come to a meeting on pre-prints that they (mostly) had to pay their own way to attend.
But I think it’s bigger than that. I think the publishing zeitgeist may have finally shifted."

The same concern applies to the Article Influence score, which has since been extended to the author level: it, too, is based on citations and on a peer-review process that is itself strongly criticized. The Journal of Medical Science and Health has an article that questions the use of citations:
"The whole onus of bibliometric indicators rests on citation and citation gives only an indication of impact. There is no linear correlation between citation and quality."
All the recent articles I have looked at are strongly critical of the Impact Factor, so its use in research evaluation is likely to change going forward.