
Bibliometrics: The Leiden Manifesto for research metrics

Uploaded by RRI Tools on 24 February 2017

Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke & Ismael Rafols. Nature 520, 429–431 (23 April 2015).

Use these ten principles to guide research evaluation, urge Diana Hicks, Paul Wouters and colleagues.

Ten principles

  1. Quantitative evaluation should support qualitative, expert assessment. 
  2. Measure performance against the research missions of the institution, group or researcher. 
  3. Protect excellence in locally relevant research.
  4. Keep data collection and analytical processes open, transparent and simple. 
  5. Allow those evaluated to verify data and analysis. 
  6. Account for variation by field in publication and citation practices. 
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio. 
  8. Avoid misplaced concreteness and false precision. Science and technology indicators are prone to conceptual ambiguity and uncertainty and require strong assumptions that are not universally accepted. 
  9. Recognize the systemic effects of assessment and indicators. 
  10. Scrutinize indicators regularly and update them. 
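Principle 6 is often operationalized with field-normalized indicators such as the mean normalized citation score (MNCS), in which each paper's citation count is divided by the average citation count of papers in its field, so that low-citation fields like mathematics are not penalized against high-citation fields like cell biology. The sketch below illustrates that idea with invented field baselines and citation counts; the numbers, field names, and function are hypothetical, not data from the manifesto.

```python
# Illustrative sketch of field normalization (principle 6).
# All figures below are invented for illustration only.

field_baseline = {           # hypothetical average citations per paper, by field
    "mathematics": 4.0,
    "cell_biology": 25.0,
}

papers = [                   # (field, raw citation count) for one portfolio
    ("mathematics", 8),      # twice the field average
    ("cell_biology", 25),    # exactly the field average
]

def normalized_scores(papers, baseline):
    """Divide each paper's citations by its field's average citation count."""
    return [cites / baseline[field] for field, cites in papers]

scores = normalized_scores(papers, field_baseline)
mncs = sum(scores) / len(scores)   # mean normalized citation score
print(scores)  # [2.0, 1.0]
print(mncs)    # 1.5
```

Note how the mathematics paper, with far fewer raw citations, scores higher after normalization; comparing the raw counts alone would invert that ranking, which is exactly the distortion principle 6 warns against.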

Next steps:

If it abides by these ten principles, research evaluation can play an important part in the development of science and its interactions with society. Research metrics can provide crucial information that would be difficult to gather or understand by means of individual expertise. But this quantitative information must not be allowed to morph from an instrument into the goal.

The best decisions are taken by combining robust statistics with sensitivity to the aim and nature of the research that is evaluated. Both quantitative and qualitative evidence are needed; each is objective in its own way. Decision-making about science must be based on high-quality processes that are informed by the highest quality data.
