

UCL Research


UCL Bibliometrics Policy

In early 2020, UCL's Academic Committee approved a policy on the responsible use of bibliometrics at UCL. Below you will find an introduction to the policy and the policy's eleven principles.

Introduction


Bibliometrics is a term describing the quantification of publications and their characteristics. It includes a range of approaches, such as the use of citation data to quantify the influence or impact of scholarly publications. When used in appropriate contexts, bibliometrics can provide valuable insights into aspects of research in some disciplines.

However, bibliometrics are sometimes used uncritically, which can be problematic for researchers and research progress when used in inappropriate contexts. For example, some bibliometrics have been commandeered for purposes beyond their original design. The journal impact factor was reasonably developed to indicate average journal citations (over a defined time period), but is often used inappropriately as a proxy for the quality of individual articles within a journal. Further, research “excellence” and “quality” are abstract concepts that are difficult to measure directly but are often inferred from bibliometrics.

Such superficial use of research metrics in research evaluations can be misleading. Inaccurate assessment of research can become unethical when metrics take precedence over expert judgement, where the complexities and nuances of research or a researcher’s profile cannot be quantified. When applied in the wrong contexts, such as hiring, promotion, and funding decisions, irresponsible metric use can incentivize undesirable behaviours, such as chasing publications in journals with high impact factors regardless of whether this is the most appropriate venue for publication, or discouraging the use of open science approaches such as preprints or data sharing.

As such, UCL has produced a policy and associated guidance on the appropriate use of metrics at UCL. This builds on a number of prominent external initiatives on the same task, including the San Francisco Declaration on Research Assessment (DORA); the Leiden Manifesto for Research Metrics; and the Metric Tide report. The latter urged UK institutions to develop a statement of principles on the use of quantitative indicators in research management and assessment, where metrics should be considered in terms of robustness (using the best available data); humility (recognising that quantitative evaluation can complement, but does not replace, expert assessment); transparency (keeping the collection of data and its analysis open to scrutiny); diversity (reflecting a multitude of research and researcher career paths); and reflexivity (updating our use of bibliometrics to take account of the effects that such measures have had). These initiatives and the development of institutional policies are also supported or mandated by research funders in the UK (e.g., UK Research Councils, Wellcome Trust, REF).

This Policy Statement aims to balance the benefits and limitations of bibliometric use to create a framework for the responsible use of bibliometrics at UCL and to suggest ways in which they can be used to deliver the ambitious vision for excellence in research, teaching, and learning embodied in the UCL 2034 strategy.

We recognise that UCL is a dynamic and diverse university, and no metric or set of metrics could universally be applied across our institution. Many disciplines or departments do not use research metrics in any way, because they are not appropriate in the context of their field. UCL recognises this and will not seek to impose the use of metrics in these cases. For those fields where metrics are used, this Policy Statement is deliberately broad and flexible to take account of the diversity of contexts, and is not intended to provide a comprehensive set of rules. To help put this into practice, we will provide an evolving set of guidance material with more detailed discussion and examples of how these principles could be applied. UCL is committed to valuing research and researchers based on their own merits, not the merits of metrics.

Principles for the responsible use of bibliometrics

  1. Quality, influence, and impact of research are typically abstract concepts that prohibit direct measurement. There is no simple way to measure research quality, and quantitative approaches can only be interpreted as indirect proxies for quality.
  2. Different fields have different perspectives of what characterises research quality, and different approaches for determining what constitutes a significant research output (for example, the relative importance of book chapters vs journal articles). All research outputs must be considered on their own merits, in an appropriate context that reflects the needs and diversity of research fields and outcomes.
  3. Both quantitative and qualitative forms of research assessment have their benefits and limitations. Depending on the context, the value of different approaches must be considered and balanced. This is particularly important when dealing with a range of disciplines with different publication practices and citation norms. In fields where quantitative metrics are neither appropriate nor meaningful, UCL will not impose their use for assessment in that area.
  4. When making qualitative assessments, avoid making judgements based on external factors such as the reputation of authors, or of the journal or publisher of the work; the work itself is more important and must be considered on its own merits.
  5. Not all indicators are useful, informative, or will suit all needs; and metrics that are meaningful in some contexts can be misleading or meaningless in others. For example, in some fields or subfields, citation counts can estimate elements of usage, but in others they are not useful at all.
  6. Avoid applying metrics to individual researchers, particularly metrics which do not account for individual variation or circumstances. For example, the h-index should not be used to directly compare individuals, because the number of papers and citations differs dramatically among fields and at different points in a career.
  7. Ensure that metrics are applied at the correct scale of the subject of investigation, and do not apply aggregate-level metrics to individual subjects, or vice versa. For example, do not assess the quality of an individual paper based on the impact factor of the journal in which it was published.
  8. Quantitative indicators should be selected from those which are widely used and easily understood, to ensure that the process is transparent and they are being applied appropriately. Likewise, any quantitative goals or benchmarks must be open to scrutiny.
  9. If goals or benchmarks are expressed quantitatively, care should be taken to avoid the metric itself becoming the target of research activity at the expense of research quality.
  10. New and alternative metrics are continuously being developed to inform the reception, usage, and value of all types of research output. Any new or non-standard metric or indicator must be used and interpreted in keeping with the other principles listed here for more traditional metrics. Additionally, consider the sources and methods behind such metrics and whether they are vulnerable to being gamed, manipulated, or fabricated.
  11. Bibliometrics are available from a variety of services, with differing levels of coverage, quality, and accuracy, and these aspects should be considered when selecting a source for data or metrics. Where necessary, such as in the evaluation of individual researchers, choose a source that allows records to be verified and curated to ensure records are comprehensive and accurate, or compare publication lists against data from the UCL IRIS/RPS systems.
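Principle 6 cautions against using the h-index to compare individuals. As a purely illustrative sketch (the policy itself prescribes no code or tooling, and the function and data below are hypothetical), the h-index is computed from a researcher's per-paper citation counts as the largest h such that h papers each have at least h citations:

```python
def h_index(citations):
    """Return the largest h such that the researcher has h papers
    each cited at least h times (0 for an empty or uncited record)."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:  # this paper still clears the threshold
            h = rank
        else:
            break
    return h

# Two hypothetical researchers with the same h-index of 4 but very
# different profiles: one landmark paper vs. a broad, even record.
print(h_index([200, 9, 6, 4, 1]))       # h = 4
print(h_index([5, 5, 5, 4, 4, 4, 3]))   # h = 4
```

The example also shows why the metric is a blunt comparator: identical h-indices can summarise very different bodies of work, and citation norms vary so much between fields that the same value means different things in each, which is part of the policy's rationale for principle 6.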

This policy was approved by UCL Academic Committee, 27 February 2020.

Supporting information

More information and guidance

Training and Support

External initiatives and UCL's commitments

The text of the policy and associated guidance can be adapted or redistributed by third parties. However, to avoid confusion, please ensure that any modified version is not labelled as a UCL policy.