Bibliometrics



Bibliometrics is mainly defined today as the quantitative measurement and assessment of the publication behaviour and output of individuals, research institutions or countries.

Bibliometric analyses are based on measurable parameters of scientific publications, such as the number of publications by an author and the number of citations, which can be determined from citation databases. Well-known interdisciplinary databases are the Web of Science (WoS) of Clarivate Analytics (formerly Thomson Reuters) and Scopus of Elsevier. These are used for bibliometric analyses in the context of university rankings, such as those published by Times Higher Education (THE) or the German Centre for Higher Education (CHE).

However, bibliometrics is not suitable for a direct qualitative assessment of scientific publications or academic performance.

Data sources

Bibliometric surveys are based on parameters that can be determined using various data sources. Well-known interdisciplinary databases are the literature and citation databases WoS and Scopus. Other data sources include Google Scholar and the database Dimensions by Digital Science, which was used for a ranking of the Nature Index 2019.

The WoS can be accessed by all members of the University of Bayreuth. Scopus is not licensed by the University. Dimensions is available (with limited options) free of charge on the web.

The data sources used and the quality of the underlying data can have an impact on the bibliometric indicators: incomplete identification of authors can distort indicators, e.g. if a publication cannot be clearly assigned to its authors. In addition, the publications of a person are not always recorded in a single data source. Wherever possible, bibliometric surveys for comprehensive assessments should therefore be based on several suitable data sources. For reasons of transparency and traceability, the data sources used should be indicated together with the date of the survey.

Indicators

Several indicators are available to measure the quantity and impact of academic output. They are mainly based on the number of publications and the number of citations.

To quantify the academic output of a person (or an institution/country), the number of publications is determined. The number of citations is used as an indicator of the perception or scientific 'impact' of a publication in the scientific community, based on the assumption that the perception of a publication correlates positively with the number of citations. From these basic indicators, further indicators can be derived, such as the Hirsch index (h-index) and the Journal Impact Factor (JIF).

Bibliometric indicators can provide an overview of measurable aspects of academic activities and make them more visible. They should be selected according to the question at hand. They cannot replace a qualitative assessment and should therefore always be used in conjunction with a qualitative assessment when evaluating academic performance.

Person related indicators (Author impact)

H-Index

The h-index, introduced by Jorge Hirsch in 2005, has become established as a benchmark for researchers in various fields. The h-index is the number of publications (h) of an author that have each been cited at least h times.

Benefits and limitations

The h-index takes into account the number of publications of a person (output) as well as the number of citations (impact). It is regarded as a 'robust' factor that provides an overall impression of the publication history and is less influenced by individual highly cited top articles than, for example, the total number of citations.

One point of criticism of the h-index is that it does not take into account the duration of an academic career. This can disadvantage researchers who are at the beginning of their career and have comparatively few publications. In addition, neither exceptionally highly cited top articles nor a large number of articles with comparatively few citations are adequately reflected in the h-index.

Jorge Hirsch has already pointed out that although a high h-index can be an indication of good academic performance, the reverse conclusion cannot necessarily be drawn (cf. Hirsch J., 2005).

How to determine the h-index

The h-index can be determined in various databases such as the WoS, Scopus or Google Scholar, or 'by hand' for a manageable number of publications. However, the resulting value will always differ depending on the database used and the period analysed.
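For a manageable publication list, the calculation can also be illustrated with a few lines of code. The following Python sketch uses purely hypothetical citation counts: the counts are sorted in descending order, and h is the largest rank at which a publication still has at least as many citations as its rank.

    def h_index(citations):
        # Sort the citation counts in descending order
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank  # the rank-th most cited paper still has >= rank citations
            else:
                break
        return h

    # Hypothetical example: five publications with these citation counts
    print(h_index([25, 8, 5, 3, 1]))  # -> 3 (three papers with at least 3 citations each)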

Alternatives

  • M-Quotient: The m-quotient takes into account the length of the scientific career (h-index/years since the first publication).
  • G-Index: The g-index proposed by Leo Egghe includes higher cited top articles (see Egghe L., 2006).
  • i10-Index: The i10-index introduced by Google Scholar corresponds to the number of publications with ≥ 10 citations.
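The alternatives listed above can likewise be computed from a list of citation counts (plus, for the m-quotient, the number of years since the first publication). A minimal Python sketch with hypothetical values:

    def m_quotient(h, years_since_first_publication):
        # h-index divided by the length of the academic career in years
        return h / years_since_first_publication

    def g_index(citations):
        # Largest g such that the g most cited papers together have at least g*g citations
        ranked = sorted(citations, reverse=True)
        total, g = 0, 0
        for rank, cites in enumerate(ranked, start=1):
            total += cites
            if total >= rank * rank:
                g = rank
        return g

    def i10_index(citations):
        # Number of publications with at least 10 citations
        return sum(1 for c in citations if c >= 10)

    cites = [25, 8, 5, 3, 1]      # hypothetical citation counts
    print(m_quotient(3, 6))        # h-index of 3 after 6 years -> 0.5
    print(g_index(cites))          # -> 5 (the top 5 papers have 42 >= 25 citations in total)
    print(i10_index(cites))        # -> 1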

Journal related indicators (Journal Impact)

Journal related indicators are used to measure the importance or impact of a scientific journal. A possible application is the assessment of the relative importance of a journal or the visibility of the articles published in it within a subject area. Consequently, journal related indicators are not suitable for assessing individuals or individual articles. One of the best-known journal indicators is the Journal Impact Factor (JIF).

Journal Impact Factor

The JIF is an indicator of the importance of a journal. It is published annually in the Journal Citation Reports (JCR) by Clarivate Analytics. The data basis is the WoS. The JIF indicates how often publications ("items") of a specific journal are cited on average in the year selected for analysis. Publications ("items") from the two previous years are included (2-year period).

JIF = (Citations from JCR year to items in "year-2" + Citations from JCR year to items in "year-1") / (citable items in "year-2" + citable items in "year-1")
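For illustration with purely hypothetical figures: if a journal published 100 citable items in "year-2" and 120 in "year-1", and these items were cited 200 and 240 times respectively in the JCR year, the JIF would be (200 + 240) / (100 + 120) = 440 / 220 = 2.0. On average, each item from the two previous years was thus cited twice in the JCR year.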

A detailed explanation of the JIF can be found at Clarivate Analytics.

Benefits and limitations

As a journal related indicator, the JIF can be used to assess the visibility of publications in a journal within a discipline. However, it is not suitable for evaluating the quality of individual publications, persons or research institutions.
The JIF and its occasional misuse have long been criticised. Weaknesses of the JIF are:
  • lacking consideration of subject-specific publication and citation habits,
  • insufficient coverage of some subjects in the WoS,
  • limited recording of relevant journals in the WoS and
  • the short period of only two years of publication taken into account.
Various initiatives have therefore drawn up recommendations on the use of the JIF as well as on the fair and transparent assessment of research output (see The San Francisco Declaration on Research Assessment (DORA) and the Leiden manifesto). In the context of its statement on the establishment of "cOAlition S" to support Open Access, the German Research Foundation DFG also recommended in September 2018 that research organisations move away from indicators such as the JIF and change their systems for measuring performance.

How to determine the JIF

Members of the University of Bayreuth can access the JIF free of charge via the JCR licensed at the university.

Alternatives
  • CiteScore: The CiteScore indicates the average number of citations based on publications in a 4-year publication window. The following peer-reviewed document types are included in the CiteScore: articles, reviews, conference papers, data papers and book chapters. The data basis for the CiteScore is Scopus by Elsevier. The CiteScore is (in contrast to the Scopus database) searchable free of charge without a license.

  • SCImago Journal Rank Indicator: The SCImago Journal Rank (SJR) indicates the average number of weighted citations in the year of the survey, based on publications of the journal from the previous three years. Depending on the importance ('prestige') of the journal from which the citations originate, the citations are weighted differently. The data basis is Scopus. The SJR is searchable free of charge.
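By way of illustration of the CiteScore described above, in simplified terms and with purely hypothetical figures: if a journal published 400 documents of the counted types within the four-year window and these documents received 1,000 citations counted for that window, the CiteScore would be 1,000 / 400 = 2.5.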

Article related indicators (Article Impact)

Article Level Metrics (ALMs) attempt to measure the impact of an individual article. ALMs use traditional data sources (e.g. number of citations) as well as newer alternative data sources (especially social media: tweets, blog posts, etc.; see altmetrics.org).

The Public Library of Science (PLOS) offered ALMs on its publishing platform as early as 2009. Often the two terms ALMs and Alternative Metrics or Altmetrics are used synonymously. Altmetrics (composed of: "alternative" and "metrics") are metrics that use data from alternative data sources to assess online activity. In principle, they can also be applied to persons, institutions or journals, regardless of articles. ALMs, on the other hand, include metrics that attempt to measure impact at the article level, regardless of whether they use traditional or alternative data sources.

Benefits and limitations

Alternative metrics are available shortly after publication and also cover areas outside the scientific community (e.g. Facebook, Twitter). They are not limited to the article level, but can also be applied to other forms of publication such as data, software, lectures, etc.

Little research has so far been done on alternative metrics. For example, it is not clear whether and what significance a tweet or online activity has in general for the impact of a scientific contribution. In addition, the data basis and classification often differ between the individual providers. Altmetrics are therefore only seen as a supplement to traditional metrics and – like traditional metrics – are not suitable for assessing the quality of a scientific contribution. Rather, Altmetrics try to take into account that science communication is changing and therefore capture other ways in which research is communicated and discussed, especially the attention that a contribution generates on the web.

Availability of ALMs/Altmetrics

For example, ALMs are offered directly on the publishing platform of the Public Library of Science (PLOS) for articles published there. Various ALMs are available for individual articles, such as online usage data, citations or activities on the social web (blogs, social media such as Facebook and Twitter, ...).
Many publishers have now integrated ALMs and Altmetrics on their publishing platforms, such as the Nature Publishing Group, BioMed Central or Taylor & Francis. Besides citation data, the alternative metrics of the commercial provider Altmetric are often used. Altmetric visualises online activity using the Altmetric badge (Altmetric Donut). Different colours represent the different areas of online activity (e.g. news, Twitter, Facebook). The Altmetric badge is supplemented by an attention score, a weighted evaluation of online attention (see Altmetric badge).

Limitations of bibliometrics

Not all publications and forms of publication are registered in the databases used for bibliometric analysis. This also applies to interdisciplinary databases such as Web of Science and Scopus, in which disciplines are represented in varying degrees of detail. In particular, the humanities and social sciences, which publish mainly in monographs, are not sufficiently covered.

Due to different publication habits, a direct comparison between disciplines is not possible. For example, disciplines differ in the average number of references given per article or in the number of (co-)authors. This can in turn influence bibliometric indicators such as the h-index.

Furthermore, bibliometric analyses are not suitable for a direct qualitative assessment of scientific publications or academic performance. They cannot replace such an assessment and should therefore always be used in conjunction with a qualitative evaluation of academic performance.

General recommendations for the use of bibliometrics and further guidelines for the various actors in the research environment are provided by the DORA Declaration and the Leiden manifesto.

Consultations and services

Bibliometrics can be useful to determine indicators for grant proposals (e.g. personal h-index) or to select a suitable journal for a publication.

Please contact us if you have any questions:

University Library

Clemens Engelhardt

Tel. 0921 / 55 - 3429

Dr. Birgit Regenfuß

Tel. 0921 / 55 - 3415

Office of Research Support

Dr. Ursula Higgins

Tel. 0921 / 55 - 7783