University of York Library
Library Subject Guides

Bibliometrics: a Practical Guide

Data-driven decisions in your department

University rankings

Although some impact evaluations, like the REF (Research Excellence Framework), were never designed for ranking, we cannot seem to resist the urge to compare ourselves with others.

Many rankings use citation data to assess the impact of our research alongside other indicators. As with all metrics, context is crucial: university rankings cannot be read like football league tables. Whilst we might not agree with the ranking methodology, our rank, or the value of university rankings in general, we acknowledge that students use university rankings to support their decision making. Even funders and collaborators might consider rankings based on the Research, Knowledge or Teaching Excellence Frameworks (REF, KEF or TEF).

Limitations: Most rankings combine multiple performance indicators into an overall score. This approach assumes that all indicators are independent, which is doubtful. Many indicators, such as citations and research reputation, correlate. Combining correlated factors leads to multicollinearity issues, rendering exact ranks arbitrary. Furthermore, subject rankings target students who know what to study but not where. However, subject rankings do not reflect the actual output of a department; they rely on journal classifications to identify outputs from the respective field. This leads to misrepresentation and inaccurate ranking, particularly for cross-disciplinary subjects.
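The sensitivity of composite scores can be illustrated with a toy calculation. In this hedged sketch (the universities, indicator values and weights are entirely made up, and real ranking methodologies are far more elaborate), two institutions with similar, correlated indicator scores swap places when the weighting shifts slightly:

```python
# Toy example: composite ranking scores depend heavily on weight choices.
# All numbers are invented for illustration; they are not real ranking data.
scores = {
    "Uni A": {"citations": 0.90, "reputation": 0.80},
    "Uni B": {"citations": 0.80, "reputation": 0.92},
}

def composite(s, w_cit, w_rep):
    # Weighted sum of two (correlated) normalised indicators.
    return w_cit * s["citations"] + w_rep * s["reputation"]

for w_cit, w_rep in [(0.6, 0.4), (0.4, 0.6)]:
    ranked = sorted(scores, key=lambda u: composite(scores[u], w_cit, w_rep),
                    reverse=True)
    print(f"weights ({w_cit}, {w_rep}): top is {ranked[0]}")
# weights (0.6, 0.4): top is Uni A
# weights (0.4, 0.6): top is Uni B
```

A modest change in how the indicators are weighted flips the "winner", which is why exact positions in such tables should not be over-interpreted.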

Comparing across disciplines

Many departments are proud of their interdisciplinary work, and even in departments with a single-discipline focus, some researchers might be involved in cross-disciplinary collaborations.

When comparing across disciplines, metrics should be normalised (field-weighted) to ensure that you compare like with like. Different research fields have different publication rates, and citation rates can vary widely between them.

There are several field-weighted options for journal-level metrics. The Source-Normalised Impact per Paper (SNIP), for example, allows you to compare different journals whilst adjusting for differing publication and citation practices.

Remember: Journal metrics are based on average citations of papers in a journal and cannot reflect the impact of a single study published in that journal.

The Field-Weighted Citation Impact (FWCI) reflects citations whilst normalising for publication type, year of publication and discipline. An FWCI of 1.00 is the world average. Scopus lists the FWCI for individual publications; average FWCI for researchers and for departments/institutions can be found in SciVal.
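In principle, the FWCI is a ratio: citations actually received, divided by the citations expected for similar publications (same field, publication type and year). The sketch below uses made-up numbers to show the idea; it is not the Scopus calculation itself:

```python
# Hedged sketch of the principle behind a field-weighted citation impact.
# The counts below are invented for illustration, not real Scopus data.
def fwci(actual_citations, expected_citations):
    """Citations received divided by the average citations of similar
    publications (same field, publication type and year)."""
    return actual_citations / expected_citations

# A paper cited 12 times, where comparable papers average 8 citations:
print(round(fwci(12, 8), 2))  # 1.5 -> 50% above the world average
```

A value above 1.00 means the output is cited more than comparable publications; below 1.00, less.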

Limitations: As citation distributions are heavily right-skewed, the average FWCI is not reliable for small publication sets (fewer than 100 outputs). Also, all field-weighted indicators rely on the subject classification of the journal the study is published in. Hence, the choice of journal can affect the FWCI, depending on the average citations of the respective field. Additionally, journal classifications can be biased or inaccurate. Depending on your research approach (e.g. field-work based, computational, applied), publication and citation practices can vary considerably even within the same journal subject classification. The FWCI can help facilitate comparisons, but the context of the research is still crucial.
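The small-set problem is easy to demonstrate. In this hedged example (all values invented), a single highly cited paper drags the average FWCI of a small set well above what is typical for the group:

```python
# Illustration of why average FWCI is unstable for small publication sets:
# citation counts are right-skewed, so one outlier dominates the mean.
fwcis = [0.4, 0.6, 0.5, 0.7, 0.3]            # five typical papers
mean_without = sum(fwcis) / len(fwcis)
fwcis_with_outlier = fwcis + [12.0]          # plus one exceptional paper
mean_with = sum(fwcis_with_outlier) / len(fwcis_with_outlier)
print(round(mean_without, 2), round(mean_with, 2))  # 0.5 2.42
```

One paper moves the group from "half the world average" to "well above world average", even though five of the six outputs are unchanged. With hundreds of outputs, a single outlier has far less leverage.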

Overview of departmental outputs via SciVal

Gain insights into your department's research strengths (e.g. publication volume, field-weighted citations, collaboration, policy mentions) by creating an overview report in SciVal. SciVal is a research intelligence platform providing tools for research analysis and visualisation; it is based on the Scopus database. We have created easily accessible report templates for departments, based on researcher groups reflecting our departments. Follow the steps below to create a comprehensive overview report for your department in a few minutes. The report can also be customised or refined.

1. Use your york.ac.uk email address to sign into SciVal

2. Select the 'Reporting' module from the top panel (see screenshot below)

3. Select 'Report templates' (see screenshot below)

4. Choose 'Templates provided by your institution' from drop-down menu (see screenshot below)

[Screenshot: In SciVal, go to Reporting > Report templates > Templates provided by your institution]

5. Select a template to create a report. Choose your department from the drop-down list. Click 'Next Step' at the bottom right.

6. Opt to keep the guidance, and change the name of your report if you wish.

7. Your custom report is now loading. You can change the time frame for your analysis at the top right, and 'Explore/edit analysis' by following the blue link in the header of each analysis chunk to gain further insights.

8. Optional: The 'Manage' button at the top right allows you to reorder your analysis chunks, delete them, or add new ones.

9. Save your report (top right) in the desired format. You can, for example, save it as a report within SciVal, which allows for automatic updates whenever new data is added to SciVal (weekly, from Scopus), or create a snapshot of your departmental outputs by exporting the data in the desired format (PDF, Word, etc.).

The researcher groups that represent the departments are based on researchers and their Scopus IDs from PURE, and are updated annually. You can check which researchers are part of your department's SciVal researcher group through the MySciVal module (select 'Entities provided by your institution' from the drop-down menu). Contact Paula (lib-open-research@york.ac.uk) from the Open Research Team if you want to refine the researcher group for your department; the team would also be happy to help customise your report.