
Public Financial Management: Metrics

A guide to public financial management resources

EMS Metrics Workshops: Recordings and Slides

Open Access and Unique author identifiers
Workshop presented by Marié Roux and Judy Williams. Links to the recording and slides below:
EMS Bibliometrics Workshop_ Author Identifiers and Open Access Publishing (Online)-20230228_130854-Meeting Recording.mp4

EMS metrics: SNIP; CABS; Google Scholar H5; ABDC
Workshop presented by Hanlie Strydom and Pepler Head. Links to the recording and slides below:
EMS Bibliometrics Workshop_ Journal Metrics (Online)-20230301_130029-Meeting Recording.mp4

Measuring impact

This guide will help you determine the impact of a specific scholarly work/article (citation analysis), journal impact, or author impact.

Bibliometrics is the overall name for the use of statistical methods in the analysis of publications, focusing on authors, sources, and citations. Its main uses are to provide evidence of academic impact, to rank and benchmark data, to assess individual researchers, and to rank journals.

Keep in mind: The citation information in databases is based only on the information contained in those databases. Citation metrics for individual papers may be different based on which database you use. Gather metrics from a variety of sources in order to get the most complete picture.

Citation analysis (Article Impact)

Research impact is an important aspect in securing government funding for research activity, NRF rating, institutional ranking and the management of research output within the university sector.

This has placed considerable emphasis on tracking citations of a researcher's published works and publishing within highly-ranked journals. Citations are widely regarded as the most important indicators of research impact.


Citation Impact: The academic impact of particular works, such as journal articles, conference proceedings, and books, can be measured by the number of times they are cited by other works.

The three best known databases and citation analysis tools are Web of Science, Scopus and Google Scholar.

Web of Science indexes a wide variety of sources, has strict criteria for inclusion that are constantly evaluated, includes an array of metrics, and is DHET accredited.
Web of Science is also a useful citation management tool that allows you to create citation reports, maps and alerts. Choose to create a Citation Alert to be notified when a specific work is cited.


Note: Before searching, make sure to remove the Emerging Sources Citation Index (it is not part of the DHET list)

Web of Science:

  • Choose Web of Science
  • Choose Cited Reference Search
  • Enter minimal details (e.g. author and year: Clegg, W and 2004)
  • Click on View Record in Web of Science to see the list of citing articles for each paper

 

Scopus indexes a wide variety of sources, has strict criteria for inclusion that are constantly evaluated, includes an array of metrics, and is DHET accredited. Scopus is a large database of peer-reviewed literature in all subject areas. In addition to citation searching, it features tools to track, analyze and visualize research. Scopus complements Web of Science and Google Scholar, but each of them contains enough unique information that none of them should be used exclusively.

Scopus

  • Choose Scopus
  • Select Document search and search for the specific article
  • OR select Author search and enter the specific author's last name, first name and affiliation
  • Click the title of the article to open the article page
  • Citation information and other metrics will be available in the Metrics box at the top right of the page.
  • Metrics are supplied by Plum Analytics

Google Scholar allows you to search from one place, across all disciplines and sources: peer-reviewed papers, theses, books, abstracts and articles, from academic publishers, professional societies, repositories, universities and other scholarly organizations. It includes citations from an array of sources in its "Cited by" calculation, including PowerPoint and Word documents, and gives everything an equal rank.

Google Scholar Citations provide a simple way for authors to keep track of citations to their articles. You can check who is citing your publications, graph citations over time, and compute several citation metrics. You can also make your profile public, so that it may appear in Google Scholar results when people search for your name

 

Comparing Citation Analysis Sources

Here is a quick summary of what to expect from the three best known citation analysis tools.

Source:
University of Michigan Library

Web of Science

  • Subject Focus: Science Citation Index; Social Science Citation Index; Arts & Humanities Citation Index
  • Coverage: over 12,000 peer-reviewed journals; over 1,300 open access journals; 30,000 books with 10,000 added annually; over 2.6 M chemical compounds and 1 M chemical reactions; 148,000 conference titles with 12,000 added annually
  • Time Span: some journals from 1900
  • Updated: weekly
  • Strengths: excellent search limits by discipline; the most well-known and most used resource for citation analysis; citation analysis goes back farther than Scopus
  • Weaknesses: weaker Arts/Humanities & Social Sciences content than Scopus

Scopus

  • Subject Focus: Health Sciences; Physical Sciences; Social Sciences; Life Sciences
  • Coverage: over 21,500 peer-reviewed journals; over 360 trade publications; over 4,200 open access journals; over 120,000 book titles; over 7.2 M conference papers; over 27 M patent records
  • Time Span: some journals from the 1820s
  • Updated: daily
  • Strengths: better open access journal coverage; better foreign language coverage; better Social Sciences & Arts/Humanities coverage
  • Weaknesses: cannot search by date any earlier than 1960

Google Scholar

  • Subject Focus: theoretically, all disciplines
  • Coverage: books from Google Books; dissertations; peer-reviewed articles; patents; case law; trade journals; slide presentations; gray literature; newsletters; syllabi (if cited by scholarly articles)
  • Time Span: some citations as far back as the 1660s and 1670s
  • Updated: unknown but generally quick
  • Strengths: excellent resource for finding cited references; it's free; may find more obscure references
  • Weaknesses: too much irrelevant content in search results; few options for sorting results

Altmetrics (short for alternative metrics) focuses on social media and social engagement with your work. Social media is a much quicker form of communication than scholarly citations. Altmetrics includes the number of tweets, blog posts, likes, bookmarks, downloads, click-through numbers, peer collaboration tools, etc. on social media and the web.

Tools:

Other sources and criteria

Other criteria you can use to think about a journal's impact:

  • Who is on the journal's editorial board? Are they recognized scholars in your field?
  • Who is publishing in that journal? Are they also recognized scholars in your field?
  • Is the journal indexed in databases relevant to your field?
  • Is the journal affiliated with a professional organization, scholarly society, or conference relevant to your subject area?

The checklist (Think, Check, Submit) is a tool that will help you discover what you need to know when assessing whether or not a journal is suitable for your research.

Harzing.com: Journal Quality List

Author impact: Variants (m-, g-, e-indices)
 

What is a predatory Journal?

Predatory journals are pseudo-academic journals that exist to exploit authors for money, without delivering on the promise of rigorous academic quality. They typically exploit the Open Access publishing business model where authors pay a fee to make their work freely available to the public. Predatory journals are a concern because they are often difficult to identify. 

How to spot a predatory (or possibly predatory) journal?

Predatory journals are becoming more and more difficult to spot, since predatory publishers go to great lengths to make them seem legitimate. They can (and will) have ISSNs, assign DOIs to their articles and run professional-looking websites, making it even more difficult to make a definitive assessment of a journal. Check for:

Lies

  • Indexing: Make sure that journals are indexed in the databases they mention on their website
  • Editors: Check affiliations and editorial board members mentioned on the website to ensure that they are in fact associated with the journal

Errors:

  • Look at the website for spelling, grammar and formatting errors
  • Check the articles that appear in the journals for the same mistakes

Recognition:

  • Do you recognise the databases in which the journal claims to be indexed? Indexing only in sources such as Google Scholar, Ulrich's Web, Mendeley and other services that are not recognised indexing databases is a concern
  • Do you recognise any of the editors in the specific field?

For other criteria to spot predatory Journals, have a look at Think, Check, Submit

Are there lists I can check to ensure I don't fall prey to a predatory publisher?

Beall's list - As an academic librarian concerned with predatory publishers, Jeffrey Beall started compiling his list of predatory publishers. He received much criticism for the list and subsequently took it down. Please keep in mind that this list is no longer updated and was based on a subjective assessment of journals.

DOAJ - the Directory of Open Access Journals is a list of reputable, evaluated journals and is the first stop for checking the credibility of any Open Access Journal. 

Cabell's International - compiles a list of journals that have penalties against them, which could be an indication of predatory activity.

Terms and Definitions

The weakening relationship between the impact factor and papers' citations in the digital age

Open Access Publishing

The Library and Information Service has signed agreements with a number of publishers that make it easier for SU researchers to publish open access (OA) with these publishers, and in some cases to even publish OA without paying any article processing charge (APC). Many of these agreements have been negotiated by the South African National Library and Information Consortium (SANLiC), of which Stellenbosch University (SU) is a member.

Please keep up to date with all the agreements signed with publishers by visiting this library guide on Open Access Publishing

These are some of the publishers with whom we have signed free-of-charge OA agreements:

  • Cambridge University Press
  • Emerald
  • SAGE
  • Wiley

These are some of the publishers with whom we have signed discounted OA agreements:

  • MDPI
  • Springer Nature

These agreements are, however, updated at regular intervals, so please go to the above library guide for more information.

 

Journal ranking lists (Journal Impact)

Journal rankings can reveal a journal's influence by looking at how often a journal's articles have been cited. Various methodologies exist to rate and rank journals on different criteria.

Journal rankings can help a researcher determine which journals they should try to publish in and which journals are the most respected in a specific field.

Factors that Influence Journal Impact
The average citation level of a journal is a limited indicator and is not a replacement for expert, qualitative assessment of the journal.

  • Date of Publication: Impact factor is usually based on 1-, 2-, 3-, or 5-year time periods. Journals whose articles are cited steadily over a long period (more than 5 years), rather than immediately, lose out.
  • Large vs. Small Journals: Large journals tend to have higher impact factors.
  • Average Citation: Impact factor only looks at an average citation. A journal may have a few highly cited papers that greatly increase its impact factor, while other papers in that same journal may not be cited at all.
  • Review Articles: Impact factors are calculated using citations not only from research articles but also review articles (which tend to receive more citations), editorials, letters, meeting abstracts, and notes.
  • Changing / Growing Fields: Rapidly changing fields have much higher immediate citation rates.
  • Language, journal history, publication schedule, and subject specialty and scope can affect a journal's ranking.

    Note: Journal metrics are used to evaluate the quality or caliber of the journal in which articles are published and should not be used to make any assessment at the article level. The Impact Factor and CiteScore are not an indication of article quality.

In South Africa, only articles published in "accredited journals" are considered for government subsidy. Only journals included in the lists/indices compiled by the Department of Higher Education and Training are considered "accredited" and will be taken into account for government subsidy and NRF rating and evaluation.

For more information about the lists of accredited journals and the Research Output Survey, please visit the Division for Research Development's website.

Since 2021, the DHET List includes the Directory of Open Access Journals (DOAJ).

How to get a South African journal accredited by DHET
For more information on getting a journal accredited by the DHET, visit the Division for Research Development's Output Survey Page.

Source Normalised Impact per Paper (SNIP) ranks journals included in the Scopus database. SNIP measures actual citations received relative to citations expected for the serial's subject field. The impact of a single citation is given a higher value in subject areas where citations are less likely, and vice versa. For example, citation counts in the Life Sciences tend to be higher than in the Arts and Humanities. SNIP "levels the playing field".
SNIP is based on the journal's raw impact per paper: the number of citations given in the present year to publications from the past three years, divided by the total number of publications in the past three years. This value is then normalised by the citation potential of the journal's subject field. A journal with a SNIP of 1.0 has the median (not mean) number of citations for journals in that field.
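To make this concrete, here is a minimal Python sketch of the calculation described above, using invented numbers; the field citation potential in the final step is an assumed value for illustration only and does not come from the official CWTS data.

    # Hypothetical journal: citations received this year to its papers from the
    # past three years, and the number of papers it published in those years.
    citations_to_past_three_years = 450
    papers_in_past_three_years = 300

    # Raw impact per paper, as described above.
    raw_impact_per_paper = citations_to_past_three_years / papers_in_past_three_years  # 1.5

    # Assumed citation potential of the journal's subject field
    # (how often papers in this field are typically cited).
    field_citation_potential = 0.75

    # SNIP normalises the raw impact by the field's citation potential, so the
    # same raw impact yields a higher SNIP in a field where citations are rare.
    snip = raw_impact_per_paper / field_citation_potential
    print(round(snip, 2))  # 2.0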

SNIP:

  • Takes a research field’s citation frequency into account
  • Considers immediacy - how quickly a paper is likely to have an impact in a given field
  • Accounts for how well the field is covered by the underlying database
  • Calculates without use of a journal’s subject classification to avoid delimitation
  • Counters any potential for editorial manipulation

    The Australian Business Deans Council (ABDC) Journal Quality List classifies journals into four mutually exclusive categories, namely A*, A, B and C, with A* representing the highest quality journals and C the lowest. 2022 saw a small review, with a few journals added and removed; see their draft recommendations. 2024/2025 will see an update of the entire list.

Most researchers are familiar with well-established journals and conferences in their field. They are often less familiar with newer publications or publications in related fields, as there are simply too many.

Google Scholar provides an overview of publications to help researchers decide where to publish. You can browse the top 100 publications in several languages, ordered by their five-year h-index and h-median metrics, as well as select specific categories and view the top journals in specific disciplines. Google Scholar also allows you to search publications by title or keywords, and on the main page of Google Scholar Metrics you can search by field or specific journal.
 

Finding Google Scholar Metrics

  • From Google Scholar, select the menu in the top left corner, and click metrics
  • This will display the top publications within Google Scholar
  • Select view top publications at the bottom of the list
  • This will display the top 100 publications within Google Scholar and can be filtered by Category and Language at the top of the list
  • Selecting the magnifying glass in the top right hand corner will allow you to search for specific publications using the title or keywords
  • Once you've searched for a journal, Google Scholar will display the 5-year h-index and the h-median metrics.

 


 

CABS Journal List

The Chartered Association of Business Schools' Academic Journal Guide (AJG) is a guide to the range and quality of journals in which business and management academics publish their research. Its purpose is to give both emerging and established scholars greater clarity as to which journals to aim for, and where the best work in their field tends to be clustered.

The AJG is based upon peer review, editorial and expert judgements following the evaluation of publications, and is informed by statistical information relating to citations. It is updated every three years.

Impact Factor (IF) can be accessed from the Web of Science database and is the most commonly used measurement to determine the reputation of a journal in relation to other journals in a specific field. The calculation of the IF is based on the average number of times the articles of a journal are cited over a two- or five-year period.

Keep in mind:

  • Many journals do not have an impact factor
  • The impact factor cannot assess the quality of individual articles. The impact factor only measures the interest of other researchers in an article, not its importance and usefulness.
  • Only research articles, technical notes and reviews are “citable” items. Editorials, letters, news items and meeting abstracts are “non-citable items”.
  • Only a small percentage of articles are highly cited and they are found in a small subset of journals. This small proportion accounts for a large percentage of citations.
  • Controversial papers, such as those based on fraudulent data, may be highly cited, distorting the impact factor of a journal.
  • Citation bias may exist. For example, English language resources may be favoured. Authors may cite their own work

    For all these reasons the impact factor for a journal should not be looked at in isolation.


 

Web of Science's Journal Citation Reports (JCR) ranks, evaluates and compares scholarly journals in all areas of the sciences and social sciences. The results can be used to determine which journals are the most important and influential in their respective disciplines, based on impact and citations. JCR gives an indication of influence and impact at a category level through its Impact Factor, showing citation relationships between journals.
The journal Impact Factor is the average number of times articles from the journal published in the past two years have been cited in the Journal Citation Reports (JCR) year.
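As a worked illustration of this two-year calculation, here is a minimal Python sketch; all figures are invented for the example and are not real JCR data.

    # Hypothetical Impact Factor for 2023: citations received in 2023 to items
    # the journal published in 2021 and 2022, divided by the number of citable
    # items it published in those two years.
    citations_in_2023_to_items_from_2021_2022 = 600
    citable_items_2021 = 120
    citable_items_2022 = 130

    impact_factor_2023 = citations_in_2023_to_items_from_2021_2022 / (
        citable_items_2021 + citable_items_2022
    )
    print(round(impact_factor_2023, 2))  # 2.4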

CiteScore is available on Scopus and is calculated as the average number of citations received in a calendar year by all items published in that journal in the preceding three years. CiteScore measures a journal's influence and impact, ranking journals within their specific subject categories, and is Scopus' version of the Impact Factor from Web of Science.

SCImago Journal Rank (SJR) ranks journals included in the Scopus database. It counts not only the number of citations to articles in a journal but also takes into account the 'quality' of the citing journal. Therefore, with SJR, the subject field, quality and reputation of the citing journal have a direct effect on the value of a citation (a simplified illustration follows the list below). The SJR:

  • Is weighted by the prestige of the journal, thereby ‘leveling the playing field’ among journals
  • Eliminates manipulation: the only way to raise the SJR ranking is to be published in more reputable journals
  • ‘Shares’ a journal’s prestige equally over the total number of citations in that journal
  • Normalizes for differences in citation behaviour between subject fields
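The sketch below is only a simplified illustration of the idea that a citation from a prestigious journal counts for more than one from an obscure journal; it is not the actual SJR algorithm, which is an iterative, PageRank-style calculation over the whole Scopus citation network, and the journal names and numbers are invented.

    # Invented prestige scores for three citing journals (higher = more prestigious).
    prestige = {"Journal A": 2.0, "Journal B": 1.0, "Journal C": 0.2}

    # Invented counts of citations our journal received from each of them.
    citations_from = {"Journal A": 10, "Journal B": 10, "Journal C": 10}

    # A plain count treats every citation equally.
    plain_count = sum(citations_from.values())

    # A prestige-weighted count, in the spirit of SJR, values each citation
    # according to the standing of the journal it comes from.
    weighted_count = sum(prestige[j] * n for j, n in citations_from.items())

    print(plain_count)     # 30
    print(weighted_count)  # 32.0, dominated by the more prestigious journals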

Author Impact and Identifiers

Author impact: The number of works a researcher has published and the number of times these works have been cited can be an indicator of the academic impact of an individual researcher.

The h-index can be regarded as a measure of both the number of publications a researcher has published (productivity) and how often those publications are cited (impact).
The h-index always uses the same formula, but different sources give different results because each applies the formula to a different data set. Because Web of Science, Scopus and Google Scholar index different data, they will report different h-index values for the same author.

Note:
In general you can only compare values within a single discipline.

The h-index correlates with the length of a researcher's career (i.e., researchers who have been publishing for longer tend to have higher h-indices). It can also be inflated by self-citation. Self-citation should only be done when truly appropriate.

The h-index may be less useful in some disciplines, particularly some areas of the humanities.

For more information on the h-index formula, see Library guide:
Bibliometrics and citation analysis: Author impact (h-index)

First, conduct an author search (Last name, First initial) (you can also choose to select an author from the index):

Author search in Web of Science

Second, click on the "Create Citation Report" link on the right-hand side of the results page:

Link to "Create Citation Report" in Web of Science results

Finally, you will get a report with a number of metrics and data on the author you searched, including the h-index.

Example of a Web of Science Author Impact

 

Log on to Scopus (Follow the path: Library homepage > Search > E-databases > Scopus).

Click on the Author Search tab at the top of the screen.

Enter the author name as indicated.

Select the author profile(s).

The H-Index is displayed on the right. 

You can create a Google Scholar Profile that displays your scholarly works in Google Scholar, including the number of citing publications, as well as your h-index and i10-index. h-index: the h-index is a measure of the number of publications published (productivity) as well as how often they are cited (impact). Example: if your h-index is 20, it means that 20 of your publications have been cited 20 times or more. i10-index: the number of publications with at least 10 citations. If your i10-index is 2, it means that 2 of your publications have been cited 10 times or more.
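As an illustration of how both indices are derived from a list of citation counts, here is a minimal Python sketch; the citation counts are invented for the example.

    # Citation counts for one author's publications (hypothetical numbers).
    citations = [48, 33, 25, 14, 12, 10, 9, 4, 2, 0]

    # h-index: the largest h such that h publications have at least h citations each.
    ranked = sorted(citations, reverse=True)
    h_index = sum(1 for rank, count in enumerate(ranked, start=1) if count >= rank)

    # i10-index: the number of publications with at least 10 citations.
    i10_index = sum(1 for count in citations if count >= 10)

    print(h_index)    # 7, because 7 papers have 7 or more citations
    print(i10_index)  # 6, because 6 papers have 10 or more citations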

Tip: Create a Google Scholar profile and make your profile public.
If you have a common name, consider choosing the "Don't automatically update my profile" option; that way Google Scholar will e-mail you to confirm that you are the author of a new paper before adding it to your profile.

Results with hyperlinked (underlined) author names allow you to click on the name and see the author's profile. For authors with public profiles you can find an h-index.

Author Identifiers connect your name(s) with your work throughout your career. It is important because:

- it provides a means to distinguish you from other authors with the same or similar names

- it links all your work even if you have used different names during your career

- it helps others to find your research output easily (i.e. funders, other researchers, etc.)

- it ensures that your work is clearly attributed to you.

Open Researcher and Contributor ID (ORCID) is a non-profit organisation that provides a registry of unique researcher identifiers, linking research activities and outputs to these identifiers. Stellenbosch University has been a member of the ORCID network since 2015.
Note: it is compulsory to have an ORCID iD when applying for NRF funding or rating.
Three steps are required:
1. Create your ORCID iD (takes 30 seconds)
2. Connect your ORCID and SU network identities (if you want to connect and update your ORCID profile, log in to your account at http://orcid.org)
3. Add your publications (see guide for steps and assistance)

See also Library ORCID guide for detailed steps, videos and more information about ORCID.

ORCID is similar to Publons (formerly ResearcherID), Scopus Author ID, ISNI and other systems for identifying and distinguishing authors/researchers and creators.

Scopus Author ID
The Scopus database automatically assigns an ID profile to authors to help identify and link their publications. You can check your current Scopus Author ID and publications by running an author search on Scopus using your name and current affiliation. You can manage your profile and check that your publications are correct using the Scopus to ORCID wizard, which will link the publications associated with your Scopus Author ID to your ORCID record.
 

Publons (formerly ResearcherID)
ResearcherID is a unique identifier used to distinguish your publications on the Web of Science database, and is now fully integrated with Clarivate Analytics' Publons platform. Once you have registered, you can identify and claim your publications indexed in Web of Science, and your ResearcherID will then be associated with these works and they will be added to your Publons profile. You can also import publications to your Publons profile using ORCID.