This guide will help you determine the impact of a specific scholarly work or article (citation analysis), of a journal, or of an author.
Bibliometrics is the overall name for the use of statistical methods in the analysis of publications, focusing on authors, sources, and citations. Its main uses are to provide evidence of academic impact, to rank and benchmark data, to assess individual researchers, and to rank journals.
Keep in mind: The citation information in databases is based only on the information contained in those databases. Citation metrics for individual papers may be different based on which database you use. Gather metrics from a variety of sources in order to get the most complete picture.
Research impact is an important aspect in securing government funding for research activity, NRF rating, institutional ranking and the management of research output within the university sector.
This has placed considerable emphasis on tracking citations of a researcher's published works and publishing within highly-ranked journals. Citations are widely regarded as the most important indicators of research impact.
Citation Impact: The academic impact of particular works, such as journal articles, conference proceedings, and books, can be measured by the number of times they are cited by other works.
The three best known databases and citation analysis tools are Web of Science, Scopus and Google Scholar.
Web of Science indexes a wide variety of sources and has strict criteria for inclusion that are constantly evaluated. It includes an array of metrics and is DHET accredited.
Web of Science is a great citation management tool and allows you to create citation reports, maps and alerts. Choose to create a Citation Alert to be notified when a specific work is cited.
Note: Before searching, make sure to remove the Emerging Sources Citation Index (not part of the DHET list)
Scopus indexes a wide variety of sources and has strict criteria for inclusion that are constantly evaluated. It includes an array of metrics and is DHET accredited. Scopus is a large database of peer-reviewed literature in all subject areas. In addition to citation searching, it features tools to track, analyze and visualize research. Scopus complements Web of Science and Google Scholar, but each of them contains enough unique information that none of them should be used exclusively.
Google Scholar allows you to search from one place, across all disciplines and sources: peer-reviewed papers, theses, books, abstracts and articles, from academic publishers, professional societies, repositories, universities and other scholarly organizations. It includes citations from an array of sources in its "Cited by" calculation, including PowerPoint and Word documents, and gives everything an equal rank.
Google Scholar Citations provides a simple way for authors to keep track of citations to their articles. You can check who is citing your publications, graph citations over time, and compute several citation metrics. You can also make your profile public, so that it may appear in Google Scholar results when people search for your name.
Here is a quick summary of what to expect from the three best known citation analysis tools.
University of Michigan Library
| | Web of Science | Scopus | Google Scholar |
|---|---|---|---|
| Disciplines | | | Theoretically, all disciplines |
| Time Span | Some journals from 1900 | Some journals from the 1820s | Some citations as far back as the 1660s and 1670s |
| Updated | Weekly | Daily | Unknown but generally quick |
Altmetrics (short for alternative metrics) focuses on social media and the social engagement around your work. Social media is a much quicker form of communication than scholarly citations. Altmetrics includes: number of tweets, blog posts, likes, bookmarks, downloads, click-through numbers, peer collaboration tools, etc. in social media and the web.
Other criteria you can use to think about a journal’s impact:
The checklist (Think, Check, Submit) is a tool that will help you discover what you need to know when assessing whether or not a journal is suitable for your research.
Author impact: Variants (m-, g-, e-indices)
What is a predatory journal?
Predatory journals are pseudo-academic journals that exist to exploit authors for money, without delivering on the promise of rigorous academic quality. They typically exploit the Open Access publishing business model where authors pay a fee to make their work freely available to the public. Predatory journals are a concern because they are often difficult to identify.
How to spot a predatory (or possible predatory) journal?
Predatory journals are becoming more and more difficult to spot, since predatory publishers go to great lengths to make them seem legitimate. They can (and will) have ISSNs, assign DOIs to their articles and have professional websites, making it even more difficult to make a definitive assessment of journals. Check for:
Are there lists I can check to ensure I don't fall prey to a predatory publisher?
Beall's list - As an academic librarian concerned with predatory publishers, Jeffrey Beall started compiling his list of predatory publishers. His list received much criticism, and he has subsequently taken it down. Please keep in mind that this list is no longer updated and was based on a subjective assessment of journals.
DOAJ - the Directory of Open Access Journals is a list of reputable, evaluated journals and is the first stop for checking the credibility of any Open Access Journal.
Cabell's International - compiles a list of journals that have penalties against them, which could be an indication of predatory activity.
The Library and Information Service has signed agreements with a number of publishers that make it easier for SU researchers to publish open access (OA) with these publishers, and in some cases to even publish OA without paying any article processing charge (APC). Many of these agreements have been negotiated by the South African National Library and Information Consortium (SANLiC), of which Stellenbosch University (SU) is a member.
Please keep up to date with all the agreements signed with publishers by visiting this library guide on Open Access Publishing.
These are some of the publishers with whom we have signed free-of-charge OA agreements:
These are some of the publishers with whom we have signed discounted OA agreements:
These are, however, updated at regular intervals, so please go to the above library guide for more information.
Journal rankings can reveal a journal's influence by looking at how often a journal's articles have been cited. Various methodologies exist to rate and rank journals on different criteria.
Journal rankings can help a researcher determine which journals they should try to publish in and which journals are the most respected in a specific field.
Factors that Influence Journal Impact
The average citation level of a journal is a limited indicator, and is not a replacement for expert, qualitative assessment of the journal
Note: Journal metrics are used to evaluate the quality or caliber of the journal in which articles are published and should not be used to make any assessment at the article level. The Impact Factor and CiteScore are not indications of article quality.
In South Africa, only articles published in "accredited journals" are considered for government subsidy. Only journals included in the lists/indices compiled by the Department of Higher Education and Training are considered "accredited" and will be taken into account for government subsidy and NRF rating and evaluation.
Since 2021, the DHET List includes the Directory of Open Access Journals (DOAJ).
How to get a South African journal accredited by DHET
For more information on getting a journal accredited by the DHET, visit the Division for Research Development's Output Survey Page.
Source Normalised Impact per Paper (SNIP) ranks journals included in the Scopus database. SNIP measures actual citations received relative to citations expected for the serial’s subject field. The impact of a single citation is given a higher value in subject areas where citations are less likely, and vice versa. For example, citation counts in the Life Sciences tend to be higher than in the Arts and Humanities; SNIP “levels the playing field”.
SNIP is calculated from the number of citations given in the present year to publications in the past three years, divided by the total number of publications in the past three years, normalised for the citation potential of the journal's subject field. A journal with a SNIP of 1.0 performs at the average level for journals in that field.
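As a rough sketch, the citations-per-paper component of the calculation described above can be expressed as a simple division. The numbers below are hypothetical, and the final field normalisation step performed by Scopus is omitted here:

```python
# Hypothetical example of the citations-per-paper component behind SNIP.
citations_this_year_to_past_3y = 450  # citations received in the present year
papers_past_3y = 300                  # items published in the past three years

raw_impact_per_paper = citations_this_year_to_past_3y / papers_past_3y
print(raw_impact_per_paper)  # 1.5
```

In the full SNIP calculation this raw value is then divided by the citation potential of the journal's subject field, which is what makes values comparable across disciplines.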
The Australian Business Deans Council (ABDC) Journal Quality List classifies journals into four mutually exclusive categories, namely A*, A, B and C, with A* representing the highest-quality journals and C the lowest. The 2022 review added and removed a few journals; see their draft recommendations. An update is expected in 2024/2025.
Most researchers are familiar with well-established journals and conferences in their field. They are often less familiar with newer publications or publications in related fields, as there are simply too many.
Google Scholar provides an overview of publications to help researchers decide where to publish. You can browse the top 100 publications in several languages, ordered by their five-year h-index and h-median metrics, as well as select specific categories and view the top journals in specific disciplines. Google Scholar also allows you to search publications by title or keywords, and on the main page of Google Scholar's Metrics you can search by field or specific journal.
The Chartered Association of Business Schools - Academic Journal Guide, is a guide to the range and quality of journals in which business and management academics publish their research. Its purpose is to give both emerging and established scholars greater clarity as to which journals to aim for, and where the best work in their field tends to be clustered.
The AJG is based upon peer review, editorial and expert judgements following the evaluation of publications, and is informed by statistical information relating to citations. It is updated every three years.
Web of Science's Journal Citation Reports (JCR) ranks, evaluates and compares scholarly journals in all areas of the sciences and social sciences. Results can be used to determine which journals are the most important and influential in their respective disciplines based on high impact and citations. JCR gives an indication of influence and impact at a category level through its Impact Factor, showing citation relationships between journals.
The journal Impact Factor (IF) can be accessed from the Web of Science database and is the most commonly used measurement of a journal's reputation relative to other journals in a specific field. It is the average number of times that articles published in the journal in the past two years (or five years, for the five-year variant) were cited in the Journal Citation Reports (JCR) year.
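The two-year calculation described above can be illustrated with a small worked example. The figures are hypothetical, not taken from any real journal:

```python
# Hypothetical two-year Impact Factor calculation for a 2024 JCR year.
citations_in_jcr_year = 600       # 2024 citations to articles published in 2022-2023
citable_items_two_years = 200     # articles the journal published in 2022-2023

impact_factor = citations_in_jcr_year / citable_items_two_years
print(impact_factor)  # 3.0
```

An Impact Factor of 3.0 would mean that, on average, each recent article in the journal was cited three times during the JCR year.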
Keep in mind:
CiteScore is available on Scopus and calculates the average number of citations received in a calendar year by all items published in that journal in the preceding three years. CiteScore measures a journal's influence and impact, ranking journals within their specific subject categories, and is Scopus' version of Web of Science's Impact Factor.
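CiteScore follows the same averaging idea as the Impact Factor, but over the three-year window described above. A hypothetical example:

```python
# Hypothetical CiteScore calculation using a three-year publication window.
citations_in_calendar_year = 900  # citations received in the calendar year
items_past_three_years = 300      # all items published in the preceding three years

citescore = citations_in_calendar_year / items_past_three_years
print(citescore)  # 3.0
```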
SCImago Journal Rank (SJR) ranks journals included in the Scopus database. It calculates not only the number of citations to articles in journals but also takes into account the ‘quality’ of the citing journal. Therefore, with SJR, the subject field, quality and reputation of the citing journal have a direct effect on the value of a citation. The SJR:
Author impact: The number of works a researcher has published and the number of times these works have been cited can be an indicator of the academic impact of an individual researcher.
The h-index can be regarded as a measure of the number of publications published (productivity) as well as how often they are cited (impact).
The h-index always uses the same formula, but the result differs between sources because each applies the formula to a different data set. Because Web of Science, Scopus and Google Scholar each index different content, they will always produce different numbers.
In general you can only compare values within a single discipline.
The h-index correlates with the length of a researcher's career (i.e., researchers who have been publishing for longer tend to have higher h-indices). It can also be inflated by self-citation. Self-citation should only be done when truly appropriate.
The h-index may be less useful in some disciplines, particularly some areas of the humanities.
For more information on the h-index formula, see Library guide:
Bibliometrics and citation analysis: Author impact (h-index)
First, conduct an author search (Last name, First initial) (you can also choose to select an author from the index):
Second, click on the "Create Citation Report" link on the right-hand side of the results page:
Finally, you will get a report with a number of metrics and data on the author you searched, including the h-index.
Log on to Scopus (Follow the path: Library homepage > Search > E-databases > Scopus).
Click on the Author Search tab at the top of the screen.
Enter the author name as indicated.
Select the author profile(s).
The H-Index is displayed on the right.
You can create a Google Scholar Profile that displays your scholarly works in Google Scholar, including the number of citing publications, as well as your h-index and i10-index. h-index: The h-index is a measure of the number of publications published (productivity) as well as how often they are cited (impact). Example: if your h-index is 20, it means that 20 of your publications have been cited 20 times or more. i10-index: The number of publications with at least 10 citations. If your i10-index is 2, it means that 2 of your publications have been cited 10 times or more.
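The i10-index is simpler than the h-index: it is just a count of papers crossing the 10-citation threshold. A minimal sketch with hypothetical citation counts:

```python
def i10_index(citations):
    """i10-index: number of publications with at least 10 citations."""
    return sum(1 for cites in citations if cites >= 10)

# Papers cited 25, 12, 10, 9 and 3 times: three of them have 10+ citations.
print(i10_index([25, 12, 10, 9, 3]))  # 3
```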
Tip: Create Google Scholar profile and make your profile public.
If you have a common name, consider choosing the "Don't automatically update my profile" option -- that way Google Scholar will e-mail you to confirm that you are the author of a new paper before adding it to your profile.
Results with hyperlinked (underlined) author names allow you to click on the name and see the author's profile. For authors with public author profiles you can find an h-index.
Author Identifiers connect your name(s) with your work throughout your career. It is important because:
- it provides a means to distinguish you from other authors with the same or similar names
- it links all your work even if you have used different names during your career
- it helps others to find your research output easily (i.e. funders, other researchers, etc.)
- it ensures that your work is clearly attributed to you.
Open Researcher and Contributor ID (ORCID) is a non-profit organisation that provides a registry of unique researcher identifiers, linking research activities and outputs to these identifiers. Stellenbosch University (SU) has been a member of the ORCID network since 2015.
Note: it is compulsory to have an ORCID iD when applying for NRF funding or rating.
Three steps are required:
1. Create your ORCID iD (takes 30 seconds)
2. Connect your ORCID and SU network identities. (If you want to connect and update your ORCID profile, log in to your account at http://orcid.org.)
See also Library ORCID guide for detailed steps, videos and more information about ORCID.
ORCID is similar to Publons (formerly ResearcherID), Scopus Author ID, ISNI and other systems for identifying and distinguishing authors/researchers and creators.
Scopus Author ID
The Scopus database automatically assigns an ID profile to authors to help identify and link their publications. You can check your current Scopus author ID and publications by running an author search on Scopus using your name and current affiliation. You can manage your profile and check that your publications are correct using the Scopus to ORCID wizard, which will link the publications associated with your Scopus author ID to your ORCID.
Publons (formerly ResearcherID)
ResearcherID is a unique identifier used to distinguish your publications on the Web of Science database, and is now fully integrated with Clarivate Analytics' Publons platform. Once you have registered, you can identify and claim your publications indexed in Web of Science, and your ResearcherID will then be associated with these works and they will be added to your Publons profile. You can also import publications to your Publons profile using ORCID.