"One number from one tool is not enough to measure the quality of a journal"



Cabell’s International was founded in 1978 by Dr. David W.E. Cabell, who wanted to make it easy for “tenure committees, professors, researchers and doctoral students to find detailed information for the purpose of evaluating and selecting academic journals.” Over the years, Cabell’s has become a trusted source for journal information, helping researchers from different disciplines identify the journal most suited for their paper. The Cabell’s directory provides information on journals’ publication processes, acceptance rates, review processes, acceptance timelines, and contact details. Beyond this, Cabell’s also provides unique tools to help evaluate research impact: the CCI, DA, and IPA. The Cabell’s Classification Index (CCI)© gives users much-needed context for citation metrics by ranking journals within each of the disciplines and topics they publish in. The Difficulty of Acceptance (DA)© is a rating of how hard it is to get a manuscript accepted by a journal, based on the extent to which that journal publishes authors from top-performing research institutions. The Institutional Publishing Activity (IPA)© score helps users compare the performance of their institution’s research program with that of others.

To understand exactly how Cabell’s provides such detailed and useful information, I spoke to Lacey E. Earle, Vice President of Business Development for Cabell’s International. Lacey leads corporate relations and business partnerships at Cabell’s. She is well-versed in the many facets of academic publishing, with over 12 years of experience in the industry. In her tenure at Cabell’s, she has successfully shepherded the company through the development and implementation of its proprietary journal evaluation systems and led the company's expansion from journal data collection and processing to journal ranking and evaluation.

In this interview, Lacey talks about how Cabell’s helps bridge major information gaps in academic publishing and explains how the directory and tools it provides are useful for early-career researchers as well as senior scholars. She also explains that it is essential for the academic research community to move away from using a single metric for evaluating research impact; instead, scholars should use a variety of metrics that help evaluate different aspects of published research and understand the real impact of science.

How does Cabell’s work? How do you connect “researchers, librarians and academics to the journal titles they need”?

Cabell’s provides more detailed information about more journals than any other academic resource available today. We connect researchers to the journals they need by providing detailed information on over 11,000 journals, which can help our subscribers judge whether a prospective journal is legitimate, find out whether a journal they have in mind suits their needs, or discover an entirely new journal that would be a good target for publication.

Interestingly, researchers are not the only ones who need reliable information about academic journals. Librarians use our system to decide which journals to include in their collections and which journals to recommend to researchers. Academics in the tenure process rely on Cabell’s as a journal whitelist and a source of quality metrics. For example, when a professor submits his CV to the tenure committee, often the committee will refer to our database to learn more about and verify the legitimacy of the journals in which that professor has published.

The evaluative tools and the sheer amount of information we offer make Cabell’s indispensable for each of these groups. In addition to the metrics, we provide the following details about each journal:

  • Editor's name & contact info
  • Submission process info (how and where to submit manuscripts)
  • Journal website
  • Yearly acceptance rate
  • Yearly percentage of invited articles
  • Type of review process (blind, double-blind, etc.)
  • Number of external reviewers for each manuscript
  • Number of internal reviewers for each manuscript
  • The average time it takes to review a manuscript
  • The average time it takes from submission of a manuscript until actual publication
  • Whether the author can receive a copy of the reviewers' comments
  • Whether the journal is using a plagiarism screening tool
  • The average required length for the manuscript (in words or pages)
  • The reader type (academics, practitioners, etc.)
  • The journal's sponsor(s)
  • The frequency of issue
  • The year that the journal began publishing
  • ISSN and EISSN
  • The topics on which the journal will accept manuscripts
  • Open access model (green, gold, hybrid, etc.)
  • The Aims & Scope for the journal
We also allow editors to send a written "Statement of Impact" describing their journal's influence in real-world practice.

Generally, one might expect that early-career researchers would need help with selecting the right journal for their paper. How do tenured committee members and more experienced researchers, who have more publication and journal selection experience, use Cabell’s?

It is true that many universities incorporate the use of our database into their Research Methods courses so that PhD students can experience the research process from idea inception to publication. Early-career researchers benefit from the wealth of information and powerful search tools provided by Cabell’s when selecting outlets for their work. Tenured professors generally use our database for more evaluative purposes. Our metrics are designed to help professors appraise publication records for potential new hires, academic accreditation, and promotion.

How do you shortlist journals for inclusion in the directory? Do you target only English-language journals?

We are contacted by 50-60 journals every month seeking inclusion in our database. We also receive nominations from subscribers, other researchers and publishers. Our journal selection staff conducts a preliminary screening on each of these journals. The staff is composed of data specialists who are trained to collect and examine data about journals and make decisions about quality. If a journal passes the preliminary screening, our staff will send the journal an official application for inclusion. We require journals to provide documentation of their policies, practices, and finance and revenue sources. Once all the data has been collected, our staff thoroughly reviews all the information in accordance with our Selection Policy. If the journal is found to be in compliance with our requirements, an invitation to be included in our database will be issued. This is by no means a short process and it generally takes 6 to 8 months from nomination to inclusion.

As we expand into the European and Asian markets, we can’t expect to only list English-language journals. At this time, the majority of our database is made up of English-language journals, but we do include journals in other languages. All journals are put through the same review process, regardless of the language. If a journal meets our selection criteria, we will include it in our database.

How can authors be assured of the quality of journals listed in Cabell’s? How do you evaluate the merit of the journals?

We are a whitelist. What we mean by this is that every journal in our database is a legitimate source of scholarly publication. We have taken the guesswork out of evaluating the “safety” of submitting a manuscript. As for assuring the quality or impact of individual journals, subscribers have full access to all of the quality metrics that we provide. We have partnerships with both Thomson Reuters and Elsevier to use their citation data. We display Thomson Reuters’ Impact Factors on our website, and we use Scopus citation counts to create our own evaluation of quality, the Cabell’s Classification Index©.

What is the Cabell’s Classification Index©? How does it work? Also, what value does it hold for authors/users?

The Cabell’s Classification Index© (CCI) is a robust metric we developed from raw citation data that is licensed from Scopus®. Basically, it works by analyzing the number of citations a journal receives, but what makes the CCI unique is that it is a contextual citation metric — it is normalized per discipline and topic. This means a particular journal is not assigned a single ranking with the CCI; it is evaluated at each discipline and topic level. If we consider a journal that specializes in the discipline of accounting, the CCI is calculated by comparing that journal’s citation rate only with other journals that also specialize in accounting. The comparison is calculated at each discipline and each topic available. This unique feature, the contextual perspective, is what really empowers our users when evaluating journals using the CCI. The quality of a journal can be assessed quickly, without the user needing to know what counts as a good citation rate in any particular field, a benchmark that varies widely from one field to the next.
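
The exact CCI calculation is proprietary, but the general idea of a contextual, discipline-normalized citation metric can be sketched in a few lines of Python. Everything below, the journal names, disciplines, citation rates, and the simple percentile-style score, is invented purely for illustration and is not Cabell’s actual algorithm:

    # Illustrative only: invented data and a simple percentile-style score,
    # not Cabell's proprietary CCI calculation.
    from collections import defaultdict

    # Hypothetical (journal, discipline, citations per article) records.
    journals = [
        ("Journal A", "Accounting", 2.1),
        ("Journal B", "Accounting", 4.8),
        ("Journal C", "Accounting", 3.3),
        ("Journal D", "Marketing", 9.0),
        ("Journal E", "Marketing", 6.5),
    ]

    # Group citation rates by discipline so each journal is only ever
    # compared with its own peers.
    by_discipline = defaultdict(list)
    for _, discipline, rate in journals:
        by_discipline[discipline].append(rate)

    def contextual_score(rate, discipline):
        """Fraction of same-discipline peers with a lower citation rate."""
        peers = by_discipline[discipline]
        if len(peers) < 2:
            return 1.0
        return sum(1 for r in peers if r < rate) / (len(peers) - 1)

    for name, discipline, rate in journals:
        print(f"{name} ({discipline}): raw rate {rate}, "
              f"contextual score {contextual_score(rate, discipline):.2f}")

In this toy example, Journal B (4.8 citations per article) and Journal D (9.0) each score 1.00 because each leads its own discipline; the raw rates are never compared across fields.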

Does the CCI also include multi-disciplinary journals within its scope?

The CCI is designed primarily as a tool to evaluate journals across and within disciplines. A multi-disciplinary journal will have a CCI for each discipline and because the scores are normalized it is easy to compare the journal’s influence in each discipline. Additionally, this makes it easy to discern how the journal performs relative to single discipline journals and other multi-disciplinary journals.

Could you tell us a bit about the other tools you provide – Difficulty of Acceptance© and Institutional Publishing Activity© ratings?

The Difficulty of Acceptance© (DA) and Institutional Publishing Activity© (IPA) are also citation-based metrics derived from raw citation data licensed from Scopus® but they measure the publishing landscape from different perspectives to help our users develop a comprehensive view.

The IPA basically attributes citations to the authors’ affiliated institutions rather than the journal in which the articles were published. This accumulation of institutional citations is then run through a statistical algorithm that compares each institution. Much like the CCI, the comparisons are performed within the context of each discipline and topic. The IPA is useful for measuring the publishing output of institutions within any field our users are interested in; institutions can be compared and evaluated in the context of any discipline or topic. It is also useful to researchers who are looking for an institution that matches their research goals in terms of quality and focus. One specific way that researchers can use the IPA is to get a snapshot of a university’s research culture. If a researcher is looking to advance to an institution with a strong program in his or her discipline, he or she would look for a university with a pattern of high-quality activity in that field. The IPA captures these patterns and associates them with specific institutions to help researchers make these kinds of decisions.

Researchers who are not looking to move to a new institution can use the IPA to compare their level of research with that of their institution at large. Similarly, administrators can use their institution’s IPA as a benchmark for evaluating research for hiring and tenure. Administrators in the hiring process can also spot standout candidates by comparing applicants’ research to the IPAs of their previous affiliations. This helps set apart researchers with particular dedication and success in their research.
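
As with the CCI, the statistical comparison behind the IPA is not public, but the attribution idea Lacey describes, crediting citations to the authors’ institutions rather than to the journal and then comparing within a discipline, can be sketched roughly as follows. The institutions, disciplines, and citation counts here are invented for illustration only:

    # Illustrative only: invented data; the real IPA algorithm is not published.
    from collections import defaultdict

    # Hypothetical article records: (authors' institution, discipline, citations received).
    articles = [
        ("University X", "Economics", 120),
        ("University X", "Economics", 45),
        ("University Y", "Economics", 60),
        ("University Y", "Finance", 200),
    ]

    # Credit each article's citations to the institution, not the journal,
    # keeping the discipline as context.
    totals = defaultdict(int)
    for institution, discipline, citations in articles:
        totals[(institution, discipline)] += citations

    # Rank institutions within each discipline.
    by_discipline = defaultdict(list)
    for (institution, discipline), count in totals.items():
        by_discipline[discipline].append((count, institution))

    for discipline, scores in by_discipline.items():
        ranked = [inst for _, inst in sorted(scores, reverse=True)]
        print(f"{discipline}: {ranked}")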

The DA offers a unique look into the publishing trends of individual journals. Without getting too deep into the algorithm that is used to calculate the DA, I can explain that the DA basically measures to what extent journals have a pattern of publishing articles by authors from high-performing institutions in their disciplines. Based on the caliber of authors published in a journal, the DA quantifies the likelihood of having a manuscript accepted to that publication.

Today, there are several tools available that help researchers find suitable journals for their papers. What distinguishes Cabell’s from other journal selection tools?

First and foremost, Cabell’s is the only database of our kind that has specially trained staff actively hand-sourcing and verifying information about journals. Most other systems rely on passive data collection, meaning that they wait for journals to come to their website and fill in whatever information they wish. Because we have dedicated journal researchers, we are able to collect and compile information that is unavailable in other databases.

Second, though we have our own quality metrics, Cabell’s strives to provide our users with as many evaluative tools as we can. As mentioned, we have partnered with both Thomson Reuters and Elsevier to offer our subscribers more than one unbiased, objective measure of journal quality. We hope to continually develop and include more quality measures as the publishing landscape changes.

Finally, we are the only resource that has been specifically designed to meet the needs of such a wide spectrum of users. Our dedication to every facet of academic publishing makes us stand out.

More and more academic publishing experts feel that the impact factor has been abused/overused and needs to be replaced. However, you mention that it can be used in conjunction with other metrics, so that users have more than one unbiased measure of journal quality. Could you elaborate on how using multiple tools could better serve the research community?

In short, one number from one tool is not enough information to appropriately measure or convey the quality of a journal. Attempting to measure a journal by a single number is an attempt to measure a complex and widely varying creature by one attribute and then, based on that attribute, asking, “How good is this journal?” A better question, one which addresses the real complexities of journal quality, is, “In what way and to whom is this journal important?”

Researchers seeking to have their work used, built upon, and integrated into our collective knowledge must concern themselves with delivering their publications to the appropriate audience. Using a variety of tools, including the Impact Factor, the CCI, and anything else at their disposal, researchers can implement a data-driven approach to journal selection. This approach, however, must be acknowledged and appreciated by those in the position to evaluate researchers for career progression. Instead of only encouraging researchers to find and publish in the “best” journal with the “highest score,” administrators need to ask whether their faculty’s body of research ended up in an appropriate journal. They need to ask whether a faculty member’s work was published in a journal that will put their research into the hands of those who can make the most of it. Otherwise, when we focus on a single number or any other individual measure as an end in itself, we have strayed from the purpose of the whole research process.

Where does your blog – “The Source” – fit into the larger picture of Cabell’s goal to facilitate scholarly communication?

The Source would be better described as a newsletter and it serves as a way for us to share articles, events, or developments that are relevant to the academic publishing landscape. We pull from a broad spectrum of sources to offer our subscribers intrigue, controversy, and sometimes a little bit of humor. Cabell's is actively engaged with the scholarly community, and we like to share our observations on trends and advancements in academic publishing. The Source goes out to around 15,000 subscribers; if you’d like to be one, click here.

Thank you, Lacey!

Published on: May 17, 2016

Jayashree Rajagopalan
Passionate about scholarly publishing, always looking to have memorable conversations with researchers and industry professionals across the globe
