Laine and Winker: Identifying predatory or pseudo-journals

This World Association of Medical Editors (WAME) document aims to provide guidance to help editors, researchers, funders, academic institutions and other stakeholders distinguish predatory journals from legitimate journals.

Over the past decade, scholarly journals that have become known as “predatory journals,” produced by “predatory publishers,” have proliferated. “Predatory” refers to the fact that these entities prey on academicians for financial profit, collecting article processing charges for open access articles without meeting scholarly publishing standards (1). Although predatory journals may claim to conduct peer review and mimic the structure of legitimate journals, they publish all or most submitted material without external peer review and do not follow standard policies advocated by organizations such as WAME, the Committee on Publication Ethics (COPE), the International Committee of Medical Journal Editors (ICMJE), and the Council of Science Editors (CSE) on issues such as archiving of journal content, management of potential conflicts of interest, handling of errata, and transparency of journal processes and policies, including fees. A common practice among predatory publishers is sending frequent e-mails to large numbers of individuals soliciting manuscript submissions and promising rapid publication for author fees that may be lower than those of legitimate author-pays journals. In the most egregious cases, they collect publication fees but the promised articles never appear on the journal website. In some cases, authors publishing in such journals are aware that the journals do not adhere to accepted standards but choose to publish in them anyway; hence, they are not “prey” (2, 3). Therefore, “pseudo-journals” may be a more accurate name.

Regardless of the name applied to them, such journals do not provide the peer review that is the hallmark of traditional scholarly publishing. As such, they fall short of being the type of publication that serves as evidence of academic performance needed to gain future research funding and academic advancement. Identifying such journals is important for authors, researchers, peer reviewers, and editors, because scientific work that is not properly vetted should not contribute to the scientific record. “Pseudo-journals” include journals that, despite being published by legitimate publishers, exist solely for marketing purposes (4); journals that do not provide peer review sufficient to identify “fake” papers (5, 6); and journals with other questionable practices (7). Predatory journals are the most prevalent type of pseudo-journal and have increased rapidly in number. A longitudinal study of article volumes and publishing market characteristics estimated that there were about 8,000 active predatory journals, with total articles increasing from 53,000 in 2010 to 420,000 in 2014 (an estimated three-quarters of authors were from Asia and Africa) (8). Therefore, this statement focuses on predatory journals.

Most academicians (and their affiliated institutions and the entities that fund their work) want their work to be published in legitimate journals. Unfortunately, the tremendous proliferation of journals – both legitimate and predatory – makes it increasingly difficult to identify predatory journals. A journal that an author has never heard of might be a legitimate new journal, a legitimate journal that is well established but read and cited far less frequently than other journals in the discipline, a journal from a part of the world with which the author is unfamiliar, or a “predatory” journal. Two substantial efforts to assist stakeholders in distinguishing predatory from legitimate journals are the now-defunct Beall’s List and the Directory of Open Access Journals (DOAJ).

From 2011 to January 2017, Jeffrey Beall, a librarian at Auraria Library and associate professor at the University of Colorado Denver, compiled annual lists of potential, possible, or probable predatory scholarly open access journals (9). In 2015, he added two additional lists – misleading metrics and hijacked journals. The misleading metrics list included companies that produce counterfeit impact factors or similar journal measures that predatory publishers use to deceive scholars into thinking that the journals are legitimate. “Hijacked journals” refers to the creation of a counterfeit website that mimics the website of a legitimate journal for the purpose of soliciting submissions and collecting fees from authors who believe they are sending their work to the legitimate journal. However, on January 17, 2017, Beall’s website was dismantled for unclear reasons (10). Beall’s lists were alarmingly lengthy, with 1155 predatory publishers and 1294 predatory journals listed as of January 3, 2017. In compiling his list, Beall used criteria (Table 1) that he based in part on two policy statements – the COPE Code of Conduct for Journal Publishers and the Principles of Transparency and Best Practice in Scholarly Publishing from WAME, COPE, DOAJ, and the Open Access Scholarly Publishers Association (OASPA) (11, 12). The effort involved in developing Beall’s list was impressive, and it was a reasonable starting point for someone who wanted to investigate a journal’s or publisher’s authenticity. However, Beall did not list the specific criteria he used to categorize a given journal as predatory, and he mistakenly blacklisted some legitimate journals and publishers, particularly those from low- and middle-income countries (LMICs) (13, 14). He used criteria such as “journals having little or no geographic diversity on their editorial boards” and “not being listed in standard periodical directories or library databases”, problems common for journals in LMICs (9, 15, 16). In addition, some criticized Beall for being biased against open access publishing models and for conflating access rules with business models (17). Other Beall criteria, while identifying potentially undesirable journal features, are not reliable indicators of predatory publication practices (e.g., exclusion of female members from the editorial board). Thus, WAME cautions against using prior appearance on Beall’s list as the sole method for determining whether a journal is predatory or legitimate.

Table 1

Beall’s criteria for identification of predatory journals and publishers*

Editor and Staff
The publisher’s owner is identified as the editor of each and every journal published by the organization.
No single individual is identified as any specific journal’s editor.
The journal does not identify a formal editorial / review board.
No academic information is provided regarding the editor, editorial staff, and/or review board members.
Evidence exists showing that the editor and/or review board members do not possess academic expertise to reasonably qualify them to be publication gatekeepers in the journal’s field.
Two or more journals have duplicate editorial boards (i.e., same editorial board for more than one journal).
The journals have an insufficient number of board members (e.g., 2 or 3 members), have concocted editorial boards (made up names), name scholars on their editorial board without their knowledge or permission or have board members who are prominent researchers but exempt them from any contributions to the journal except the use of their names and/or photographs.
There is little or no geographical diversity among the editorial board members, especially for journals that claim to be international in scope or coverage.
The editorial board engages in gender bias (i.e., exclusion of any female members).
Business management (the publisher)
Demonstrates a lack of transparency in publishing operations.
Has no policies or practices for digital preservation.
Begins operations with a large fleet of journals, often using a common template to quickly create each journal’s home page.
Provides insufficient information or hides information about author fees, offering to publish an author’s paper and later sending an unanticipated “surprise” invoice.
Does not allow search engines to crawl the published content, preventing the content from being indexed in academic indexes.
Copy-proofs (locks) their PDFs, thus making it harder to check for plagiarism.
Integrity
The name of a journal is incongruent with the journal’s mission.
The name of a journal does not adequately reflect its origin (e.g., a journal with the word “Canadian” or “Swiss” in its name when neither the publisher, editor, nor any purported institutional affiliate relates whatsoever to Canada or Switzerland).
In its spam email or on its website, the publisher falsely claims one or more of its journals have actual (Thomson Reuters) impact factors, or advertises impact factors assigned by fake “impact factor” services, or uses some made-up measure (e.g., view factor), feigning or claiming an exaggerated international standing.
The publisher sends spam requests for peer reviews to scholars unqualified to review submitted manuscripts, in the sense that the specialties of the invited reviewers do not match the papers sent to them.
The publisher falsely claims to have its content indexed in legitimate abstracting and indexing services or claims that its content is indexed in resources that are not abstracting and indexing services.
The publisher dedicates insufficient resources to preventing and eliminating author misconduct, to the extent that the journal or journals suffer from repeated cases of plagiarism, self-plagiarism, image manipulation, and the like.
The publisher asks the corresponding author for suggested reviewers and the publisher subsequently uses the suggested reviewers without sufficiently vetting their qualifications or authenticity.
Other
Re-publish papers already published in other venues/outlets without providing appropriate credits.
Use boastful language claiming to be a “leading publisher” even though the publisher may only be a startup or a novice organization.
Operate in a Western country chiefly for the purpose of functioning as a vanity press for scholars in a developing country (e.g., utilizing a mail drop address or PO box address in the United States, while actually operating from a developing country).
Provide minimal or no copyediting or proofreading of submissions.
Publish papers that are not academic at all, e.g. essays by lay people, polemical editorials, or obvious pseudo-science.
Have a “contact us” page that only includes a web form or an email address, and the publisher hides or does not reveal its location.
Poor journal standards/practice (these do not equal predatory criteria, but authors should consider them prior to manuscript submission)
The publisher copies “author guidelines” verbatim (or with minor editing) from other publishers.
The publisher lists insufficient contact information, including contact information that does not clearly state the headquarters location or misrepresents the headquarters location (e.g., through the use of addresses that are actually mail drops).
The publisher publishes journals that are excessively broad (e.g., Journal of Education) in order to attract more articles and gain more revenue from author fees.
The publisher publishes journals that combine two or more fields not normally treated together (e.g., International Journal of Business, Humanities and Technology).
The publisher charges authors for publishing but requires transfer of copyright and retains copyright on journal content, or requires copyright transfer upon submission of the manuscript.
The publisher has poorly maintained websites, including dead links and prominent misspellings and grammatical errors.
The publisher makes unauthorized use of licensed images on their website, taken from the open web, without permission or licensing from the copyright owners.
*Formerly available at https://scholarlyoa.files.wordpress.com/2015/01/criteria-2015.pdf; no longer accessible.

While the purpose of Beall’s list was to identify “predatory” journals, the DOAJ has the converse purpose of identifying legitimate open access journals (18). According to its website, “The [DOAJ] is a service that indexes high quality, peer reviewed Open Access research journals, periodicals and their articles’ metadata. The Directory aims to be comprehensive and cover all open access academic journals that use an appropriate quality control system and is not limited to particular languages or subject areas.” As of January 5, 2017, the DOAJ included 9456 journals from 128 countries. The DOAJ grants some journals the DOAJ Seal, a mark of certification for open access journals that achieve a high level of openness, adhere to best practices, and have high publishing standards (Table 2). However, the DOAJ is not a comprehensive list of all legitimate open access journals, and a journal that is not listed should not be assumed to be illegitimate or predatory. It may be a journal that has not sought inclusion in the DOAJ or that has insufficient funding to meet some of the DOAJ’s requirements. Conversely, listing in the DOAJ does not guarantee high quality – the DOAJ has a routine mechanism for users to notify it if they find a listed journal with questionable practices.

Table 2

Criteria for Receipt of the DOAJ Seal*

To receive the DOAJ Seal, journals must meet all of the following criteria:
provide permanent identifiers (e.g., DOIs) in the papers published;
provide DOAJ with article metadata;
deposit content with a long term digital preservation or archiving program;
embed machine-readable CC licensing information in articles;
allow generous reuse and mixing of content, in accordance with a CC BY, CC BY-SA or CC BY-NC license;
have a deposit policy registered with a deposit policy registry;
allow the author to hold the copyright without restrictions.
* Available at: https://doaj.org/publishers#seal.

A third approach is the “Think. Check. Submit.” checklist developed by a coalition of scholarly publishing organizations (19). These criteria (Table 3) are useful for authors considering where to submit their work but, as with the other initiatives, are not a failsafe for identifying all legitimate scholarly journals. The criterion of knowing the individuals involved in the journal makes this approach less useful for those evaluating journals from a different part of the world.

Table 3

Checklist from “Think. Check. Submit.” Initiative*

Do you or your colleagues know the journal?
Have you read any articles in the journal before?
Is it easy to discover the latest papers in the journal?
Can you easily identify and contact the publisher?
Is the publisher name clearly displayed on the journal website?
Can you contact the publisher by telephone, email, and post?
Is the journal clear about the type of peer review it uses?
Are articles indexed in services that you use?
Is it clear what fees will be charged?
Does the journal site explain what these fees are for and when they will be charged?
Do you recognize the editorial board?
Have you heard of the editorial board members?
Do the editorial board members mention the journal on their own websites?
Is the publisher a member of a recognized industry initiative?
Do they belong to the Committee on Publication Ethics (COPE)?
If the journal is open access, is it listed in the Directory of Open Access Journals (DOAJ)?
If the journal is open access, does the publisher belong to the Open Access Scholarly Publishers Association (OASPA)?
Is the publisher a member of another trade association?
*Available at: http://thinkchecksubmit.org/check/.

Because existing initiatives do not provide error-proof methods for determining the status of a particular journal, individuals who aim to gain a high level of assurance about a journal’s status need to investigate further. WAME developed the framework illustrated in Figure 1 for such investigation. This framework begins with assessing whether the journal has any of the characteristics Beall viewed as potentially problematic (Table 1), whether it is listed in the DOAJ, and whether it has the “Think. Check. Submit.” features (Table 3), with further investigation guided by these initial indicators. Assessment remains subjective, but reviewing the journal’s website and practices/policies for evidence of the “warning sign” features (Table 4) will help inform this judgment. The more “red flags” that are present, the more hesitant one should be to consider the journal a desirable publication venue.

Figure 1

Predatory journals algorithm

Table 4

“Warning Sign” features that should increase suspicion that a journal is predatory (although features may be absent even in a predatory journal)

No information as to whether there are author fees in the Instructions for Authors.
Peer review is not mentioned in the Instructions for Authors.
Little or no information is provided regarding the editor or editorial board.
No location is listed for the journal offices, or the location is very different from the location of the editors and editorial board.
The journal website is not easily accessible in an internet search (could be a problem in a legitimate journal in a low or middle income locale).
The journal publishes an unusually small, unusually large, or markedly variable number of articles each year.
You or your colleagues have received formulaic e-mail solicitations for submissions that do not specify an interest in particular projects or areas that you are working on.
Promised routine turnaround times for review and publication are so rapid that they seem “too good to be true” and would be unlikely to encompass the time necessary for true peer review.
You do not receive a response to e-mail or telephone messages sent to the editor or journal office within a few days.
The name of the journal is very similar to the name of a well-known, established journal with a good reputation.
The publication fees are atypical for the scholarly publishing industry (much higher or much lower fees can both signal problems [with recognition that journals in low or middle income countries may have legitimately low fees]).
It is difficult to identify articles published in the journal when searching Google Scholar or other databases (with recognition that new journals or those in low or middle income countries may face lags in indexing).
Information about author affiliations and/or contact information is not present in published articles.
Someone you know who is listed on the editorial board or journal staff, when queried about the journal, is unaware of their supposed affiliation with it.
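
The Figure 1 framework ultimately reduces to counting how many of the Table 4 warning signs a journal exhibits: the more red flags, the more hesitation is warranted. As a purely illustrative sketch (not part of the WAME framework or Figure 1 itself), the Python code below assumes each warning sign has been recorded as a simple yes/no observation and tallies them; the sign names, thresholds, and caution labels are hypothetical choices made for this example only.

# Illustrative sketch only: tallies hypothetical yes/no observations of the
# Table 4 warning signs. Sign names, thresholds, and caution labels are
# assumptions made for this example, not part of the WAME framework.

WARNING_SIGNS = [
    "no_author_fee_information",
    "peer_review_not_mentioned",
    "little_editor_or_board_information",
    "no_or_inconsistent_office_location",
    "website_hard_to_find",
    "unusual_or_variable_article_volume",
    "formulaic_solicitation_emails",
    "implausibly_fast_review_and_publication",
    "no_response_from_editorial_office",
    "name_mimics_established_journal",
    "atypical_publication_fees",
    "articles_hard_to_find_in_databases",
    "missing_author_affiliations_in_articles",
    "board_member_unaware_of_affiliation",
]


def tally_red_flags(observations: dict) -> tuple[int, str]:
    """Count observed warning signs and map the count to a rough caution level."""
    count = sum(1 for sign in WARNING_SIGNS if observations.get(sign, False))
    if count == 0:
        level = "no red flags observed (still verify DOAJ listing and journal policies)"
    elif count <= 2:
        level = "some concern: investigate the journal's website and policies further"
    else:
        level = "multiple red flags: strong hesitation about this publication venue"
    return count, level


if __name__ == "__main__":
    # Example: a journal that hides its fees, promises implausibly fast review,
    # and sends formulaic solicitation e-mails.
    example = {
        "no_author_fee_information": True,
        "implausibly_fast_review_and_publication": True,
        "formulaic_solicitation_emails": True,
    }
    flags, advice = tally_red_flags(example)
    print(f"{flags} warning sign(s) observed: {advice}")

The thresholds above are arbitrary; the statement’s point is only that each additional red flag should increase hesitation, and absence of red flags does not by itself establish legitimacy.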

Why have predatory journals become a significant problem? Digital publication brought many benefits, including lower journal overhead relative to printing and postage, and “author pays” models enabled immediate open access. Nevertheless, legitimate scholarly journals still incur substantial costs for editor and staff time spent on manuscript evaluation, peer review, editing, and quality assurance. Predatory journals reduce or eliminate these services, skimming the author fees as profit.

Why have predatory journals thrived? Their promise of quick publication is attractive to academics. Predatory journals offer a low barrier to publication for young researchers who may not know better and for academicians in search of quick publication. In too many settings, promotions committees and other such bodies focus on the number of publications rather than the quality of those publications and the venues in which they appear. Thus, predatory journals are likely to continue to prosper unless such bodies and funders begin to routinely scrutinize the quality as well as the quantity of their faculty’s publications, not by excluding all online journals from consideration, but by identifying acceptable journals according to quality criteria (20). Ideally, academic institutions should also identify academics who are listed as editors or editorial board members of journals established to be predatory and require that their affiliation with the institution be removed. Those mentoring junior researchers must recognize that predatory journals exist and help those they mentor identify high quality publication venues. Websites developed to help researchers must be responsible about the journals their resources help promote (21). Addressing the scourge of predatory journals will require efforts at every level of the research process.

Future initiatives to identify predatory journals should be as transparent and objective as possible, with mechanisms for journals incorrectly identified as predatory to correct the record and for predatory journals to become legitimate by improving their practices. Authors who have submitted their work to predatory journals should share their experiences to “out” poor journal practices. Authors whose legitimate research was published in predatory journals should have a mechanism for submitting their research to a legitimate peer reviewed journal, preferably after retraction of the “predatory” publication—although, unfortunately, most predatory journals do not publish corrections or retractions. Such initiatives would hasten the demise or conversion of predatory journals.

Acknowledgments

We thank the WAME Board (Rod Rohrich, Lorraine Ferris, Tom Lang, Phaedra Cress, Fatema Jawad, Rajeev Kumar, José Lapeña, Chris Zielinski) and Peush Sahni for their critical review and approval.

Notes

[1] Conflicts of interest: None declared.

References

1 

Clark J, Smith R. Firm action needed on predatory journals [Electronic version]. BMJ. 2015;350:h210. [cited 2017 February 14th] Available at: https://www.researchgate.net/profile/Jocalyn_Clark/publication/271022726_Firm_action_needed_on_predatory_journals/links/56f8f0cc08ae81582bf40ff0.pdf. https://doi.org/10.1136/bmj.h210

2 

Wallace F, Perri T. Economists behaving badly: publications in predatory journals [Electronic version]. Munich Personal RePEc Archive August 15, 2016. Accessed February 7th 2017. Available at: https://mpra.ub.uni-muenchen.de/73075/1/MPRA_paper_73075.pdf.

3 

Seethapathy GS, Santhosh Kumar JU, Hareesha AS. India’s scientific publication in predatory journals: need for regulating quality of Indian science and education. Curr Sci. 2016;111:1759–64. [cited 2017 February 7th] Available at: http://www.currentscience.ac.in/Volumes/111/11/1759.pdf. https://doi.org/10.18520/cs/v111/i11/1759-1764

4 

Grant B. Elsevier published 6 fake journals [Electronic version]. Accessed February 7th 2017. Available at: http://www.the-scientist.com/?articles.view/articleNo/27383/title/Elsevier-published-6-fake-journals/.

5 

Bohannon J. Who’s afraid of peer review? Science. 2013;342:60–5. https://doi.org/10.1126/science.342.6154.60

6 

Davis P. Open access publisher accepts nonsense manuscript for dollars [Electronic version]. Scholarly Kitchen June 10, 2009. Accessed February 7th 2017. Available at: http://scholarlykitchen.sspnet.org/2009/06/10/nonsense-for-dollars.

7 

Eriksson S, Helgesson G. The false academy: predatory publishing in science and bioethics. Med Health Care Philos. 2016 Oct 7 [cited 2017 Feb 21]. [Epub ahead of print] https://doi.org/10.1007/s11019-016-9740-3

8 

Shen C, Björk BC. ‘Predatory’ open access: a longitudinal study of article volumes and market characteristics. BMC Med. 2015;13:230. https://doi.org/10.1186/s12916-015-0469-2

9 

Beall J. Beall’s list of predatory publishers 2016. Scholarly Open Access. Available at https://web.archive.org/web/20170113114519/https://scholarlyoa.com/2016/01/05/bealls-list-of-predatory-publishers-2016/. Accessed February 7th 2017.

10 

Chawla DS. Mystery as controversial list of predatory publishers disappears. Available at: http://www.sciencemag.org/news/2017/01/mystery-controversial-list-predatory-publishers-disappears. Accessed February 7th 2017.

11 

COPE. Code of Conduct. Available at: http://publicationethics.org/resources/code-conduct. Accessed February 7th 2017.

12 

Principles of transparency and best practice in scholarly publishing. Available at: http://www.wame.org/about/principles-of-transparency-and-best-practice. Accessed February 11, 2017.

13 

Crawford W. “Trust me”: the other problem with 87% of Beall’s lists. Walt at Random (blog). Available at: http://walt.lishost.org/2016/01/trust-me-the-other-problem-with-87-of-bealls-lists/. Accessed February 11th 2017.

14 

Brazilian Forum of Public Health Journals Editors and the Brazilian Public Health Association (Abrasco). Motion to repudiate Mr. Jeffrey Beall’s classist attack on SciELO. Available at: http://blog.scielo.org/en/2015/08/02/motion-to-repudiate-mr-jeffrey-bealls-classist-attack-on-scielo/. Accessed February 11th 2017.

15 

Coyle K. Predatory publishers. Peer to peer review. Available at: http://lj.libraryjournal.com/2013/04/opinion/peer-to-peer-review/predatory-publishers-peer-to-peer-review/#_. Accessed February 7th 2017.

16 

Emery J. Heard on the net: it’s a small world after all: traveling beyond the viewpoint of American exceptionalism to the rise of the author. Charleston Advisor. 2012;15:67–8. https://doi.org/10.5260/chara.15.2.67

17 

Crawford W. Ethics and access 1: the sad case of Jeffrey Beall. Cites & Insights 2014;14:1–14. Available at: http://citesandinsights.info/civ14i4.pdf. Accessed February 14th 2017.

18 

Directory of Open Access Journals. Available at: https://doaj.org/faq#whatis. Accessed February 11th 2017.

19 

Think Check Submit. Available at: http://thinkchecksubmit.org. Accessed February 11th 2017.

20 

Aggarwal R, Gogtay N, Kumar R, Sahni P; Indian Association of Medical Journal Editors. The revised guidelines of the Medical Council of India for academic promotions: need for a rethink. Indian J Urol. 2016;32:1–4. https://doi.org/10.4103/0970-1591.173117

21 

Memon AR. ResearchGate is no longer reliable: leniency towards ghost journals may decrease its impact on the scientific community. J Pak Med Assoc. 2016;66:1643–7.