Crawford, a critic of Beall’s blog Scholarly Open Access, also points out
that Beall favors toll-access publishers, specifically
Elsevier, praising its “consistent high quality.” However, a simple
Google search for “fake Elsevier journals” shows how tenuous Beall’s
position is. Furthermore, Beall conflates OA journals with “author pays”
journals, revealing his skepticism about, if not hostility toward, OA.
These issues aside, Beall’s laser-like focus on predatory publishers may prevent him
from having a broader perspective on scholarly communication. Case in
point: Beall has blithely declared the “serials crisis” to be over, but
those of us who manage resources beg to differ.
Another concerning aspect of Beall’s work is his evaluation of OA publishers
from less economically developed countries. Crawford, Karen Coyle, and
Jill Emery have all noted Beall’s bias against these publishers. Imperfect
English or a predominantly non-Western editorial board does not make a
journal predatory. An interesting example is Hindawi, an Egyptian
publisher that was once considered predatory but has since improved its
practices and standards. If we accept that there is a continuum from devious
and duplicitous to simply low-quality and amateurish, then it is likely,
as Crawford believes, that some of the publishers on Beall’s list are
not actually predatory.
Although Beall’s contributions are arguably compromised by his attitudes
about OA, the criteria he uses for his list are an excellent starting point for
thinking about the hallmarks of predatory publishers and journals. He
encourages thorough analysis, including scrutiny of editorial boards and
business practices. Some of his red flags provide a lot of “bang for
your buck” in that they are both easy to spot and likely to indicate a
predatory operation. These include editors or editorial board members with
no or fake academic affiliations, lack of clarity about fees, publisher
names and journal titles with geographic terms that have no connection to
the publisher’s physical location or journal’s geographic scope, bogus
impact factor claims and invented metrics, and false claims about where
the journal is indexed.
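As an illustration, the easy-to-spot red flags above lend themselves to a simple screening checklist. The sketch below is a hypothetical encoding, not an established rubric; the flag names are my own shorthand for the warning signs just listed.

```python
# Hypothetical screening aid based on the red flags discussed above.
# The flag names and the "any hit warrants scrutiny" rule are
# illustrative assumptions, not an official or validated instrument.
RED_FLAGS = {
    "fake_or_missing_affiliations":
        "editors or board members with no or fake academic affiliations",
    "unclear_fees":
        "lack of clarity about author fees",
    "misleading_geographic_name":
        "geographic terms unconnected to the publisher's location or scope",
    "bogus_metrics":
        "bogus impact factor claims or invented metrics",
    "false_indexing_claims":
        "false claims about where the journal is indexed",
}

def screen_journal(observed):
    """Return the recognized red flags among an evaluator's observations.

    `observed` is a set of flag-name strings noted for a journal;
    any single match justifies a closer look at the operation.
    """
    return sorted(flag for flag in observed if flag in RED_FLAGS)
```

Any non-empty result would justify the kind of deeper scrutiny of editorial boards and business practices that Beall encourages.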
Beall also lists common practices indicative of low-quality but not necessarily
predatory journals. He is rightfully wary of journals that solicit
manuscripts by spamming researchers (established publishers generally do
not approach scholars uninvited) and of publishers or editors with email
addresses from free services such as Gmail and Yahoo. He also wisely warns researchers away
from journals with bizarrely broad or disjointed scopes and journals that
boast extremely rapid publication, which usually suggests no or only
cursory peer review.
Given the fuzzy line between low-quality and predatory publishers, whitelisting,
or listing publishers and journals that have been vetted and verified as
satisfying certain standards, may be a better solution than blacklisting.
The central player in the whitelisting movement is the Directory of Open
Access Journals (DOAJ).
In response to the Bohannon sting, DOAJ removed 114 journals and revamped its
criteria for inclusion. Journals
accepted into DOAJ after March 2014 under the stricter rules are marked
with a green tick symbol, and DOAJ has announced that it will require the
remaining 99% of its listed journals to reapply for acceptance.
At the basic level, a journal must be chiefly scholarly; make the content
immediately available (i.e., no embargoes); provide quality control
through an editor, editorial board, and peer review; have a registered
International Standard Serial Number (ISSN); and exercise transparency
about APCs. Journals that meet additional requirements, such as providing
external archiving and creating persistent links, are recognized with the
DOAJ Seal. DOAJ receives an assist from the ISSN Centre, which in 2014
added language reserving the right to deny ISSNs to publishers that
provide misleading information.
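One of the DOAJ requirements above, a registered ISSN, is also partly checkable by machine: ISSNs carry a modulus-11 check digit under ISO 3297. A minimal validator might look like the following sketch (the function names are my own); note that a correct check digit only shows the number is well formed, not that it is actually registered.

```python
def issn_check_digit(first7: str) -> str:
    """Compute the ISO 3297 check digit for the first seven ISSN digits.

    Digits are weighted 8 down to 2; the check digit brings the total
    to a multiple of 11, with 'X' standing in for a value of 10.
    """
    total = sum(int(d) * w for d, w in zip(first7, range(8, 1, -1)))
    remainder = (11 - total % 11) % 11
    return "X" if remainder == 10 else str(remainder)

def is_valid_issn(issn: str) -> bool:
    """Check that an ISSN (with or without its hyphen) is well formed."""
    s = issn.replace("-", "").upper()
    if len(s) != 8 or not s[:7].isdigit() or s[7] not in "0123456789X":
        return False
    return issn_check_digit(s[:7]) == s[7]
```

For example, `is_valid_issn("0378-5955")` returns True. Confirming registration, as opposed to mere well-formedness, still requires checking with the ISSN Centre or a directory such as DOAJ.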
Another organization that whitelists publishers by accepting them as members is
the Open Access Scholarly Publishers Association (OASPA). Members must
apply and pledge to adhere to a code of conduct that disallows any form of
predatory behavior. OASPA
has made errors in vetting applicants, though: it admitted some publishers
that it later had to reject (e.g., Dove Medical Press).
Of course, no blacklist or whitelist can substitute for head-on investigation
of a journal. Open Access Journal Quality Indicators, a rubric by Sarah
Beaubien and Max Eckard featuring both positive and negative journal
characteristics, can help researchers perform such evaluation.
Furthermore, any tool or practice that gives researchers more information
is a boon. For example, altmetrics provide a broad picture of an
article’s impact (not necessarily correlated to its quality), and open
peer review—i.e., any form of peer review where the reviewer’s
identity is not hidden—increases transparency and allows journals to
demonstrate their standards.