The review system is broken, and the system itself is untenable: not only because of the sheer number of “me too” papers but also for the lack of reviewers willing to spend their time on them. This is also the conclusion of a new essay by Virginia Walbot, “Are we training pit bulls to review our manuscripts?”
Who hasn’t reacted with shock to a devastatingly negative review of a manuscript representing years of work by graduate students and postdoctoral fellows on a difficult, unsolved question? … dismissing the years of labor and stating that the manuscript can only be reconsidered with substantially more data providing definitive proof of each claim. … Your manuscript is declined, with encouragement to resubmit when new data are added.
I confess. I’m partly responsible for training the pit-bull reviewer, and I bet you are too.
I am frequently asked whether my reviewer record should be transferred to Publons. I always say no, as I don’t see any benefit. Publons, according to Wikipedia, is a
commercial website that provides a free service for academics to track, verify, and showcase their peer review and editorial contributions for academic journals. It was launched in 2012 and was bought by Clarivate Analytics in 2017 (which also owns Web of Science, EndNote, and ScholarOne). It claims that over 200,000 researchers have joined the site, adding more than one million reviews across 25,000 journals.
So should I really spend my limited time on a pseudo-scientific profile? Even the top scores there seem ridiculous, as I have done fewer than 500 reviews so far — and that is since 1988, not 2012….
There is an interesting paper at bioRxiv on the never-ending stream of review requests:
… overburdening of reviewers to be caused by (i) an increase in manuscript submissions; (ii) insufficient editorial triage; (iii) a lack of reviewing instructions; (iv) difficulties in recruiting reviewers; (v) inefficiencies in manuscript handling and (vi) a lack of institutionalisation of peer review.
What makes it even worse: with the limited capacity for pre-publication review, the capacity for post-publication review is dropping too…
It’s not easy to monitor science output. This may be particularly true when it comes to journal hijacking. In brief:
The Spanish journal Afinidad has been hijacked. Someone has set up a fake website for the journal and is soliciting submissions and payments from the authors in accordance with the gold open-access model.
Given the recent quality of some scholarly journals, I feel they may have been hijacked too: typing errors, omitted references, major misunderstandings, logical errors, you name it.
Having been spammed by a company called Hindawi for many years, I tried to find out a bit more about one of their journals, the “Journal of Allergy”. The website http://www.hindawi.com/journals/ja says:
Journal of Allergy is a peer-reviewed, open access journal that publishes original research articles, review articles, and clinical studies in all areas of allergy. Journal of Allergy currently has an acceptance rate of 43%. The average time between submission and final decision is 59 days and the average time between acceptance and final publication is 34 days.
According to their own description, they are located in Cairo and have some 200 to 1,000 employees. Hindawi seems to be the name of one of the founders. In some other web sources they claim 410 Park Avenue, 15th Floor, New York, USA, as their address. Google Street View shows an 11+2-floor building at that address, with a Chase Manhattan Bank branch on the ground floor.
Only 40 or so of the 500+ Hindawi journals have an impact factor associated with them.
Declan Butler at Nature has already written about these kinds of journals:
Open-access publishers often collect fees from authors to pay for peer review, editing and website maintenance. Beall asserts that the goal of predatory open-access publishers is to exploit this model by charging the fee without providing all the expected publishing services. These publishers, Beall says, typically display “an intention to deceive authors and readers, and a lack of transparency in their operations and processes”.
At the moment, the Journal of Allergy is not blacklisted by Beall (though Hindawi had been in the past). “Journal of Allergy” should not be confused with “The Journal of Allergy”[Jour], which has 1514 PubMed entries, while “Journal of Allergy”[Jour] has only 157 entries so far. Is this an “intention to deceive authors and readers”?
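These entry counts are easy to reproduce: NCBI’s public E-utilities ESearch endpoint returns a record count for any `[Jour]` query. A minimal sketch (the parsing helper and the commented usage are illustrative; actual counts change over time):

```python
import re
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def parse_count(xml: str) -> int:
    """Extract the <Count> field from an ESearch XML response."""
    m = re.search(r"<Count>(\d+)</Count>", xml)
    return int(m.group(1)) if m else 0

def pubmed_count(journal: str) -> int:
    """Number of PubMed records for a journal title (makes a network call)."""
    term = urllib.parse.quote(f'"{journal}"[Jour]')
    with urllib.request.urlopen(f"{EUTILS}?db=pubmed&term={term}") as resp:
        return parse_count(resp.read().decode())

# Example (network access required):
# pubmed_count("Journal of Allergy") vs. pubmed_count("The Journal of Allergy")
```

A query for the exact title with and without the leading “The” is enough to see how close the two journal names sit in PubMed.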
The most recent issue appears as “Epub 2014 Apr 6”, the first one as “Epub 2009 Jul 2”, so the journal is basically publishing 2-3 papers per month.
The PubMed analyzer is not very informative here. The whole “Journal of Allergy” has accumulated only 135 citations in the past 5 years (not an impressive figure: I have authored more than a dozen single papers that have each received more citations than the whole journal).
The extremely low citation rate and the missing impact factor do not prove that all papers are of poor quality, but they raise serious doubts.
The next question therefore is: does the journal run a state-of-the-art review process? The website lists the following 24 scientists on the editorial board:
William E. Berger, University of California, Irvine, USA
Kurt Blaser, Universität Zürich, Switzerland
Eugene R. Bleecker, Wake Forest University, USA
Jan de Monchy, University of Groningen, The Netherlands
Frank JP Hoebers, MAASTRO Clinic, The Netherlands
Stephen T. Holgate, University of Southampton, United Kingdom
S. L. Johnston, Imperial College London, United Kingdom
Young J. Juhn, Mayo Clinic, USA
Alan P. Knutsen, Saint Louis University, USA
Marek L. Kowalski, Medical University of Lodz, Poland
Ting Fan Leung, The Chinese University of Hong Kong, Hong Kong
Clare M Lloyd, Imperial College London, United Kingdom
Redwan Moqbel, University of Manitoba, Canada
Desiderio Passali, University of Siena, Italy
Stephen P. Peters, Wake Forest University, USA
David G. Proud, University of Calgary, Canada
Fabienne Rancé, CHU Rangueil, France
Anuradha Ray, University of Pittsburgh, USA
Harald Renz, Philipps University of Marburg, Germany
Nima Rezaei, Tehran University of Medical Sciences, Iran
Robert P. Schleimer, Northwestern University, USA
Massimo Triggiani, Università degli Studi di Napoli Federico II, Italy
Hugo Van Bever, National University of Singapore, Singapore
Garry M. Walsh, University of Aberdeen, United Kingdom
Unfortunately, this list is not identical to the editor names listed directly on the PDFs (e.g. the academic editor RM is not listed on the web front page). The list above does include some well-respected scientists, but also others whose first Google hit is their Hindawi affiliation. As I know 7 of the 24 persons, I decided to email them a short 6-item questionnaire via SurveyMonkey.
1. When did you start your role as an editor?
2009, 2010, 2011, 2012, 2013, 2014
2. What is your role there?
Leading editor: supervising associate editors
Editor: assigning papers to reviewers, holding the final decision
Reviewer: reading and scoring papers
Other (please specify)
3. How many papers have you been dealing with?
0, 1, 2, 3, 4, 5 or more
4. How many papers did you accept?
nearly none, about half, most, all
5. Are you being paid for that work?
no, yes, don’t want to tell
6. Is this a serious journal?
no, yes, don’t know
2 of my 24 emails bounced; some members of the editorial board are already retired.
19 did not respond. I suspect they show the same behaviour when addressed by Hindawi.
1 editor sent me a personal email saying that he will resign from the board. It will be interesting to see when the list of editors changes; I have already started a change-detection watch on the page.
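Such a watch needs nothing more than a stored fingerprint of the page that is compared against a fresh download. A minimal sketch (the helper names and the sample editor strings are illustrative; a real watch would fetch the page on a schedule):

```python
import hashlib

def fingerprint(text: str) -> str:
    """Return a stable SHA-256 fingerprint of the page text."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def has_changed(old_fp: str, new_text: str) -> bool:
    """Compare a stored fingerprint against freshly fetched page text."""
    return fingerprint(new_text) != old_fp

# Snapshot the editor list once, then re-check on later visits.
snapshot = fingerprint("Editor A\nEditor B")
print(has_changed(snapshot, "Editor A\nEditor B"))  # False: list unchanged
print(has_changed(snapshot, "Editor A"))            # True: someone resigned
```

Hashing the page rather than storing it keeps the watch cheap; a diff of the two versions is only needed once a change is flagged.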
3 editors answered the mini survey: Editor #1 started in 2010, has dealt with more than 5 papers, accepted most of them, is not paid, and believes it is a serious journal. Editor #2 started in 2009, with all other responses identical. Editor #3 also started in 2009 but accepted only about half of the papers.
It is not unexpected that these 3 motivated editors believe in a regular review process. I fear, however, that most editors either no longer work for the journal or are not motivated to spend even 3 minutes on the quality control of their own work.
Without a transparent review process like that at the BMC journals, we cannot judge from the outside whether there is any review at all. The names of the individual reviewers are unknown, and even contacting the authors would not help, as they have no interest in revealing that they got a paper published without any review.
As a library, one could order a printed copy (e.g. 20 articles per year for $395), although I could not locate a single library in the world with a subscription to this journal.
As an author, I would be charged $800 per PDF. There seems to be no major text editing included in the publication process; what you get for your $800 is a quickly reformatted text, a PubMed entry, and a PDF sitting on a cloud server for an unknown storage time. My estimate of the value of that service is $10.
Declan Butler developed a checklist for identifying serious publishers and journals, which we can now use to judge this one:
Check that the publisher provides full, verifiable contact information, including address, on the journal site. Be cautious of those that provide only web contact forms.
Check that a journal’s editorial board lists recognized experts with full affiliations. Contact some of them and ask about their experience with the journal or publisher.
Check that the journal prominently displays its policy for author fees.
Be wary of e-mail invitations to submit to journals or to become editorial board members.
Read some of the journal’s published articles and assess their quality. Contact past authors to ask about their experience.
FAILED (POOR QUALITY)
Check that a journal’s peer-review process is clearly described and try to confirm that a claimed impact factor is correct.
FAILED (NO IMPACT)
Find out whether the journal is a member of an industry association that vets its members, such as the Directory of Open Access Journals (www.doaj.org) or the Open Access Scholarly Publishers Association (www.oaspa.org).
Another set of guidelines for identifying fake journals is available at Wikipedia. Complaints associated with predatory open-access publishing include:
Accepting articles quickly with little or no peer review or quality control, including hoax and nonsensical papers.
CANNOT BE DECIDED YET
Notifying academics of article fees only after papers are accepted.
Aggressively campaigning for academics to submit articles or serve on editorial boards.
Listing academics as members of editorial boards without their permission, and not allowing academics to resign from editorial boards.
Appointing fake academics to editorial boards.
Mimicking the name or web site style of more established journals.
Verdict: The journal does not pass the Butler criteria of a scientific journal.
Comment: I do not see any major problem if an open-access journal publishes every manuscript it receives, leaving the final verdict of good or bad science to a post-publication review process. I do see a major problem, however, if a pre-publication review process is claimed (and paid for) for PubMed-listed papers while never being documented in a transparent way.
Addendum: change log of the editor page
Something like “winners don’t punish”? A smart letter in this week’s Nature with the 3 options of Cooperation (C), Defection (D) and Punishment (P)?
"nice people" player 1: C C C C player 2: C C C C top payoff! "punish and perish" player 1: C P P P P player 2: C D D D D extremely bad! "turning the other cheek" player 1: C C C C C player 2: D D C C C payoff still positive!
we should have known this earlier…
Here is a little tale that could happen any day. An author sends a major paper to a major journal. The major journal has a major editor who asks other major reviewers before writing a major email. An editor is one who separates the wheat from the chaff — and then prints the chaff.