Tag Archives: bad science

Disagree without being disagreeable?

The NIH director on communication with the NIH

We are committed to ensuring a safe and respectful workplace wherever NIH-supported research occurs. Be it at a recipient institution, at a conference where scientific ideas are exchanged, or in our own intramural labs, everybody deserves to work in an environment that is free of harassment, bullying, intimidation, threats, or other disruptive and inappropriate behaviors. Likewise, this goes for NIH program officers, scientific review officers (SROs), grants management specialists, and other extramural staff.

Sugar daddy science

The term is not new; it was probably first brought to general attention in 2019 by The Atlantic, for research that is funded directly – without a call for proposals, competition or review. Back then it was the MIT Media Lab that had accepted “donations” from Jeffrey Epstein. That need not be questionable per se, but it still leaves an uneasy feeling: why is somebody spending their money on research?

Now, in Corona times, it seems to be coming back into fashion to simply transfer money out of the state treasury: €1,000,000 from Markus Söder to the LMU plus further funding to the Uni Regensburg, €65,315 from Armin Laschet to the Uni Bonn, an undisclosed amount from Winfried Kretschmann to four universities in Baden-Württemberg, €2,000,000 from Stephan Weil to the Uni Hannover “in order to collect, over the coming two years, biosamples and data from 1,000 patients affected to varying degrees by the coronavirus SARS-CoV-2”.

The whole thing is of course a joke when the Freiburg university children’s hospital is already complaining that it can no longer find enough patients for its study. And with currently 54 new cases in Niedersachsen, where are the 1,000 cases for a biobank supposed to come from? Especially since an international consortium on this very question formed months ago, and it is really not the primary question whether some gene variant increases the risk 1.1-fold. With the next virus everything will be different again anyway.

And weren’t these studies supposed to help steer the containment measures? Apart from the Gangelt study I do not know of a single official number, although the first case in München was described exactly 4 months ago. According to the latest press reports, the results of the sugar daddy studies are used behind closed doors to justify local politics, without anybody being able to check them. Maybe that is better this way, as we would probably be disappointed by the quality; here I can only agree with Antes in his situation assessment of 25 May:

Yes, there is a clearer picture of the lack of clarity. Unfortunately there are extreme regional differences. That was the main criticism of the Heinsberg study with its 14 or 15 percent of undetected cases. If I go into a hotspot, I naturally capture what has happened locally. But it has no significance whatsoever for the whole country … The various measures are so intermingled with each other that they cannot be disentangled. Unfortunately we are going back into reopening with every mistake in the world. What we are seeing right now is a farewell to science. … Worldwide there are about 1,100 visible studies on Corona. In Germany, too, everybody is champing at the bit to run studies. Unfortunately, there is no leadership structure that coordinates these efforts. My biggest reproach in this respect goes to the Bundesforschungsministerium, from which one hears almost nothing. We would need a master plan of questions for Germany. These tasks would then have to be distributed and the results summarized so that they can serve decision-making. For that, however, science would have to let itself be coordinated.

Journal of Allergy

Having been spammed by a company called Hindawi for many years, I tried to find out a bit more about one of their journals, the “Journal of Allergy”. The website http://www.hindawi.com/journals/ja says:

Journal of Allergy is a peer-reviewed, open access journal that publishes original research articles, review articles, and clinical studies in all areas of allergy. Journal of Allergy currently has an acceptance rate of 43%. The average time between submission and final decision is 59 days and the average time between acceptance and final publication is 34 days.

According to their own description, they are located in Cairo and have some 200 to 1,000 employees. Hindawi seems to be the name of one of the founders. Some other web sources give 410 Park Avenue, 15th Floor, New York, USA, as their address. Google Streetview shows an 11+2 floor building at that address, with a Chase Manhattan Bank branch on the ground floor.
Only 40 or so of the 500+ Hindawi journals have an impact factor associated with them.
Declan Butler at Nature has already written about this kind of journal:

Open-access publishers often collect fees from authors to pay for peer review, editing and website maintenance. Beall asserts that the goal of predatory open-access publishers is to exploit this model by charging the fee without providing all the expected publishing services. These publishers, Beall says, typically display “an intention to deceive authors and readers, and a lack of transparency in their operations and processes”.

At the moment, the Journal of Allergy is not blacklisted by Beall (while Hindawi has been in the past). The “Journal of Allergy” should also not be confused with “The Journal of Allergy”: a PubMed search for “The Journal of Allergy”[Jour] returns 1514 entries, while “Journal of Allergy”[Jour] has only 157 entries so far. Is this an “intention to deceive authors and readers”?
The most recent issue appears as of “Epub 2014 Apr 6”, the first one as “Epub 2009 Jul 2”, so the journal is basically publishing 2-3 papers per month.
The PubMed analyzer is not very informative here. The whole “Journal of Allergy” has accumulated only 135 citations in the past 5 years (not an impressive figure; I have authored more than a dozen individual papers that have each received more citations than the whole journal).
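For anybody who wants to reproduce these counts, here is a minimal sketch using Biopython’s Entrez interface; the e-mail address is a placeholder required by NCBI, and only the two journal query strings above are taken from the actual search.

# Minimal sketch: count PubMed entries for the two similarly named journals.
# Assumes Biopython is installed (pip install biopython); NCBI expects a
# contact e-mail address with every Entrez request.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # placeholder address

def pubmed_count(journal_query):
    """Return the number of PubMed records matching a [Jour] query."""
    handle = Entrez.esearch(db="pubmed", term=journal_query)
    record = Entrez.read(handle)
    handle.close()
    return int(record["Count"])

for query in ('"The Journal of Allergy"[Jour]', '"Journal of Allergy"[Jour]'):
    print(query, pubmed_count(query))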
The extremely low citation rate and the missing impact factor should not be taken as proof that all papers are of poor quality, but they do raise serious doubts.
The next question therefore is: does the journal run a state-of-the-art review process? The website lists the following 24 scientists on the editorial board:

William E. Berger, University of California, Irvine, USA
Kurt Blaser, Universität Zürich, Switzerland
Eugene R. Bleecker, Wake Forest University, USA
Jan de Monchy, University of Groningen, The Netherlands
Frank JP Hoebers, MAASTRO Clinic, The Netherlands
Stephen T. Holgate, University of Southampton, United Kingdom
S. L. Johnston, Imperial College London, United Kingdom
Young J. Juhn, Mayo Clinic, USA
Alan P. Knutsen, Saint Louis University, USA
Marek L. Kowalski, Medical University of Lodz, Poland
Ting Fan Leung, The Chinese University of Hong Kong, Hong Kong
Clare M Lloyd, Imperial College London, United Kingdom
Redwan Moqbel, University of Manitoba, Canada
Desiderio Passali, University of Siena, Italy
Stephen P. Peters, Wake Forest University, USA
David G. Proud, University of Calgary, Canada
Fabienne Rancé, CHU Rangueil, France
Anuradha Ray, University of Pittsburgh, USA
Harald Renz, Philipps University of Marburg, Germany
Nima Rezaei, Tehran University of Medical Sciences, Iran
Robert P. Schleimer, Northwestern University, USA
Massimo Triggiani, Università degli Studi di Napoli Federico II, Italy
Hugo Van Bever, National University of Singapore, Singapore
Garry M. Walsh, University of Aberdeen, United Kingdom

Unfortunately, this list is not identical to the editor names printed directly on the PDFs (e.g. the academic editor RM is not listed on the web front page). The editor list above does include some well-respected scientists, but there are also others whose Hindawi affiliation is the first Google hit for their name. Since I know 7 of the 24 persons, I decided to email the board members a short 6-item questionnaire via Surveymonkey.

1. When did you start your role as an editor?
2009, 2010, 2011, 2012, 2013, 2014
2. What is your role there?
Leading editor – supervising associate editors
Editor – assigning papers to reviewers, holding the final decision
Reviewer – reading and scoring papers
Other (please specify)
3. How many papers have you been dealing with?
0, 1, 2, 3, 4, 5 or more
4. How many papers did you accept?
nearly none, about half, most, all
5. Are you being paid for that work?
no, yes, don’t want to tell
6. Is this a serious journal?
no, yes, don’t know

2 of my 24 emails bounced; some of the members of the editorial board have already retired.
19 did not respond. I suspect they show the same behaviour when addressed by Hindawi.
1 editor sent me a personal email saying that he will resign from the board. It will be interesting to see when the list of editors changes; I have already started a change detection on that page (a minimal sketch is given below).
3 editors answered the mini survey: Editor #1 started in 2010, has been dealing with more than 5 papers, accepted most of them, is not paid and believes it is a serious journal. Editor #2 started in 2009, with all other responses being identical. Editor #3 also started in 2009 but accepted only about half of the papers.
It is not unexpected that these 3 motivated editors believe in a regular review process. I fear, however, that most editors either do not work for the journal (anymore) or are not motivated to spend even 3 minutes on quality control of their own work.
Without a transparent review process like that of the BMC journals, we cannot judge from the outside whether there is any review at all. The names of the individual reviewers are unknown, and even contacting the authors would not help, as they have no interest in revealing that their paper was published without any review.
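The change detection mentioned above does not need more than a periodic comparison of the editorial board page with its last stored state. Here is a minimal sketch, assuming the requests library; the URL and the state file name are placeholders.

# Minimal sketch: detect changes on the editorial board page by comparing
# a hash of the current page content with the last stored hash.
import hashlib
import pathlib
import requests

URL = "http://www.hindawi.com/journals/ja/"    # placeholder: editorial board page
STATE = pathlib.Path("ja_editors.sha256")      # last known page hash

def page_changed():
    html = requests.get(URL, timeout=30).text
    digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
    old = STATE.read_text().strip() if STATE.exists() else None
    STATE.write_text(digest)
    return old is not None and old != digest

if __name__ == "__main__":
    print("editor list changed" if page_changed() else "no change detected")

In practice one would extract only the editor names before hashing, otherwise any cosmetic change to the page would trigger a false alarm.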

As a library, one could order a printed copy (e.g. 20 articles per year for $395), although I could not locate a single library in the world that subscribes to this journal.
As an author, I would be charged $800 per PDF. There seems to be no major text editing included in the publication process; what you get for your $800 is a quickly reformatted text, a PubMed entry and a PDF sitting on a cloud server for an unknown storage time. My estimate of the value of that service is $10.

Declan Butler developed a checklist for identifying serious publishers and journals, so we can now use that checklist to judge this journal.

Check that the publisher provides full, verifiable contact information, including address, on the journal site. Be cautious of those that provide only web contact forms.

FAILED (PARTIALLY)

Check that a journal’s editorial board lists recognized experts with full affiliations. Contact some of them and ask about their experience with the journal or publisher.

FAILED (PARTIALLY)

Check that the journal prominently displays its policy for author fees.

PASSED

Be wary of e-mail invitations to submit to journals or to become editorial board members.

FAILED (SPAMMER)

Read some of the journal’s published articles and assess their quality. Contact past authors to ask about their experience.

FAILED (POOR QUALITY)

Check that a journal’s peer-review process is clearly described and try to confirm that a claimed impact factor is correct.

FAILED (NO IMPACT)

Find out whether the journal is a member of an industry association that vets its members, such as the Directory of Open Access Journals (www.doaj.org) or the Open Access Scholarly Publishers Association (www.oaspa.org).

PASSED

Another set of guidelines for spotting fake journals is available at Wikipedia. Complaints associated with predatory open-access publishing include:

Accepting articles quickly with little or no peer review or quality control, including hoax and nonsensical papers.

CANNOT BE DECIDED YET

Notifying academics of article fees only after papers are accepted.

FALSE

Aggressively campaigning for academics to submit articles or serve on editorial boards.

TRUE

Listing academics as members of editorial boards without their permission, and not allowing academics to resign from editorial boards.

UNCLEAR

Appointing fake academics to editorial boards.

FALSE

Mimicking the name or web site style of more established journals.

TRUE

Verdict: the journal does not pass the Butler criteria for a scientific journal.

Comment: I do not see any major problem if an open access journal publishes all manuscripts it receives, leaving the final decision on good or bad science to post-publication review. I do see a major problem, however, if a pre-publication review process is assumed (and paid for) for PubMed-listed papers while never being documented in a transparent way.

Addendum: change log of the editor page

Are science blogs dangerous?

Amnesty International reports that an Egyptian blogger is now facing up to 10 years in prison for criticizing Egypt’s religious authorities. A German blogger writing about bus construction in China was even summoned to a court in Beijing. And everybody knows of Ellen Simonetti, who became famous for being fired by Delta.
A major difference between blogs and accredited journalism is the limited capacity to respond to prosecution: I do not have money for a lawyer, while newspapers and journals can hire dozens.
Sure, science blogs are much less intrusive, but there is always a risk that the empire strikes back: 99% of grant and paper reviews are anonymous.
BUT there is good news: the blog community is large and always alert. As a science blogger writing on bad science you can now even win nice prizes – congratulations to Ben. Don’t forget that all students arriving in your lab have read your weblog first.
Refraining from all activities also involves some risk, yea, yea.