My latest column in Vox Magazine, the Radboud university newspaper, is about the limits of universities shifting towards industry in light of their public mission (in Dutch).
Our book about environmental expertise and the complications of advising policy and society is out now. It is specifically designed as a tool to teach environmental scientists with a natural science background about social science insights, with extensive case examples and practitioner experience. Order here.
What if you played Chinese whispers with objects instead of words? In my column in Vox, I explain in simple (Dutch) terms what a constructivist finds so fascinating about circulating objects.
Research journals often fail to explain their precise editorial policies, including their terms for peer review. Do they use open or blind review? Will articles be run through statistics scanners? Are reviewers selected from a pool or from the open scientific community? Check a journal's webpages and try to find out precisely what will happen to your paper after submission; you will discover how hard that is.
Hence twenty-two editors, researchers, and publishing professionals have just launched the Declaration on Transparent Editorial Policies for Academic Journals. Will you sign it too?
Which peer review procedures are associated with more retractions? Causal connections are hard to make, as retractions are both a sign of trouble and a sign that journals are taking action to address trouble. Nevertheless, it is remarkable that some peer review procedures, such as double-blind review, seem to involve fewer retractions, even after correcting for research field.
Horbach, S. P. J. M., & Halffman, W. (2018). The ability of different peer review forms to flag problematic publications. Scientometrics. doi:10.1007/s11192-018-2969-2 (online 29 November 2018).
There is a short interview about it on Retraction Watch.
The international responses to the Academic Manifesto, with experiences from 14 countries on how to resist the productivist university, have been translated into Spanish. It’s really exciting to see how stories about resistance are shared all over the world, and not just the globalised management-speak of The Wolf.
Halffman, W., & Radder, H. (eds.) (2017). International Responses to the Academic Manifesto: Reports from 14 Countries. Social Epistemology Review and Reply Collective, pp. 1-77 (online 13 July 2017). http://wp.me/p1Bfg0-3FV
Translated into Spanish by Eva Aladro Vico as Respuestas internacionales al manifiesto académico: informes desde 14 países, to appear in: CIC Cuadernos de Información y Comunicación, 2018, vol. 23, 25-103. http://revistas.ucm.es/index.php/CIYC/article/view/60686
Horbach, S. P. J. M., & Halffman, W. (2018). The changing forms and expectations of peer review. Research Integrity and Peer Review, 3(1), 8. doi:10.1186/s41073-018-0051-5 (open access)
In addition to a systematisation of the current variety in peer review, the paper also explains the considerations that have gone into innovations such as post-publication peer review, open peer review, or statistics scanners. Two further papers are in preparation, one on the distribution of peer review practices and one on their ability to prevent retractions.
I started writing a column for the RU university newspaper Vox (in Dutch). Here’s the first one.
The misidentification of cell lines has been a problem in biomedical research for decades. First noted for HeLa cells, cell lines get mixed up or contaminated with other cells. As a result, researchers publish results based on cells other than the ones they assume they are using. Sometimes this does not affect research results; sometimes it fundamentally undermines the findings. Important efforts have been made to prevent these problems, such as journals requiring genetic verification of cell cultures prior to publication.
But what about the research of the past? We used the ICLAC database of cell lines known to be misidentified to estimate the number of articles in Web of Science using misidentified cells. We found 33,000 publications, currently about 1,200 per year, with no signs of improvement. The articles in this ‘primary contamination’ are in turn cited by 500,000 papers, constituting a ‘secondary contamination’ of the scientific literature.
We suggest that publications basing results on misidentified cells should carry a warning label, allowing the expert reader to assess the consequences for validity.
Horbach, S., & Halffman, W. (2017). The Ghosts of HeLa: How cell line misidentification contaminates the scientific literature. PLOS ONE. doi:10.1371/journal.pone.0186281 (12 October 2017, open access).
Here is the media attention.