Work in Progress: First steps towards open science
Date: 30 September 2020
Author: Merle-Marie Pittelkow
This case study is one of the three winning submissions for the Open Research Award, which will be celebrated on 22 October 2020.
My journey towards open science began as a Master's student. I participated in the course Transparency in Science and learned about questionable research practices and about non-reproducible and non-replicable research findings. Majoring in clinical psychology, I was especially concerned about the impact on the validity of treatment recommendations. Were recommendations based on studies that might fail to replicate? Would it be better to replicate studies first, so that a successful replication could increase confidence in a finding before it is translated into clinical practice? However, replicating every clinical study is not feasible with limited resources, and this made me realize the need for a framework to transparently and systematically assess which studies should be prioritized for replication. Together with Rink Hoekstra, Julie Karsten, and Don van Ravenzwaaij, I developed a step-wise approach combining quantitative (i.e., a Bayesian re-analysis) and qualitative (i.e., relevance, theory, methodology, interpretation) criteria. As proof of principle, we re-analyzed a sample of studies published in the Journal of Consulting and Clinical Psychology between 2012 and 2016.
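To give a flavour of what the quantitative step can look like, here is a minimal sketch of a Bayesian re-analysis of a single reported t-test in R. It assumes the BayesFactor package, and the t-statistic and group sizes are invented for illustration; our actual analysis code is in the OSF project.

```r
# Sketch of a default Bayesian re-analysis of a reported t-test.
# The t-statistic and group sizes below are hypothetical examples,
# not values from the studies in our sample.
library(BayesFactor)

t_value <- 2.20  # reported independent-samples t-statistic
n1 <- 45         # size of the treatment group
n2 <- 47         # size of the control group

# Bayes factor for H1 (an effect exists) over H0 (no effect),
# using the package's default Cauchy prior on effect size
bf10 <- ttest.tstat(t = t_value, n1 = n1, n2 = n2, simple = TRUE)

# A Bayes factor close to 1 signals ambiguous evidence, which might
# flag a study as a candidate for replication
print(bf10)
```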
This project introduced me to open science practices. While I advocated for transparency in the process of selecting replication targets and asked others to justify their decisions, I had to acknowledge that I had not been transparent about my own research process in the past. Hence, I decided to change my research practices. While I had missed the opportunity to preregister my study, I was still able to create a project on the Open Science Framework (OSF), upload my data and code, share my manuscript as a preprint, and eventually submit the manuscript to a journal that allows for open access.
Achieving these things made me realize that the transparency I was asking of others was not an easy feat to implement. Creating a project on the OSF was a matter of minutes, but collecting all the data files belonging to the project took much longer. I must confess that I could not find the original R code for the Bayesian re-analysis; luckily, my co-author was able to send me the file. Moreover, some parts of the analysis had been performed in Excel, rendering them non-reproducible. I therefore rewrote these parts in R so that I could upload code that would reliably reproduce our results.
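As a hypothetical illustration of that last step, a calculation that once lived in a spreadsheet, say an effect size, can be moved into a short R script that anyone can rerun. The file and column names below are made up for the example and are not from our project.

```r
# Hypothetical example of turning a manual spreadsheet calculation
# into a scripted, rerunnable step (file and column names are invented).
data <- read.csv("study_summaries.csv")

# Cohen's d for each study from its reported means, SDs, and group
# sizes, the kind of computation one might otherwise do by hand in Excel
pooled_sd <- with(data, sqrt(((n1 - 1) * sd1^2 + (n2 - 1) * sd2^2) /
                               (n1 + n2 - 2)))
data$cohens_d <- with(data, (m1 - m2) / pooled_sd)

# Saving the derived values keeps the whole pipeline reproducible
write.csv(data, "study_summaries_with_d.csv", row.names = FALSE)
```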
The nature of our qualitative evaluations (i.e., judgements regarding the quality of others' work) posed another challenge. We were concerned about tone, as we were openly criticizing the work of experts while not necessarily being experts in these topics ourselves. We therefore made sure that the critique was constructive and not overstated before sharing it online. Making our research open made me feel vulnerable: I vividly remember my heart pounding before hitting the button to publicly share our work, with its potential flaws, with the world. However, I believe that we should hold ourselves accountable for what we do, and the best way to do so is by being as transparent and open as we possibly can.
Moreover, open science has clear advantages for me. First and foremost, I feel proud to practice what I preach. Second, since I uploaded a preprint, my work has gained more exposure and has been mentioned in a few talks even though it is not published yet. It has given me the opportunity to network and connect with other researchers interested in this topic; for example, I was invited to present the project at a lab meeting. Finally, this project has since inspired a follow-up project, which I am already excited about. Taken together, practicing science openly is as much reward as it is work!