Reproducibility

Debrief of SORTEE Code Club: Hacky Hour Code Review Exercise - Tuesday May 21

In May’s Hacky Hour, we did a code review exercise using the 17-step checklist for Ecology and Evolution. Participants reviewed each other’s code, or code from already published papers, and discussed what would constitute the “perfect” piece of Open, Reliable and Transparent (ORT) code.

Continue reading

Debrief of SORTEE Code Club: Code Review Checklist for Ecology and Evolution - Tuesday April 16

In this month’s Training Session, Stefan Vriend, Freddy Hillemann and Joey Burant hosted a workshop on how to code review a manuscript, using a checklist they developed for Ecology and Evolution.

Continue reading

Debunking myths around open data

Introduction

Scientific research has driven many advances and methodological innovations. However, modern scientists work under constant pressure to produce a high volume of publications and statistically significant results, and some resort to questionable research practices as a result. In a survey examining how widespread these practices are in the field of Ecology and Evolutionary Biology, the majority of participants admitted to having used a questionable practice in the past: 64% of respondents had reported only the statistically significant results of an analysis (cherry picking) in at least one publication, 42% had collected additional data after checking the significance of their results (p-hacking), and 51% admitted to presenting an unexpected result as their initial hypothesis (HARKing).

Continue reading

Debrief of SORTEE Code Club: Hacky Hour - Tuesday March 19

The Member Engagement Committee runs Code Club every third Tuesday of the month. Time can vary depending on the host and will be announced at least two weeks in advance on SORTEE’s Slack.

In this month’s Hacky Hour, 9 participants shared their code mistakes, starting up the SORTEE library of code mistakes! The goal is twofold: the normalization of coding errors and building a resource of (common) code mistakes that you can use during code review.

Continue reading

Debrief of SORTEE Code Club: Kickoff Meeting - Tuesday February 20

The Member Engagement Committee is breathing new life into the peer code review club: we will run Code Club every third Tuesday of the month. Time can vary depending on the host and will be announced at least two weeks in advance.

With 13 participants, we kicked off the first Code Club of 2024, learning how code review can make coding a more collaborative process in the scientific research cycle.

Continue reading

Setting the record straight: how data and code transparency caught an error and how I fixed it

“We were unable to reproduce your results, and I think the reason is that there is a bug in how you are calculating your correlation coefficients.”

That was part of an email I got this summer that absolutely crushed me. It doesn’t take much empathy to imagine the knot in your stomach and the existential dread of imposter syndrome, especially if you are a graduate student, post-doc, or other early-career researcher. What happens when you or someone else catches an error post-publication is not something most scientists know; certainly none of my peers or advisors did. But I got to experience the process with a supportive group of colleagues, advisors, and journal editors. I’m not writing this piece to commiserate over the fears, anxieties, and setbacks we face as scientists, nor am I going to belabor the details of the analysis or the justifications for how I missed the error. I’m writing this because the larger picture of the scientific process, when aided by data transparency, works to make our collective knowledge better.

Continue reading