
The SORTEE guidelines for data and code quality control in ecology and evolutionary biology

Peer Community Journal, 2026
Quentin Petitjean, Joel L. Pick, Bethany J. Allen, Malgorzata Lagisz, Benedicte Bachelot, Jack A. Brand, Shinichi Nakagawa, Kevin R. Bairos-Novak, Barbara Class, Tad Dallas, Pietro B D'Amelio, Erola Fenollosa, Esteban Fernández-Juricic, Dylan Gomes, Matthew Grainger, Thomas Guillemaud, Christian John, Ruby Krasnow, Sébastian Lequime, Daniel S. Maynard, Rose E. O'Dea, Matthieu Paquet, Alfredo Sánchez-Tójar, Natalie E. van Dis, Laura A. B. Wilson, Edward R Ivimey-Cook

Summary

Scientists created new guidelines to help ensure that research data and computer code are properly checked before studies are published. Many scientific journals require researchers to share their data, but the quality of shared material is often poor and other scientists frequently cannot reproduce the results. These guidelines give editors a clear checklist for verifying research quality, which should lead to more trustworthy science that people can rely on for important decisions.

Open data and code are crucial to increasing transparency and reproducibility, and to building trust in scientific research. However, despite an increasing number of journals in ecology and evolutionary biology mandating that data and code be archived alongside published articles, the amount and quality of archived data and code, and the subsequent reproducibility of results, have remained worryingly low. As a result, a handful of journals have recruited dedicated data editors, whose role is to help authors increase the overall quality of archived data and code. There is, however, a general lack of consensus around what a data editor should check, how to do it, and to what level of detail, and the process is often vague and hidden from readers and authors alike. Here, with input from multiple data editors across several journals in ecology and evolutionary biology, we establish and describe the first standardised guidelines for Data and Code Quality Control on behalf of the Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology (SORTEE). We then introduce the SORTEE-led guidelines as a flexible six-stage framework that journals can implement incrementally and/or apply on a case-by-case basis, particularly when some checks (e.g., computational reproducibility) are not feasible (e.g., proprietary software). We conclude with practical advice for journals and authors, arguing that flexible adoption of these standardised guidelines will improve the consistency and transparency of the data editor process for readers, authors, data editors, and the wider scientific community.
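The abstract does not specify what an individual data check looks like, but the kind of archival-quality screening a data editor performs can often be partly automated. As a purely hypothetical illustration (the function name, checks, and example data below are our own assumptions, not part of the SORTEE guidelines), a minimal sketch of automated checks on an archived CSV dataset might look like:

```python
import csv
import io

def basic_data_checks(csv_text):
    """Run a few simple archival-quality checks on a CSV dataset.

    Returns a list of human-readable problems (empty list = all checks pass).
    These checks are illustrative only, not the SORTEE checklist itself.
    """
    problems = []
    reader = csv.reader(io.StringIO(csv_text))
    try:
        header = next(reader)
    except StopIteration:
        return ["file is empty"]
    # Every column should have a non-blank, unique name.
    if any(not name.strip() for name in header):
        problems.append("header contains unnamed columns")
    if len(set(header)) != len(header):
        problems.append("header contains duplicate column names")
    # Every data row should have the same number of fields as the header.
    for i, row in enumerate(reader, start=2):
        if len(row) != len(header):
            problems.append(f"row {i} has {len(row)} fields, expected {len(header)}")
    return problems

good = "species,mass_g\nparus_major,18.2\n"
bad = "species,species\nparus_major,18.2,extra\n"
print(basic_data_checks(good))  # []
print(basic_data_checks(bad))
```

Checks like these cover only the mechanical layer (file readability, consistent structure); the judgement-based stages of a quality-control framework, such as assessing metadata completeness or computational reproducibility, still require a human editor.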

