Scientists value transparency and reproducibility, but are rewarded for highlighting the novelty of unexpected findings. This is one reason why published research findings are hard to reproduce; see, for example, the Open Science Collaboration’s recent “Estimating the Reproducibility of Psychological Science,” which we helped coordinate (https://osf.io/ezcuj/wiki/home/).

When scientists preregister their research, they commit to key design and analysis decisions before seeing the data, so those decisions cannot be biased by the results and standard statistical tests retain their intended error rates (see the simulation sketch after this post). Though preregistration is required by law for clinical trials of medical treatments in humans, it is not yet widely practiced in other fields. We at the Center for Open Science have $1,000,000 to award as prizes to researchers who publish the results of their preregistered research. See https://cos.io/prereg

We’ll be back at 12 pm ET (9 am PT, 5 pm UTC) to answer your questions. Ask us anything!

Answering questions today:

- Courtney Soderberg is our Statistical and Methodological Consultant, who advises researchers on best practices in experimental design and statistical analysis to make their work more reproducible.
- Jolene Esposito works with researchers in Africa to improve the rigor of their work using the tools we’ve made, such as the Open Science Framework (osf.io).
- April Clyburne-Sherin is our Reproducible Research Evangelist, who conducts workshops to train researchers in reproducible research methods and open science tools.
- David Mellor works on encouraging researchers to preregister their work on the Open Science Framework.

Hello Reddit! http://imgur.com/DpMrjKV

(Edits: formatting, picture, our names)

Edit, 2 pm EST: Thanks for all of your questions, everyone! We’ve enjoyed talking to you. We’ll check back later today to see if any more questions are up. Follow us on Twitter: @OSFramework
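The statistical claim above is worth making concrete. Below is a minimal simulation sketch, not part of the AMA itself: it assumes each study measures five independent outcomes with no true effect, and compares an analyst who tests only the single preregistered outcome against one who reports whichever outcome yields the smallest p-value. The variable names and parameters (n_sims, n_per_study, and so on) are ours, chosen purely for illustration.

```python
"""Toy simulation: preregistration vs. flexible analysis under a true null.

Each simulated study measures five outcomes with no real effect. A
preregistered analyst tests only the outcome chosen in advance; a
flexible analyst tests all five and reports the best-looking one.
Both use the same one-sample t-test at alpha = 0.05.
"""
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sims, n_per_study, n_outcomes, alpha = 5000, 30, 5, 0.05

prereg_hits = flexible_hits = 0
for _ in range(n_sims):
    # Five independent null outcomes: any "significant" result is noise.
    data = rng.normal(0.0, 1.0, size=(n_outcomes, n_per_study))
    pvals = [stats.ttest_1samp(row, 0.0).pvalue for row in data]
    prereg_hits += pvals[0] < alpha      # outcome fixed before seeing data
    flexible_hits += min(pvals) < alpha  # outcome picked after seeing data

print(f"Preregistered false-positive rate: {prereg_hits / n_sims:.3f}")
print(f"Flexible-analysis false-positive rate: {flexible_hits / n_sims:.3f}")
```

Running this gives a false-positive rate near the nominal 0.05 for the preregistered test, but roughly 0.23 (about 1 - 0.95^5) when the best of five outcomes is reported after the fact, which is the sense in which preregistration keeps standard tests trustworthy.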
Last Thursday, our article “Estimating the Reproducibility of Psychological Science” was published in Science. Coordinated by the Center for Open Science, we conducted 100 replications of published results in psychology with 270 authors and additional volunteers. We observed a substantial decline effect: on average, replication effect sizes were markedly smaller than the original results. This community-driven project was conducted transparently, and all data, materials, analysis code, and reports are openly available on the Open Science Framework.

Ask us anything about our process and findings from the Reproducibility Project: Psychology, or about initiatives to improve transparency and reproducibility in science more generally. We will be back at 12 pm EDT (9 am PT, 4 pm UTC). AUA!

Responding are:

- Brian Nosek, Center for Open Science & University of Virginia
- Johanna Cohoon, Center for Open Science
- Mallory Kidwell, Center for Open Science

[EDITED BELOW]

Some links for context:

- PDF of the paper: http://www.sciencemag.org/content/349/6251/aac4716.full.pdf
- OSF project page with data, materials, code, reports, and supplementary information: https://osf.io/ezcuj/wiki/home/
- Open Science Framework: http://osf.io/
- Center for Open Science: http://cos.io/
- TOP Guidelines: http://cos.io/top/
- Registered Reports: https://osf.io/8mpji/wiki/home/

Edit, 12:04: Hi everyone! Mallory, Brian, and Johanna here to answer your questions!

Edit, 12:45: Our in-house statistical consultant, Courtney Soderberg, has joined us in responding to your methodological and statistical questions.

Edit, 3:50: Thanks, everyone, for all your questions! We’re closing up shop for the holiday weekend but will check back in over the next few days to give a few more responses. Thanks to all the RPP authors who participated in the discussion!