Welcome to the ML Reproducibility Challenge 2022. This is the sixth edition of the event (v1, v2, v3, v4, v5), in which we are accepting reproducibility reports on papers published at eleven top ML conferences: NeurIPS 2022, ICML 2022, ICLR 2022, ACL 2022, EMNLP 2022, CVPR 2022, ECCV 2022, AAAI 2022, IJCAI-ECAI 2022, ACM FAccT 2022, and SIGIR 2022, as well as papers published in 2022 in top ML journals, including JMLR, TACL, and TMLR.
The primary goal of this event is to encourage the publishing and sharing of scientific results that are reliable and reproducible. In support of this, the objective of the challenge is to investigate the reproducibility of papers accepted for publication at top conferences, by inviting members of the community at large to select a paper and verify its empirical results and claims by reproducing the computational experiments, either via a new implementation or using code/data or other information provided by the authors.
We are happy to announce the decisions of MLRC 2022 publicly! We had already communicated the decisions to the authors, but we held off on releasing them publicly until our camera-ready process was complete. We received 74 submissions, and this year marks yet another round of exceptionally high-quality reproducibility reports. While we did have to desk-reject several reports due to double-blind anonymity violations, formatting issues, and incorrect submissions, the quality of the submissions again improved sharply over last year! After an extensive peer review and meta-review process, we are delighted to accept 45 reports to the program, all of which raise the bar for the standard and process of reproducibility efforts in Machine Learning.
All papers, decisions and reviews can be viewed at our OpenReview platform.
Following the tradition set last iteration, we are presenting best paper awards to a few select reports to highlight the all-round excellent quality of their reproducibility work. The selection was based on votes from the Area Chairs, who considered the reproducibility motivation, experimental depth, results beyond the original paper, ablation studies, and discussion/recommendations. Since the quality of these top papers is exceptionally high, we decided to change the “Best Paper” award nomenclature to “Outstanding Paper” and “Outstanding Paper (Honorable Mention)” to more closely reflect the overall merit of the best-performing papers. We believe the community will appreciate the strong reproducibility effort in each of these papers, which will improve the understanding of the original publications and inspire authors to promote better science in their own work. Congratulations to all!
All accepted camera-ready papers will soon be available in our ReScience C journal publication, Volume 9, Issue 2. Congratulations to all authors!
Our program would not have been possible without the hard work and support of our reviewers. We would therefore also like to honor them for their timely, high-quality reviews, which enabled us to curate these reproducibility reports.
Kaggle deserves a special mention, as they partnered with us this iteration to provide awards to the best papers and reviewers. Kaggle has provided awards in the form of Google Cloud Platform (GCP) credits worth 500k USD, which are extremely beneficial for conducting exploratory research on Google's high-performance computing platform. Kaggle sponsored these awards to outstanding papers and reviewers based on the final decision of the Kaggle awards committee. We thank Kaggle for providing such a generous award and for enabling reproducible research in the Machine Learning community.
The challenge is a great event for community members to participate in shaping scientific practices and findings in our field. We particularly encourage participation from:
If you are an instructor participating in RC2022 with your course, we would love to hear from you and will be happy to list your course here! Please fill out the following form with your course details: https://forms.gle/NsxypsS2MTxNCj8f7.
For general queries regarding the challenge, mail us at reproducibility.challenge@gmail.com.