We were honored to collaborate with the following conferences to organize artifact evaluation for accepted papers and to reproduce experimental results, based on our unified Artifact Appendix and Reproducibility Checklist and our continuously updated submission and evaluation procedures for research artifacts:
Event | AE website | Event website | Accepted artifacts
----- | ---------- | ------------- | ------------------
HPCA'24 | Link | Link |
IISWC'24 | Link | Link |
ASPLOS'24 | Link | Link |
MICRO'23 | Link | Link |
ASPLOS'22 | Link | Link |
ASPLOS'21 | Link | Link | Link
MLSys'20 | Link | Link | Link
ASPLOS'20 | Link | Link | Link
Supercomputing'19 | Link | Link | Artifact automation project for SCC (GitHub1, GitHub2)
MLSys'19 | Link | Link | CK platform
Computing Frontiers'19 | Link | Link |
PPoPP'19 - a new record: 20 of the 29 accepted papers (~70%) participated in the reproducibility initiative, a dramatic increase from ~30% at PPoPP'15! | Link | Link | ACM proceedings, our AE report, CK platform
ReQuEST-ASPLOS'18 - the 1st open tournament on reproducible and Pareto-efficient SW/HW co-design of deep learning (speed, accuracy, energy, cost) with automated artifact evaluation. Final results are available on the live ReQuEST scoreboard. | Link | Link | ACM proceedings, report, CK workflows, CK platform
IA3 2018 | Link | Link |
PPoPP'18 - we used the new ACM Artifact Review and Badging policy, which we co-authored in 2017 | Link | Link | Link, CK platform
CGO'18 - we used the new ACM Artifact Review and Badging policy, which we co-authored in 2017 | Link | Link | Link, CK platform
IA3 2017 | Link | Link | Link, CK platform [ See research paper with a portable CK workflow ]
Supercomputing'17 (based on our Artifact Appendix) | Link | Link |
CGO'17 | Link | Link | Link, CK platform [ Portable CK workflow, Paper with AE appendix and CK workflow, PDF snapshot of the interactive CK dashboard ]
PPoPP'17 | Link | Link | Link
PPoPP'16 | Link | Link | Link
SC'16 | Link | Link | Link
RTSS'16 | Link | Link | Link
CGO'16 | Link | Link | Link
PACT'16 | Link | Link | Link, CK platform
ADAPT'16 | Open reviewing via Reddit (see the motivation for our open reviewing and publication model) | Link | Link, CK platform
ADAPT'15 @ HiPEAC'15 (open Reddit-based reviewing) | Link | Link | Link
PPoPP'15 | Link | Link | Link
CGO'15 | Link | Link | Link
ADAPT'14 @ HiPEAC'14 | Link | Link | Link
GCC Summit'08 and GCC Summit'09 - our original attempt to create a public repository of code and data from published papers so that the community could validate and compare experimental results. Since then, we have been working to automate and standardize this complex process, and we are very grateful to the community for their positive feedback and support! | Link | Link | Link
See also our current and past events on collaborative and reproducible R&D.
We would like to thank the ACM task force on reproducibility, Prof. Shriram Krishnamurthi (artifact-eval.org), Matthias Hauswirth, and all our colleagues for fruitful discussions and feedback!