Artifact Evaluation for Computer Systems Research
We work with the community and the ACM to improve the methodology and tools for reproducible experimentation, artifact submission and reviewing, and open challenges!
Sponsors and supporters
If you would like to sponsor this community service, including prizes for the highest-ranked artifacts and support for open-source technology enabling collaborative, customizable, and reproducible experimentation, please get in touch with the AE steering committee!
Important dates:
Paper decision: 25 Oct 2016
Artifact submission: 20 Nov 2016 (AoE)
Technical clarification: 5-10 Dec 2016
Decision announced: 13 Dec 2016
Final paper: 15 Dec 2016
Public discussion: 6 Feb 2017 (program)

CGO or PPoPP prizes
- High-end GPGPU card for the distinguished artifact
- $500 for the top experimental workflow in CK format
CGO AE Chair:
Joseph Devietti (University of Pennsylvania, USA)

AE Steering Committee: Grigori Fursin, Bruce Childers

Artifact Evaluation for CGO 2017

[ Back to CGO 2017 conference website ]

Artifact evaluation is finished - see accepted artifacts (with awards) here!

News: This year we saw a considerable increase in the number of submitted artifacts: 27 versus 18 two years ago. The Artifact Evaluation Committee, composed of 41 researchers and engineers, spent two weeks validating artifacts. Each artifact received at least three reviews, and only 8 artifacts fell significantly below the acceptance criteria.

Since our philosophy is that AE should act as a mechanism to help authors prepare their materials and replicate or reproduce experimental results, we spent one more week shepherding these artifacts. During this process we allowed back-and-forth anonymous communication between evaluators and authors to resolve concerns, documentation issues, and bugs. At the same time, we also successfully tried an "open reviewing model", in which we asked the community to publicly evaluate several artifacts already available on GitHub, GitLab, and other project hosting services. This allowed us to find external reviewers with access to very rare HPC servers or proprietary benchmarks and tools. With the help of such shepherding, a 100% success rate was achieved for all 27 artifacts, which reflects a significant achievement and effort by both authors and evaluators. We thank them all for their hard work!

All papers with evaluated artifacts received an AE seal and were allowed to add an Artifact Appendix of up to 2 pages to help readers better understand what was evaluated and how.

Authors of accepted CGO 2017 papers will be invited to formally submit their supporting materials to the Artifact Evaluation process. The Artifact Evaluation process is run by a separate committee whose task is to reproduce (at least some) experiments and assess how the artifacts support the work described in the papers. This submission is voluntary and will not influence the final decision regarding the papers.

Papers that successfully go through the Artifact Evaluation process will receive a seal of approval printed on the papers themselves. Authors of such papers will have an option to include their Artifact Appendix to the final paper (up to 2 pages). Authors are also encouraged (though not obliged) to make these materials publicly available upon publication of the proceedings, by including them as "source materials" in the Digital Library.

If you have any questions, please check the AE FAQ and do not hesitate to contact the AE chair and the steering committee, or post your question to our LinkedIn group.

How to submit

Please prepare your artifacts for submission using the following guide. Then, register your submission at the joint PPoPP/CGO EasyChair website - you will be asked to submit your paper title, author list, artifact abstract, a PDF of your paper with an appendix describing how to access and validate your artifacts, and any possible conflicts of interest with AE members.

To encourage reproducible experimentation and participation in artifact evaluation, NVIDIA will give a high-end GPGPU card for the highest-ranked artifact! To promote the sharing of artifacts and experimental workflows as reusable and customizable components, the cTuning Foundation and dividiti will give $500 for the highest-ranked experimental workflow implemented using the Collective Knowledge framework.

Reviewing process

Your artifacts will be reviewed according to the following guidelines. Artifacts receiving a score of "met expectations" or above will pass evaluation and receive a stamp of approval. The highest-ranked artifacts will receive prizes (to be announced).


We consider Artifact Evaluation a continuous learning process - our eventual goal is to develop a common methodology for experiment sharing and evaluation in computer systems research. Therefore, based on issues encountered during past AEs and on your feedback, we are currently developing open-source supporting technology for Artifact Evaluation.

If you have questions, comments and suggestions on how to improve artifact submission, reviewing, customization and reuse, please do not hesitate to get in touch with the AE steering committee!

Maintained by
cTuning foundation (non-profit R&D organization)
and volunteers!
Powered by Collective Knowledge