

New publication model

[ ACM DL ] [ ArXiv ]

LinkedIn group

[ Link ]

Collective Mind project:

[ Collective Mind live repository ], [ Framework ], [ Discussions ]

OCCAM project

[ Link ]

Special journal issue

IEEE TETC

[ Link ]

Artifact evaluation for software conferences

[ Link ]


Sponsors

ACM SIGPLAN

If your company or institution is interested in becoming a sponsor, please don't hesitate to contact us!


ACM SIGPLAN TRUST 2014 @ PLDI 2014

1st Workshop on Reproducible Research Methodologies
and New Publication Models in Computer Engineering
 

June 12, 2014 (afternoon), Edinburgh, UK

We particularly focus on technological aspects of enabling reproducible research and experimentation in empirical program and architecture analysis, optimization, simulation, co-design and run-time adaptation.

Panel on conference artifact evaluation experience

Final program

Proceedings in ACM Digital Library

1:30 - 1:35
Workshop introduction

Grigori Fursin, INRIA, France

1:35 - 1:55
Introducing OCCAM project

Bruce Childers, University of Pittsburgh, USA

1:55 - 2:20
Software in reproducible research: advice and best practice collected from experiences at the Collaborations Workshop.

Mario Antonioletti1, Neil Chue Hong1, Stephen Crouch2, Alexander Hay2, Simon Hettrick2, Devasena Inupakutika2, Mike Jackson1, Aleksandra Pawlik3, Giacomo Peru1, John Robinson2, Shoaib Sufi3, Les Carr2, David De Roure4, Carole Goble3, and Mark Parsons1.

1 EPCC, University of Edinburgh, UK
2 WAIS, University of Southampton, UK
3 University of Manchester, UK
4 Oxford eResearch Centre, Oxford University, UK

[ Slides ] [ Paper in ACM DL ]

2:20 - 2:45
Academia 2.0: removing the publisher middle-man while retaining impact

Raphael Poss1, Sebastian Altmeyer1, Mark Thompson2, Rob Jelier3
1 University of Amsterdam, Netherlands
2 Leiden University Medical Center (LUMC), Netherlands
3 KU Leuven, Belgium

[ Slides ] [ Paper in ACM DL ]

2:45 - 3:15
Coffee break (synchronized with all PLDI workshops)
3:15 - 3:50
Invited presentation: Sharing Specifications

Christian Collberg & Todd Proebsting, University of Arizona, USA

[ Slides ]

3:50 - 4:15
Falsifiability of network security research: the Good, the Bad, and the Ugly

Dennis Gamayunov, Lomonosov Moscow State University, Russia

[ Paper in ACM DL ]

4:15 - 4:40
CARE, the Comprehensive Archiver for Reproducible Execution

Yves Janin, Cedric Vincent and Remi Duraffort
STMicroelectronics, France

[ Slides ] [ Paper in ACM DL ]

4:40 - 6:00
Panel on conference artifact evaluation experience

Jan Vitek1, Shriram Krishnamurthi2, Christophe Dubach3, Grigori Fursin4

1 Purdue University, USA
2 Brown University, USA
3 University of Edinburgh, UK
4 INRIA, France

[ Slides ] [ Paper in ACM DL ]

Workshop organizers

Program committee

Call for papers

It is becoming excessively challenging, and sometimes impossible, to capture, share and accurately reproduce experimental results in computer engineering for fair and trustworthy evaluation and future improvement. This is often due to the ever-rising complexity of the design, analysis and optimization of computer systems, the increasing number of ad-hoc tools, interfaces and techniques, the lack of a common experimental methodology, and the lack of simple and unified mechanisms, tools and repositories to preserve and exchange knowledge beyond numerous publications, where reproducibility is often not even considered. This ACM SIGPLAN workshop is intended to become an interdisciplinary forum for academic and industrial researchers, practitioners and developers in computer engineering to discuss challenges, ideas, experience, trustworthy and reproducible research methodologies, and practical techniques, tools and repositories to:

  • capture, preserve, formalize, systematize, exchange and improve knowledge and experimental results including negative ones
  • describe and catalog whole experimental setups with all related material including algorithms, benchmarks, codelets, datasets, tools, models and any other artifact
  • validate and verify experimental results by the community
  • develop common research interfaces for existing or new tools
  • develop common experimental frameworks and repositories
  • share rare hardware and computational resources for experimental validation
  • deal with variability and the rising volume of experimental data using statistical analysis, data mining, predictive modeling and other techniques (see the sketch after this list)
  • implement previously published experimental scenarios (auto-tuning, run-time adaptation) using common infrastructure
  • discuss open access to publications and data, including intellectual property (IP) and legal issues
  • improve reviewing and evaluation process for publications and shared artifacts
  • enable interactive articles
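
As a concrete illustration of the variability point above, here is a minimal Python sketch (standard library only; the measured workload is a hypothetical placeholder, not a specific benchmark) that reports repeated measurements as a mean with a normal-approximation 95% confidence interval instead of a single, possibly unrepresentative run:

  # Minimal sketch: summarizing noisy experimental measurements.
  # measure() is a hypothetical placeholder for the real experiment.
  import statistics
  import time

  def measure():
      start = time.perf_counter()
      sum(i * i for i in range(100000))  # placeholder workload
      return time.perf_counter() - start

  samples = [measure() for _ in range(30)]
  mean = statistics.mean(samples)
  stdev = statistics.stdev(samples)
  # Normal-approximation 95% confidence interval for the mean.
  half_width = 1.96 * stdev / (len(samples) ** 0.5)
  print(f"mean = {mean:.6f}s +/- {half_width:.6f}s (95% CI, n={len(samples)})")

Reporting the interval alongside the mean makes it explicit how much run-to-run variability the experiment exhibits, which is exactly the information a later reproduction attempt needs.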

Submission guidelines

The EasyChair submission website is now closed.

We invite papers in three categories (please use these prefixes for your submission title):

  • T1: Position papers should be at most 3 pages long (excluding bibliography). We welcome preliminary and exploratory work, presentations of related tools and repositories in development, experience reports (particularly related to recent research validation initiatives at OOPSLA, PLDI and ADAPT), and any related ideas.
  • T2: Full papers should be at most 6 pages long (excluding bibliography). Papers in this category are expected to have relatively mature content.
  • T3: Papers that validate and share past research on the design and optimization of computer systems published at relevant conferences. These papers should be at most 6 pages long (excluding bibliography).

Submissions should be in PDF format, double-column, single-spaced, in 10pt font, printable on US Letter or A4 paper, and prepared with the standard ACM LaTeX2e template. All papers will be peer-reviewed. Authors will have the choice of publishing accepted papers on the conference website (which will not prevent later publication of extended versions) or in the ACM Digital Library (International Conference Proceedings Series; ISBN 978-1-4503-2918-7).

Important dates

  • Paper submission: March 19, 2014 (Anywhere on Earth)
  • Notification: April 14, 2014
  • Final version: May 6, 2014
  • Early registration: May 7, 2014 (link)
  • Workshop: June 12, 2014

Assorted related projects, initiatives and tools 

Assorted tools

  • CARE tool from STMicroelectronics (Comprehensive Archiver for Reproducible Execution)
  • CDE tool (automatically create portable Linux applications with all dependencies; see the sketch after this list)
  • Docker tool (pack, ship and run applications as a lightweight container)
  • IPython Notebook (a web-based interactive computational environment where you can combine code execution, text, mathematics, plots and rich media into a single document)
  • cTuning technology (crowdsourced auto-tuning combined with machine learning) (2006-present)
  • cTuning's Collective Mind technology (towards collaborative, systematic and reproducible computer engineering) (2011-2014)
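
In the same spirit as the archiving tools above (CARE, CDE), here is a minimal Python sketch (standard library only; the manifest layout and the input file name are illustrative assumptions, not any tool's actual format) that records basic environment metadata and input hashes next to an experiment, so a later run can check that it uses the same setup:

  # Minimal sketch: writing an experimental-setup manifest.
  # The manifest layout and "dataset.csv" are illustrative assumptions.
  import hashlib
  import json
  import platform
  import sys

  def sha256_of(path):
      # Hash an input artifact so later runs can verify the same data is used.
      with open(path, "rb") as f:
          return hashlib.sha256(f.read()).hexdigest()

  manifest = {
      "python": sys.version,
      "platform": platform.platform(),
      "machine": platform.machine(),
      "inputs": {p: sha256_of(p) for p in ["dataset.csv"]},
  }

  with open("manifest.json", "w") as f:
      json.dump(manifest, f, indent=2)

Dedicated tools such as CARE or CDE go much further by capturing binaries and system dependencies; a manifest like this only records enough metadata to detect, not prevent, a mismatched setup.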

(C) 2011-2014 cTuning foundation