 
<span style="font-size:x-large"><span style="color: rgb(178, 34, 34)">Enabling collaborative, systematic and reproducible research, experimentation and development with an open publication model in computer engineering</span></span>
 
  
== Manifesto / motivation ==
  
 
Rather than writing yet another manifesto on reproducible research and experimentation in computer engineering, we have been enabling the sharing and reproduction of experimental results and artifacts in computer engineering since 2006, as a side effect of our MILEPOST and cTuning.org projects. We attempted to build a practical machine-learning-based self-tuning compiler, combining a plugin-based auto-tuning framework with the public cTuning repository of knowledge while crowdsourcing predictive analytics, but faced numerous problems, including:
 
*''Lack of common, large and diverse benchmarks and data sets needed to build statistically meaningful predictive models;''
*''Lack of a common experimental methodology and of unified ways to preserve, systematize and share our growing optimization knowledge and research material, including benchmarks, data sets, tools, tuning plugins, predictive models and optimization results;''
*''Problems with a continuously changing, "black box" and complex software and hardware stack with many hardwired and hidden optimization choices and heuristics, not well suited for auto-tuning and machine learning;''
*''Difficulty reproducing performance results submitted by users to the cTuning.org database, due to missing full software and hardware dependencies;''
*''Difficulty validating related auto-tuning and machine learning techniques from existing publications, due to the lack of a culture of sharing research artifacts with full experiment specifications along with publications in computer engineering.''

== Participating events ==
We have been releasing our own ...

After evangelizing for 7 years and working on the technical aspects, we have introduced validation of experimental results at:
* ADAPT'14

We collaborate with our colleagues from the AEC, who recently persuaded the following conferences to join a similar initiative:
* OOPSLA'13, PLDI'14
== Our expertise and work ==
* Set up evaluation of experimental results and all related material for workshops, conferences and journals
* Improve the sharing, dependency description and statistical reproducibility of experimental results and related material
* Improve the public Collective Mind repository of knowledge and collaborative experimentation infrastructure for computer engineering
* Validate the new open publication model
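To illustrate what we mean by statistical reproducibility of experimental results: a shared result should describe the distribution over repeated runs, not a single number. A minimal sketch (the function name, report fields and outlier threshold below are our own illustration, not part of any cTuning tool):

```python
import statistics

def summarize_runs(timings, outlier_z=3.0):
    """Summarize repeated measurements of one experiment.

    Instead of a single timing, report the spread over all runs and
    flag measurements that deviate strongly from the rest, so that
    results shared in a public repository can be compared meaningfully.
    """
    mean = statistics.mean(timings)
    stdev = statistics.stdev(timings) if len(timings) > 1 else 0.0
    outliers = [t for t in timings
                if stdev and abs(t - mean) / stdev > outlier_z]
    return {
        "runs": len(timings),
        "min": min(timings),
        "median": statistics.median(timings),
        "mean": mean,
        "stdev": stdev,
        "outliers": outliers,
    }

# Five repetitions of the same (hypothetical) benchmark:
print(summarize_runs([1.02, 0.98, 1.01, 0.99, 1.00]))
```

Reporting the median together with the spread and flagged outliers makes it possible to tell whether two published speedups actually differ or fall within normal run-to-run variation.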
== Community-driven reviewing of publications and artifacts ==
Pool

== Packaging artifacts for evaluation ==

* Links to tools for possible packaging
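As one hypothetical example of such packaging (the directory layout, file names and script contents below are purely illustrative, not a prescribed format), an artifact can bundle its code and data with an explicit dependency description and a single entry point, then be archived for submission:

```python
import os
import tarfile

# Hypothetical artifact layout; all names here are illustrative.
root = "my-artifact"
for sub in ("src", "data", "results"):
    os.makedirs(os.path.join(root, sub), exist_ok=True)

# Record the software dependencies so evaluators can rebuild the setup.
with open(os.path.join(root, "DEPENDENCIES.txt"), "w") as f:
    f.write("gcc >= 4.8     # compiler used for all measurements\n")
    f.write("python >= 3.6  # analysis scripts\n")

# Single entry point: build, run and collect results in one step.
with open(os.path.join(root, "run.sh"), "w") as f:
    f.write("#!/bin/sh\n"
            "set -e\n"
            "cc -O2 -o bench src/bench.c\n"
            "./bench > results/raw.txt\n")
os.chmod(os.path.join(root, "run.sh"), 0o755)

# Archive everything for submission alongside the paper.
with tarfile.open(root + ".tar.gz", "w:gz") as tar:
    tar.add(root)
```

The key property is that an evaluator needs only the archive, the dependency list and one command to reproduce the experiment.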
== Events ==
== Links ==
* [[Reproducibility:Links|Collection of related initiatives]]
* [[Reproducibility:Links|Collection of related tools]]

Revision as of 14:59, 28 June 2014

(C) 2011-2014 cTuning foundation