We are developing open-source CK technology and common methodology for reproducible experimentation, autotuning and ML via open challenges!


  •  2017.February: Our CGO'07 research paper received the "test of time" award - it motivated the development of the cTuning framework to crowdsource optimization!
  •  2017.February: We organized Artifact Evaluation panel at CGO/PPoPP'17 (Monday, 17:15-17:45, Austin, TX, USA)
  •  2017.February: We started preparing AI for collaborative optimization powered by CK: cKnowledge.org/ai
  •  2017.February: Last year we co-authored ACM's policy on Result and Artifact Review and Badging and prepared the Artifact Appendices now used at Supercomputing'17!
  •  2017.February: Michel Steuwer (University of Edinburgh) blogged about CK concepts
  •  2017.January: We can now compile and run Caffe (popular DNN framework) with all deps in a unified way using our CK portable workflow framework on Linux, Windows and Android!
  •  2017.January: Catch our team at HiPEAC'17 (Jan.23-25, Stockholm, Sweden)
  •  2017.January: One of the highest-ranked public artifacts from CGO'17 was implemented using our CK framework - see it at GitHub!
  •  2017.January: We wish you a very happy and successful New Year, and start it with several exciting internships available at dividiti (Cambridge, UK)
  •  2016.December: We released a new version of our open-source Android application to crowdsource benchmarking and optimization of various DNN libraries and models (Dec.27) [ grab it at Google Play; get sources from GitHub; see crowd-results (scenario "crowd-benchmark DNN libraries") ]
  •  2016.October: We presented our collaborative approach to workload benchmarking at ARM TechCon'16 (Oct.27, Santa Clara, USA)
  •  2016.October: We updated the list of CK-powered open R&D challenges in computer engineering
  •  2016.October: We presented the Collective Knowledge approach for unified artifact sharing at the MozFest'16 Open Science Session (Oct.27, London, UK)
  •  2016.September: We released CK V1.8.2 with continuous integration, support for farms of machines, new CK documentation, and new Open Science resources at GitHub!
  •  2016.August: Artifact Evaluation for PACT'16 was successfully completed!
  •  2016.June: Congratulations to Abdul Memon (PhD student advised by cTuning foundation researchers) for successfully defending his thesis "Crowdtuning: Towards Practical and Reproducible Auto-tuning via Crowdsourcing and Predictive Analytics" at the University of Paris-Saclay. Most of the software, data sets and experiments are not only reproducible but also shared as reusable and extensible components via Collective Mind and CK!
  •  2016.June: Dagstuhl workshop on Engineering Academic Software!
  •  2016.June: Our Collective Knowledge approach for collaborative and reproducible experimentation was presented at the Smart Anything Everywhere Workshop: Enhancing digital transformation in European SMEs!
  •  2016.May: Thanks to a one-year grant from Microsoft, we moved Collective Knowledge Repository to Azure cloud!
  •  Recent publications with our long-term vision: [DATE'16 (with artifacts), CPC'15 (with artifacts), Scientific Programming'14 (with artifacts), TRUST@PLDI'14, CK-based interactive report with OpenCL crowd-tuning].
[ News archive ]  [ All events ]  [ cTuning twitter ]

Our on-going community activities

Developing and supporting our open-source Collective Knowledge framework (CK) - a small, portable and customizable research platform to quickly prototype experimental workflows (such as multi-objective autotuning) from shared components with a unified JSON API; crowdsource and reproduce experiments; apply predictive analytics; and enable interactive articles. [ Framework at GitHub ], [ GCC/LLVM live crowd-tuning results ], [ Interactive report about CK-based OpenCL crowd-tuning ]
Promoting our new open, collaborative and reproducible research and publication model with community-driven reviewing and validation of results (see our proposal, the ADAPT workshop series and our wiki); improving artifact evaluation methodology for conferences and journals including CGO, PPoPP, PACT, RTSS and SC.
Helping the community share workloads and other artifacts as reusable components with JSON API using our open CK format via GitHub:
• Benchmarks and codelets (CPU/OpenCL/CUDA/etc): 150
• Data sets: 560 (~19500 via Google Drive and BitTorrent)
• Tools and libraries: 110
• Compilers: 43
Collaboratively improving CK-based Android app to crowdsource experimentation such as GCC/LLVM/OpenCL auto-tuning using shared workloads across Android-based mobile devices [ Live crowd results ]
Enabling reproducible and interactive papers using our latest CK technology: [ Interactive report ], [ DATE'16 ]
Improving our paper title generator (our joke about incremental research which motivated our open and collaborative R&D initiatives) [ web ], [ Android app ].
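The CK framework described above composes workflows from shared components that all speak the same unified JSON API: every action takes a JSON-compatible dictionary and returns one, with a `return` code signalling success or failure. The following is only an illustrative sketch of that convention, not actual CK source code; the function name, keys and the reported version are hypothetical.

```python
# Illustrative sketch (NOT the real CK source): a CK-style action
# takes a JSON-compatible dict and returns one. A 'return' key of 0
# means success; a non-zero value plus an 'error' message means
# failure. This uniform convention is what lets components be chained
# from the command line, Python or web services alike.

def detect_compiler_version(i):
    """Hypothetical CK-style action.

    Assumed input keys (for illustration only):
      'compiler' - compiler name, e.g. 'gcc'
    Output dict:
      'return'  - 0 on success, >0 on error
      'version' - detected version string
    """
    compiler = i.get('compiler', '')
    if not compiler:
        return {'return': 1, 'error': "'compiler' key is missing"}
    # A real module would invoke the compiler here; we return fake data.
    return {'return': 0, 'version': '7.1.0'}

if __name__ == '__main__':
    r = detect_compiler_version({'compiler': 'gcc'})
    if r['return'] > 0:
        raise SystemExit(r.get('error', 'unknown error'))
    print(r['version'])
```

Because every action follows the same dict-in/dict-out shape, error handling and composition stay uniform across all shared components.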

The cTuning foundation is a non-profit research and development organization. It is the outcome of Grigori Fursin's initiative, started in 2006, to build faster, smaller, more power-efficient and more reliable computer systems via an open repository of optimization knowledge, artifact sharing, reproducible experimentation, universal and multi-objective autotuning and crowd-tuning of real workloads across diverse hardware provided by volunteers, statistical analysis, predictive analytics, and run-time adaptation. We enable collaborative computer engineering via open challenges and actively collaborate with leading universities, Fortune 50 companies, ACM and dividiti.

The cTuning foundation is developing an open-source knowledge management framework and web-based repository to enable collaborative and reproducible experimentation in computer engineering while exposing it to powerful predictive analytics (statistical analysis, machine learning, detection of missing features, improvement of models) and collective intelligence. The latest version of our technology (Collective Knowledge, or CK) is available at GitHub, in the Google Play Store and in the Debian distribution. You can see the latest results from various crowdsourced experiments (such as GCC/LLVM crowd-tuning) in the CK live repository. You can see an example of a CK-based interactive article here (OpenCL crowd-tuning and machine-learning-based run-time adaptation). You can see the list of CK-powered open research challenges here. Here are a few papers describing our approach: [ DATE'16, CPC'15, JSC'14, TRUST@PLDI'14, GCC Summit'09, IJPP'11, TACO'10, PLDI'10, SMART'09, HiPEAC'09 ]

Our tools and techniques have been successfully used in multiple academic and industrial projects, helping the community unify, systematize, standardize and accelerate the previously ad-hoc, complex, time-consuming and error-prone process of benchmarking, autotuning (optimization) and co-design of computer systems (software and hardware). Enabling faster, smaller, cheaper, more energy-efficient and more reliable computer systems helps, in turn, to boost innovation in science and technology!

For example, our CK-based, universal, customizable and multi-objective auto-tuner can deliver up to 10x speedups and 30% energy savings on various OpenCL/CUDA/CPU libraries without sacrificing accuracy (or up to 20x speedups with a small accuracy degradation), as shown in a CK-powered interactive graph below for a popular SLAM algorithm (all points can be reproduced/validated on practically any platform using CK):
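The core idea behind such multi-objective tuning can be sketched in a few lines: sample optimization choices, measure several objectives per configuration, and keep only the Pareto-optimal points. This is a minimal, self-contained illustration; the flag names, the unroll factors and the `measure` function are synthetic stand-ins, not CK's actual search engine or real measurements.

```python
# Minimal sketch of multi-objective autotuning in the spirit of CK:
# randomly sample configurations, measure (time, energy) for each,
# and keep the Pareto front. All measurements here are synthetic.
import random

FLAGS = ['-O1', '-O2', '-O3']   # hypothetical choice dimensions
UNROLL = [1, 2, 4, 8]

def measure(flag, unroll):
    """Stand-in for a real compile-and-run step: returns a
    (execution_time, energy) pair for one configuration."""
    rng = random.Random((flag, unroll))  # deterministic fake data per config
    return (rng.uniform(1.0, 10.0), rng.uniform(1.0, 10.0))

def dominates(a, b):
    """True if point a is no worse than b in every objective and
    strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def autotune(samples=20):
    points = []
    for _ in range(samples):
        cfg = (random.choice(FLAGS), random.choice(UNROLL))
        points.append((cfg, measure(*cfg)))
    # Keep only Pareto-optimal (non-dominated) configurations.
    return [p for p in points
            if not any(dominates(q[1], p[1]) for q in points)]

if __name__ == '__main__':
    for cfg, (t, e) in autotune():
        print(cfg, 'time=%.2f' % t, 'energy=%.2f' % e)
```

Keeping the whole Pareto front, rather than a single "best" point, is what lets users trade speed against energy (or accuracy) after the fact, which matches the multi-objective trade-offs shown in the graph above.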

Since the beginning of his own research in 1997, Grigori Fursin realized that it would not be possible to advance the field without sharing artifacts, reproducing empirical experiments, validating others' results and enabling fair comparison of techniques. He therefore spent considerable effort sharing all artifacts from his own research and promoting a new publication model where results are validated by the community. The cTuning foundation continued this effort and recently organized the first workshop in computer engineering where publications and results were validated by the community (see the ADAPT workshop). Furthermore, together with other colleagues, we persuaded several major conferences in computer engineering to start artifact evaluation. We are now part of the ACM taskforce on reproducible research and experimentation! We also actively participate in various international research projects developing novel techniques to improve machine learning, data mining, knowledge discovery, statistical analysis, feature detection, experiment crowdsourcing and so on.

[ Our history ] , [ Our educational activities ], [ Community ]
We would like to thank the following organizations and companies for supporting our community service:

You are very welcome to join our foundation or simply donate here to support our public activities and keep cTuning.org and cknowledge.org live:

Website is powered by CK
Grigori Fursin started the cTuning project in 2006 to share artifacts and predictive models with the community,
opened cTuning.org in 2008, and established the non-profit cTuning foundation in 2014 to support the Collective Knowledge initiative.