Call for Application Proposals for the SC16 Student Cluster Competition
Replication and reproducibility of experimental computer science results in peer-reviewed papers are gaining relevance in the HPC community. SC, the leading conference in the field, wants to promote and support them through a new initiative that integrates aspects of past technical papers into the Student Cluster Competition (SCC).
SC16 invites authors of technical papers accepted at past SC conferences, including SC15, to submit proposals for case studies based on applications and tests from their SC paper that can be transformed into benchmarks for the SCC. This initiative gives SC authors a unique opportunity to further promote their published research as an example of replicable and reproducible experimental computer science. The solicitation aligns with other initiatives pursuing the replication and reproducibility of research artifacts, including the ACM Project on Data, Software, and Replicability and the ACM Transactions on Mathematical Software (TOMS) Replicated Computational Results Initiative. Participation provides the authors of the selected paper with a certified assessment of the replicability and reproducibility of their published research, among the first such assessments in our community. The selected paper will be supplemented with rigorous documentation of the experimental environment, the methodology for experimental replication, and the validated outcomes (results and performance). Authors will be recognized in a press release and at the SC16 award ceremony.
The SCC runs for about 48 hours straight, during which teams of students build and test high-end clusters whose configurations range from 6 to 18 nodes with a minimum of 2 GB of memory per core. Clusters are equipped with a mix of conventional CPUs (e.g., Intel and AMD) and GPUs (e.g., NVIDIA K20, K40, K80) or Xeon Phi, but may also contain novel architectures.
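As a rough illustration of what such configurations deliver, the theoretical peak of a CPU-only cluster can be estimated from its node count and chip specs. The hardware numbers below are illustrative assumptions, not competition specifications:

```python
# Rough peak-performance estimate for a hypothetical SCC cluster.
# All hardware figures here are illustrative assumptions, not
# official competition specifications.

def peak_tflops(nodes, chips_per_node, cores_per_chip,
                flops_per_cycle, clock_ghz):
    """Theoretical peak in TFlop/s:
    nodes * chips * cores * flops/cycle * clock (GHz) / 1000."""
    return (nodes * chips_per_node * cores_per_chip
            * flops_per_cycle * clock_ghz) / 1000.0

# Example: 8 nodes, dual 12-core CPUs, 16 DP flops/cycle (AVX2 FMA), 2.5 GHz
print(peak_tflops(nodes=8, chips_per_node=2, cores_per_chip=12,
                  flops_per_cycle=16, clock_ghz=2.5))  # → 7.68
```

A configuration like this lands in the 6-10 TFlop range referenced by the data-set requirement below; accelerators would raise the peak considerably.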
A suitable code for the case study must meet these requirements:
- The code must be usable on most architectures
- The code must run at least on conventional CPUs; preference will be given to codes that also support GPUs, Xeon Phi, or ARM
- The competition data set must be completable on a 6-10 TFlop cluster within 12-24 hours, or there must be a way to score incomplete runs
- The runs that the competing students are asked to reproduce must fit on a cluster as described above
- The code must be open source and/or available to student teams (including international teams; no export-control issues)
- Example data sets must be available for student teams to learn the application
- Data sets and results from the paper must be made available to the students and the committee
- All data sets and results must fit into 500 GB
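A proposer could sanity-check a candidate data set against the numeric limits in this call with a short script. The thresholds come from the list above; the candidate values are hypothetical, and the 12-24 hour window is read here as "must finish within 24 hours":

```python
# Sanity-check a candidate SCC benchmark against the numeric limits
# stated in this call. The candidate's values are hypothetical examples.

def meets_constraints(runtime_hours, total_data_gb,
                      scores_incompleteness=False):
    """True if the candidate fits the SCC requirements listed above."""
    # Completable within the 12-24 hour window on a 6-10 TFlop cluster,
    # OR the benchmark has a way to score incomplete runs.
    runtime_ok = runtime_hours <= 24 or scores_incompleteness
    # All data sets and results must fit into 500 GB.
    size_ok = total_data_gb <= 500
    return runtime_ok and size_ok

print(meets_constraints(runtime_hours=18, total_data_gb=320))  # → True
print(meets_constraints(runtime_hours=30, total_data_gb=320))  # → False
```

A run that exceeds the window is still acceptable if the benchmark defines partial-credit scoring, which the `scores_incompleteness` flag models.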
A committee will select one code and its associated paper from the submitted proposals. The paper's authors must be willing to assist SCC student teams and answer questions about their application and paper throughout the year. In addition, one of the authors must agree to serve as the application expert for the competition at SC16 and help judge the student teams on the application.
Proposals must include:
- A copy of the technical paper accepted at a previous SC conference
- A copy of the software or a pointer to an archive of the software
- Careful documentation of the process used to produce the results
- Any input data sets required to initialize, calibrate, or guide the simulation
Submission open: December 1, 2015
Submission close: January 15, 2016
Notifications: February 28, 2016
Hai Ah Nam