PDSW 2020:

5th International Parallel Data Systems Workshop


Held in conjunction with SC20

November 12, 2020
10:00 am - 6:30 pm (EST)
VIRTUAL WORKSHOP


Program Chair:

Inria, France

Program Vice Chair:

RIKEN, Japan

General Chair:

Argonne National Laboratory, USA


deadlines / camera ready instructions / reproducibility

-- see new updates (10/6/20) to deadlines below --


PDSW 2020 Technical Papers


We invite regular papers, which may optionally undergo validation of experimental results by providing reproducibility information through the PDSW Reproducibility Initiative.

Text formatting:

Submit a previously unpublished paper as a PDF file, indicating authors and affiliations. Papers may be up to 5 pages in no less than 10-point font, not counting references and optional reproducibility appendices. Papers must use the IEEE conference paper template available at: https://www.ieee.org/conferences/publishing/templates.html.

Please do not include page numbers on your paper!

Deadlines

Submission deadline: Paper (in PDF format) due Sept. 6, 2020, 11:59 PM AoE
Submissions website: https://submissions.supercomputing.org/
Notification: Sept. 28, 2020
Copyright forms due: Oct. 18, 2020 - note new date
Pre-recorded presentation due: Oct. 7, 2020, 11:59 PM AoE
Slides due before workshop: Oct. 7, 2020, 11:59 PM AoE - note new date
Camera ready files due: Oct. 18, 2020, 11:59 PM AoE - note new date
* Submissions must be in the IEEE conference format

Copyright instructions

Instructions on submitting the copyright form will be sent to each contact author. It is extremely important that authors submit their copyright information as soon as possible after receiving that email, as the paper will be denied publication without it. Once you have submitted your copyright information, you will receive further instructions on how to include it on your final paper.


Camera ready instructions


Copyright info: 

All final papers must include the appropriate copyright and bibliographic lines on their first page. The text varies depending on whether the authors are associated with a government or the paper's copyright is owned by a government. It is your responsibility to read the information that confirms your copyright and make sure the text is correct. The confirmation from IEEE will specify exactly what to include, and the full text must be included exactly as shown. All of this information must appear in the lower left-hand corner of the first page of your PDF file.

Previously copyrighted material: 

Make sure your manuscript contains no previously copyrighted material (whether text, images, or tables) unless it is properly cited. NOTE: This guideline also applies to material from your own previous papers; the only difference between reusing your own work and someone else's is that you do not need to put quotation marks around text that you wrote – it still needs to be cited any time it appears. Violation of these guidelines will result in your paper being withdrawn from the conference proceedings.


The PDSW Reproducibility Initiative

Ivo Jimenez, Carlos Maltzahn, UC Santa Cruz

The aim of this initiative is to bring reproducibility and validation of experiments into the digital age by enabling experiments to be digitally reproduced. Digital reproductions are perfect and do not degrade with the number of copies, in contrast to analog reproductions, which degrade with each copy of a copy. For experiments to be digitally reproduced like music, one needs a name for the experiment (i.e., a URL), a player, and a standard format that instructs the player how to reproduce the experiment. Digitally reproducing experiments is not a new concept; it has been practiced in the DevOps community for years in the context of testing. It is called “Continuous Integration” (CI), a practice that provides testing as a service by executing software delivery pipelines whenever a new change is pushed to a software repository. Popular examples are Travis CI, Jenkins, and GitLab CI. Such a CI service turns out to be a convenient entry point for “playing” an experiment, not only for the authors but also for reviewers, so they can easily validate reproducibility. This is why we also call digital reproducibility “automated reproducibility.” [...more]
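
To make the CI entry point concrete, below is a minimal sketch of a validation step that such a pipeline could run on every push: it re-executes the experiment and checks the measured metrics against a published baseline. The file names (run_experiment.sh, baseline.json, results.json), the JSON metric format, and the 10% tolerance are illustrative assumptions, not part of the PDSW Reproducibility Initiative or of any particular CI service.

#!/usr/bin/env python3
"""Hypothetical CI validation step (a sketch, not part of PDSW):
re-run an experiment and compare its output against a published baseline."""
import json
import subprocess
import sys

BASELINE_FILE = "baseline.json"   # published metrics (assumed file/format)
RESULTS_FILE = "results.json"     # metrics written by the experiment run
TOLERANCE = 0.10                  # accept results within 10% of the baseline


def main() -> int:
    # Re-run the experiment exactly as the authors scripted it; the CI
    # service invokes this on every push to the experiment repository.
    subprocess.run(["bash", "run_experiment.sh"], check=True)

    with open(BASELINE_FILE) as f:
        baseline = json.load(f)
    with open(RESULTS_FILE) as f:
        results = json.load(f)

    # Flag any metric that is missing or deviates too far from the baseline.
    failures = []
    for metric, expected in baseline.items():
        measured = results.get(metric)
        if measured is None or abs(measured - expected) > TOLERANCE * abs(expected):
            failures.append(metric)
            print(f"metric {metric!r}: expected ~{expected}, got {measured}")

    # A non-zero exit code makes the CI pipeline (e.g., Travis CI, Jenkins,
    # GitLab CI) mark the reproduction attempt as failed.
    return 1 if failures else 0


if __name__ == "__main__":
    sys.exit(main())

In a Travis CI, Jenkins, or GitLab CI configuration, the reproduction job would simply run a script like this one, so a failed comparison fails the pipeline and is visible to authors and reviewers alike.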