This year, we are forming an artifact review committee to collect, evaluate, and display artifacts associated with accepted papers, and we encourage you to submit your artifacts for review. The goal of this process is to give authors a way to share work beyond the contents of the paper itself that aids the reproducibility of results and allows other researchers and community members to build on the work reflected in the paper.
Possible artifacts include (but are not limited to):
- source code (e.g., system implementations, proof of concepts)
- datasets (e.g., network traces, raw study data)
- scripts for data processing or simulations
- machine-generated proofs
- formal specifications
- build environments (e.g., VMs, Docker containers, configuration scripts)
Submission of artifacts is encouraged but optional. Submitted artifacts will be evaluated by the artifact review committee, which will provide feedback on possible bugs in the build environment, the readability of the documentation, and appropriate licensing. After your artifact has been approved by the committee, we will accompany the paper link on petsymposium.org with a link to the artifact and an artifact badge, so that interested readers can find and use your hard work.

Artifact Submission Guidelines
For PoPETs 2020, we will be making a soft start with artifact reviews: we will only perform basic checks for proper documentation, licensing, and compilation of artifacts, rather than an in-depth analysis of the source code or the reproducibility of the results in the paper. Specifically, we ask that authors provide the following:
- Proper licenses for all submitted artifacts. The goal is for these artifacts to be made public and used by other groups wishing to reproduce or build on the results.
- Clear documentation of the artifact and its use, such as well-documented descriptions of provided datasets or a clear, easy-to-follow README for building and running code
- A build environment such as a Docker container or VM for compiling or running provided code
- Reviewers should be able to build compiled code in the provided environment by following the instructions in a README or INSTALL file. For interpreted code, the program(s) should run without error on some provided inputs
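To illustrate the build-environment requirement above, a minimal Dockerfile along the following lines can pin the toolchain so reviewers can compile the artifact with a single command. This is a hedged sketch, not a committee-mandated template; the base image, package list, and build commands are hypothetical placeholders for whatever your artifact actually needs:

```dockerfile
# Hypothetical example: pin a specific base image so reviewers
# get the same toolchain the authors used.
FROM ubuntu:18.04

# Install only the build dependencies the artifact needs
# (the packages listed here are placeholders).
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake \
    && rm -rf /var/lib/apt/lists/*

# Copy the artifact source into the image and build it.
COPY . /artifact
WORKDIR /artifact
RUN cmake . && make
```

Pairing a container like this with a short README (e.g., `docker build -t artifact .` followed by `docker run artifact`) lets reviewers verify the compilation check without installing dependencies on their own machines.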