This guide is meant to help organizers create a challenge space within Synapse to host a crowd-sourced challenge. A challenge space provides participants with a Project to learn about the challenge, join the challenge community, submit entries, track progress, and view results.
The following steps are automated by the `createchallenge` command in the `challengeutils` Python package. The instructions to install and use this package are located in the Challenge Utilities repository. This guide describes the key Synapse components generated by the tool.
Projects
The command `createchallenge` creates two Synapse Projects and initializes the Wiki of the Live challenge Project with a pre-registration page:

- Live challenge project - The Live challenge Project serves as the pre-registration page during challenge development. Once the challenge is ready to be launched, it is replaced with a challenge Wiki that allows participants to register for the challenge. Organizers fill in the content of the Wiki to provide detailed information about the challenge (e.g. challenge questions, data, participation, evaluation metrics). The Wiki page must be made public so that anyone can learn about the challenge and pre-register.
- Staging challenge project - This Project is used by the organizers during development of the challenge to share files and draft the challenge Wiki. `createchallenge` initializes the Wiki with the DREAM Challenge Wiki Template.

Maintenance of both a Staging and a Live Project enables Wiki content to be edited and previewed in the Staging Project before the content is published to the Live challenge Project.
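The two-Project layout can be sketched with the Synapse Python client. This is only an illustration of what `createchallenge` sets up, not the tool's actual code; the project names and Wiki text are hypothetical.

```python
import synapseclient
from synapseclient import Project, Wiki

# Log in with cached credentials (requires a Synapse account).
syn = synapseclient.login()

# Live project: public-facing, starts out as a pre-registration page.
live = syn.store(Project(name="My Example Challenge"))
syn.store(Wiki(owner=live, markdown="Pre-registration is now open!"))

# Staging project: organizers draft and preview Wiki content here
# before it is published to the live project.
staging = syn.store(Project(name="My Example Challenge - staging"))
```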
For background on how to create and share Projects, Files, Folders, and Wiki pages, please see our article Making a Project.
The command `createchallenge` creates three Synapse Teams:

- Challenge participant team - This Synapse Team includes the individual participants and Teams that register for the challenge.
- Challenge administrator team - The challenge organizers must be added to this Team to grant them the permissions to share files and edit the content of the Wiki on the Staging Project.
- Challenge pre-registration team - This Team is recommended while the challenge is under development. It allows participants to join a mailing list to receive notification when the challenge launches.

Please visit this page to learn more about Teams.
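The three Teams can likewise be sketched with the Synapse Python client; the Team names and the `"syn123"` project ID below are placeholders, and the permission list is illustrative.

```python
import synapseclient
from synapseclient import Team

syn = synapseclient.login()

# Illustrative names only; createchallenge derives them from the challenge name.
participants = syn.store(Team(name="My Example Challenge Participants",
                              canPublicJoin=True))
admins = syn.store(Team(name="My Example Challenge Admins"))
preregistration = syn.store(Team(name="My Example Challenge Preregistration",
                                 canPublicJoin=True))

# Grant the administrator team edit rights on the staging project
# ("syn123" stands in for its Synapse ID).
syn.setPermissions("syn123", admins.id,
                   accessType=["READ", "DOWNLOAD", "CREATE", "UPDATE", "DELETE"])
```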
The command `createchallenge` also connects the challenge participant team to the Live challenge Project to enable participant submissions.

The challenge data (e.g. training dataset, scoring data) are uploaded to the Live challenge Project when they are ready to be shared with participants.
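Uploading challenge data is an ordinary Synapse upload; a minimal sketch, where `"syn123"` stands in for the Live challenge Project's ID and the file name is hypothetical:

```python
import synapseclient
from synapseclient import File, Folder

syn = synapseclient.login()

# Organize the challenge data in a folder under the live project.
data_folder = syn.store(Folder(name="Data", parent="syn123"))
training = syn.store(File("training_data.csv", parent=data_folder))
```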
Synapse has the ability to apply access restrictions to sensitive data (e.g. human data), so that legal requirements are met before participants access such data. If human data are being used in the challenge, or if you have any questions about the sensitivity of the challenge data, please contact the Synapse Access and Compliance Team (act@sagebase.org) for support to ensure that the necessary data access approval procedures are put in place.

There are cases where there are no human data concerns and instead a pop-up agreement needs to be presented before the first data download. Contact the Access and Compliance Team to set up this agreement.
Please view the Access Controls page to learn how to add conditions for use on data.
Challenge participants can submit Synapse Entities (e.g. `File`, `Folder`, `Project`, `Docker`) to evaluation queues. Multiple evaluation queues can be created to support challenges with more than one question.
Please visit the Evaluation Queue article to learn more about queue configuration.
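Creating a queue and submitting to it can be sketched with the Synapse Python client; the queue name and the `"syn123"`/`"syn456"` IDs are placeholders.

```python
import synapseclient
from synapseclient import Evaluation

syn = synapseclient.login()

# One queue per challenge question; contentSource is the live project's ID.
queue = syn.store(Evaluation(name="My Example Challenge - Question 1",
                             contentSource="syn123",
                             description="Predictions for question 1"))

# A participant submits an entity (here, a file already in Synapse) to the queue.
submission = syn.submit(queue, "syn456", name="My first submission")
```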
One of the features of Synapse for DREAM Challenges is the live compilation of submission statistics for all evaluation queues, including total submission count, count per individual/team, count per submission state (scored, invalid), and count per week. You can see the statistics for various challenges here. In order to activate statistics for your evaluation queues, you must be an administrator of the challenge Project, and each queue needs to be configured to generate the statistics. To do this, share each evaluation queue with `evaluationstatistics`; the new entry will appear in the queue's list of accessors. Statistics are updated weekly. They are also retroactive - you do not have to enable statistics at the beginning of your challenge to have the entire history reported.
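The reported counts can also be reproduced locally from a list of submissions. The sketch below uses plain dictionaries in place of Synapse submission objects, with illustrative field names:

```python
from collections import Counter

def submission_stats(submissions):
    """Tally submissions the way the statistics page does:
    total, per submitter, and per state (field names are illustrative)."""
    return {
        "total": len(submissions),
        "per_submitter": Counter(s["submitter"] for s in submissions),
        "per_state": Counter(s["state"] for s in submissions),
    }

subs = [
    {"submitter": "alice", "state": "SCORED"},
    {"submitter": "alice", "state": "INVALID"},
    {"submitter": "team_b", "state": "SCORED"},
]
stats = submission_stats(subs)
print(stats["total"])                   # 3
print(stats["per_submitter"]["alice"])  # 2
print(stats["per_state"]["SCORED"])     # 2
```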
To launch the space, share the evaluation queues with the participant Team and use the `copyWiki` command provided by `synapseutils` to deploy the initial staging Wiki to the Live challenge Project. The Wiki of the Live Project will then be replaced by the Wiki in the Staging Project.

After the initial deployment, the `mirrorwiki` command provided by `challengeutils` mirrors the Wiki from the staging site to the live site throughout the duration of the challenge. The Wiki titles must match to maintain this connection.
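The initial deployment might look like the following sketch; `"syn123"` (staging) and `"syn456"` (live) are placeholder IDs, and the `mirrorwiki` invocation shown in the comment should be checked against your installed `challengeutils` version.

```python
import synapseclient
import synapseutils

syn = synapseclient.login()

# Copy the drafted Wiki from the staging project to the live project.
synapseutils.copyWiki(syn, entity="syn123", destinationId="syn456")

# After launch, keep the two Wikis in sync with challengeutils, e.g. from
# a shell (command shape per the challengeutils docs; may differ by version):
#   challengeutils mirrorwiki syn123 syn456
```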
Throughout the challenge, participants will continuously submit to the evaluation queues. To manage these submissions, organizers can automate validation and scoring with the Synapse Python client evaluation commands.
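A validation loop over new submissions might look like the following sketch. The evaluation ID, the `"RECEIVED"` status filter, and the file-extension check are all illustrative; a real challenge would apply its own validation rules.

```python
import synapseclient

syn = synapseclient.login()
queue = syn.getEvaluation("9610000")  # placeholder evaluation ID

# Iterate over unprocessed submissions and mark each one VALIDATED or INVALID.
for submission, status in syn.getSubmissionBundles(queue, status="RECEIVED"):
    # Download the submitted entity so we can inspect it.
    sub = syn.getSubmission(submission)
    is_valid = sub.filePath is not None and sub.filePath.endswith(".csv")  # toy rule
    status.status = "VALIDATED" if is_valid else "INVALID"
    syn.store(status)
```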
Organizers can create a leaderboard when scores are ready to be revealed to participants. Leaderboards are sorted, paginated, tabular forms that display submission annotations (e.g. scores from the scoring application and other metadata) and update as annotations or scores change. A leaderboard can provide real-time insight into the progress of a challenge.
Learn more about adding leaderboards in the Evaluation Queue article.
Try posting a question to our Forum.
Let us know what was unclear or what has not been covered. Reader feedback is key to making the documentation better, so please let us know or open an issue in our GitHub repository (Sage-Bionetworks/synapseDocs).