General Information

Please register for our challenge before making submissions. Note that only participants with verified accounts can submit entries for this phase. Our challenge is organized as an "algorithm submission" to enable fair comparison across participants, and the evaluation datasets for all phases are hidden. Therefore, participants must submit a Docker container to be ranked on the leaderboard.

Form a Team

Participants can form teams through the Teams section on the challenge website.

Baseline Example

A baseline example with inference code (nnU-Net for tissue segmentation and HoVer-NeXt for nuclei segmentation) is provided for each track on GitHub.

How to Submit an Algorithm

Please follow the steps below to submit an algorithm. For more details, see the respective GitHub repositories for track 1 and track 2.

  1. Develop a pipeline or model that takes .tif images as input and outputs a .json file for nuclei segmentation and a .tif file for tissue segmentation (a minimal I/O sketch is given after this list).
  2. Clone the GitHub repository provided by the organizers for each track.

  3. Update the inference.sh bash script appropriately for your model.

  4. See output_rename.py for the specifics of renaming the tissue mask output to images/melanoma-tissue-mask-segmentation/<uuid>.tif so that it works with the Grand Challenge platform (a second sketch follows this list).
  5. Export the algorithm Docker container as a .tar.gz file; compressing with gzip significantly reduces the container size (see also save.sh).
  6. Submit the created .tar.gz file through the submission system on the challenge website.
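
For orientation, the sketch below shows the kind of entry point the container needs for step 1: read .tif images and write a .json file for nuclei segmentation plus a .tif mask for tissue segmentation. The /input and /output paths, the placeholder predict function, and the nuclei JSON structure are assumptions, not the official interface; follow the track repositories for the exact paths and formats.

```python
"""Minimal sketch of a container entry point (step 1).

Assumptions (not the official interface): inputs are mounted at /input,
outputs are collected from /output, and the nuclei JSON shown here is
only a placeholder. See the track repositories for the exact formats.
"""
import json
from pathlib import Path

import numpy as np
import tifffile

INPUT_DIR = Path("/input")    # assumed input mount point
OUTPUT_DIR = Path("/output")  # assumed output directory


def predict(image: np.ndarray) -> tuple[dict, np.ndarray]:
    """Placeholder for your model: returns (nuclei annotations, tissue mask)."""
    nuclei = {"annotations": []}                             # hypothetical schema
    tissue_mask = np.zeros(image.shape[:2], dtype=np.uint8)  # empty mask
    return nuclei, tissue_mask


def main() -> None:
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    for image_path in sorted(INPUT_DIR.glob("*.tif")):
        image = tifffile.imread(image_path)
        nuclei, tissue_mask = predict(image)
        # Nuclei segmentation: one .json file per input image.
        with open(OUTPUT_DIR / f"{image_path.stem}.json", "w") as f:
            json.dump(nuclei, f)
        # Tissue segmentation: one .tif mask per input image.
        tifffile.imwrite(OUTPUT_DIR / f"{image_path.stem}.tif", tissue_mask)


if __name__ == "__main__":
    main()
```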
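
The next sketch illustrates the kind of renaming described in step 4, assuming the tissue mask has to end up at /output/images/melanoma-tissue-mask-segmentation/<uuid>.tif. It is not the actual output_rename.py; the script in the track repository is authoritative.

```python
"""Illustrative sketch of the renaming done in step 4.

Assumption: the platform expects the tissue mask under
/output/images/melanoma-tissue-mask-segmentation/ with a UUID filename.
Refer to output_rename.py in the track repository for the real logic.
"""
import shutil
import uuid
from pathlib import Path

OUTPUT_DIR = Path("/output")  # assumed output root, as in the sketch above
TARGET_DIR = OUTPUT_DIR / "images" / "melanoma-tissue-mask-segmentation"


def rename_tissue_masks() -> None:
    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    for mask_path in OUTPUT_DIR.glob("*.tif"):
        # Move each tissue mask into the expected folder under a fresh UUID name.
        shutil.move(str(mask_path), str(TARGET_DIR / f"{uuid.uuid4()}.tif"))


if __name__ == "__main__":
    rename_tissue_masks()
```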

In the test phase, two additional items must be submitted: (a) a report and (b) a GitHub repository URL. Details for each are given below.

(a) A 4-page report in Springer format that summarizes the participants' method. The report should include the information needed to reproduce the results, for example data processing, hyperparameter tuning, the method itself, and preliminary results on the validation set.

(b) The URL of a GitHub repository that provides the source code. It should include the environment (e.g., Docker) and all source code needed to generate the results. The repository may be either public or private before the challenge ends; however, it must be public when the challenge ends and remain public afterward. Participants must mention the PUMA 2024 challenge in the README of their GitHub repository.

Limitation on the Number of Submissions

We limit the number of submissions to 10 for the preliminary test phase and to 1 for the test phase. Note that the final ranking is based only on the test phase score, not the validation phase score.