The ReXrank Challenge is a competition in chest X-ray report generation leveraging ReXGradient, our comprehensive multi-institutional dataset, running from December 1, 2024, to March 15, 2025. The competition welcomes participation from academic institutions, industry professionals, and independent researchers worldwide. We will evaluate model performance along multiple critical dimensions, including clinical accuracy and generalization capability across diverse institutions. A panel of distinguished radiologists will conduct thorough assessments of the highest-performing models. Top-performing participants will be invited to collaborate on future research initiatives and model development.
To evaluate your models, we have made available the evaluation script that will be used for official scoring, along with a sample prediction file in the format the script expects as input.
To run the evaluation, use:

python evaluate.py <path_to_data> <path_to_predictions>
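The official sample prediction file defines the exact schema; as a rough illustration, a prediction file might pair each study with its generated report. The field names below (`study_id`, `report`) are assumptions for this sketch, not the official format.

```python
import json
import os
import tempfile

# Hypothetical prediction format: one generated report per study.
# Consult the official sample prediction file for the real schema.
predictions = [
    {"study_id": "s50000001", "report": "No acute cardiopulmonary abnormality."},
    {"study_id": "s50000002", "report": "Mild cardiomegaly without pulmonary edema."},
]

def write_predictions(preds, path):
    """Serialize predictions to the JSON file passed to evaluate.py."""
    with open(path, "w") as f:
        json.dump(preds, f, indent=2)

path = os.path.join(tempfile.gettempdir(), "predictions.json")
write_predictions(predictions, path)

# Round-trip check: the file should load back unchanged.
with open(path) as f:
    loaded = json.load(f)
```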
1. Evaluating on the MIMIC-CXR Test Set
To obtain scores consistent with our leaderboard, please use the official MIMIC-CXR test split. You can download the file from here. Evaluation is performed at the study level. If the submitted model accepts multiple images, we will provide all images of a study; if it accepts only one image, we will default to the frontal image. We also supply context information such as patient age, patient gender, indication, and comparison. At submission time, you can indicate whether your model uses this information.
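The image-selection rule above can be sketched as follows. This is a minimal illustration, not the official harness: the record layout (`images`, `view` fields) is an assumption, though the frontal-view labels (`PA`, `AP`) match the ViewPosition values used in MIMIC-CXR metadata.

```python
# Hypothetical study record; field names are assumptions for this sketch.
study = {
    "study_id": "s50414267",
    "images": [
        {"path": "img1.jpg", "view": "LATERAL"},
        {"path": "img2.jpg", "view": "PA"},
    ],
}

def select_inputs(study, multi_image):
    """Return all images for multi-image models; otherwise default to the frontal view."""
    if multi_image:
        return [img["path"] for img in study["images"]]
    frontal = [img for img in study["images"] if img["view"] in ("PA", "AP")]
    # Fall back to the first image if no frontal view is present.
    chosen = frontal[0] if frontal else study["images"][0]
    return [chosen["path"]]
```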
2. Model Submission
Your model submission should include the following:
- an environment.yaml file supporting installation of your environment via conda
- an inference.py script runnable as: python inference.py <input_json_file> <output_json_file> <img_root_dir>
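As a rough sketch of the required command-line interface, an inference.py might look like the skeleton below. The input/output JSON layouts and the `generate_report` placeholder are assumptions for illustration; the MedVersa example shows the actual expected structure.

```python
import json
import sys

def generate_report(image_paths, context):
    """Placeholder: replace with your model's actual inference call."""
    return "No acute cardiopulmonary abnormality."

def main(input_json, output_json, img_root_dir):
    # Assumed input layout: a list of study records with image paths
    # relative to img_root_dir, plus optional context information.
    with open(input_json) as f:
        studies = json.load(f)
    results = {}
    for study in studies:
        paths = [f"{img_root_dir}/{p}" for p in study["images"]]
        results[study["study_id"]] = generate_report(paths, study.get("context", {}))
    with open(output_json, "w") as f:
        json.dump(results, f, indent=2)

if __name__ == "__main__" and len(sys.argv) == 4:
    main(sys.argv[1], sys.argv[2], sys.argv[3])
```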
We provide MedVersa as an example submission illustrating these requirements. Any questions or concerns? Please reach out to us by email.