# MRGen: Segmentation Data Engine for Underrepresented MRI Modalities (ICCV 2025)

This repository contains MRGen-DB, the curated dataset proposed in [MRGen](https://arxiv.org/abs/2412.04106).
## Links

[Project Page](https://haoningwu3639.github.io/MRGen/) · [Paper](https://arxiv.org/abs/2412.04106) · [Dataset](https://huggingface.co/datasets/haoningwu/MRGen-DB) · [Checkpoints](https://huggingface.co/haoningwu/MRGen)
## Dataset

Please check out [MRGen-DB](https://huggingface.co/datasets/haoningwu/MRGen-DB) to download our curated dataset, which consists of two parts: `radiopaedia_data` and `conditional_dataset`.
For the conditional dataset, we directly provide our processed data, including the raw images, mask annotations, and text descriptions.
As described in our paper, out of respect for the data privacy policy of [Radiopaedia](https://radiopaedia.org), we release only the JSON files for the `radiopaedia_data` part here.
For each case, each slice is indexed by a path of the form `./radiopaedia/{patient_id}/{case_id}/{volume_id}/{slice_id}.jpeg`, for example, `./radiopaedia/2564/1/MRI_4/1.jpeg`.
This path lets you locate the corresponding original volume through the `link` field provided in our JSON files.
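As a rough illustration, the path convention above can be parsed with a few lines of Python (a minimal sketch; the function name and the keys of the returned dictionary are our own choices for illustration, not part of the dataset schema):

```python
from pathlib import PurePosixPath

def parse_slice_path(path: str) -> dict:
    """Split an MRGen-DB slice path into its identifier components.

    Expects the convention ./radiopaedia/{patient_id}/{case_id}/{volume_id}/{slice_id}.jpeg
    """
    parts = PurePosixPath(path).parts  # the leading "." is dropped automatically
    patient_id, case_id, volume_id, slice_file = parts[-4:]
    return {
        "patient_id": patient_id,
        "case_id": case_id,
        "volume_id": volume_id,
        "slice_id": PurePosixPath(slice_file).stem,  # strip the .jpeg extension
    }

print(parse_slice_path("./radiopaedia/2564/1/MRI_4/1.jpeg"))
```

The extracted identifiers can then be matched against the entries in the released JSON files to find the `link` of the original volume.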
After obtaining official authorization from Radiopaedia, you may download the data corresponding to the JSON files on your own.
Alternatively, you can email the authorization to us (`[email protected]` or `[email protected]`) to obtain the download link for the image data in MRGen-DB.
## Citation

If you use this dataset for your research or project, please cite:
```bibtex
@inproceedings{wu2025mrgen,
  author    = {Wu, Haoning and Zhao, Ziheng and Zhang, Ya and Wang, Yanfeng and Xie, Weidi},
  title     = {MRGen: Segmentation Data Engine for Underrepresented MRI Modalities},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year      = {2025},
}
```
## Contact

If you have any questions, please feel free to contact [email protected] or Zhao[email protected].