# UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation
UniUSNet is a universal framework for ultrasound image classification and segmentation, featuring:

- A novel promptable module that incorporates detailed ultrasound information into the model's learning process.
- Versatility across various ultrasound natures, anatomical positions, and input types.
- Proficiency in both segmentation and classification tasks.
- Strong generalization capabilities, demonstrated through zero-shot and fine-tuning experiments on new datasets.
For more details, see the accompanying paper and project page:

> UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation
> Zehui Lin, Zhuoneng Zhang, Xindi Hu, Zhifan Gao, Xin Yang, Yue Sun, Dong Ni, Tao Tan. BIBM, 2024.
## Installation

- Clone this repository:

  ```bash
  git clone https://github.com/Zehui-Lin/UniUSNet.git
  cd UniUSNet
  ```

- Create a new conda environment:

  ```bash
  conda create -n UniUSNet python=3.10
  conda activate UniUSNet
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
## Data Preparation

- BroadUS-9.7K consists of ten publicly available datasets: BUSI, BUSIS, UDIAT, BUS-BRA, Fatty-Liver, kidneyUS, DDTI, Fetal HC, CAMUS, and Appendix.
- You can prepare the data by downloading these datasets and organizing them as follows:
  ```
  data
  ├── classification
  │   ├── UDIAT
  │   │   ├── 0
  │   │   │   ├── 000001.png
  │   │   │   └── ...
  │   │   ├── 1
  │   │   │   ├── 000100.png
  │   │   │   └── ...
  │   │   ├── config.yaml
  │   │   ├── test.txt
  │   │   ├── train.txt
  │   │   └── val.txt
  │   └── ...
  └── segmentation
      ├── BUSIS
      │   ├── config.yaml
      │   ├── imgs
      │   │   ├── 000001.png
      │   │   └── ...
      │   ├── masks
      │   │   ├── 000001.png
      │   │   └── ...
      │   ├── test.txt
      │   ├── train.txt
      │   └── val.txt
      └── ...
  ```
- Please refer to the `data_demo` folder for examples.
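The authoritative split-file format is the one shipped in `data_demo`. As an illustration only, assuming each `train.txt`/`val.txt`/`test.txt` lists one image filename per line, a dataset's splits could be generated with a small script like this (hypothetical helper, not part of the repository):

```python
import random
from pathlib import Path


def write_splits(dataset_dir, train_frac=0.7, val_frac=0.1, seed=42):
    """Write train/val/test .txt files listing one image filename per line.

    Assumes a segmentation-style layout with an `imgs` subfolder, as in the
    directory tree above; adapt the glob for classification folders (0/, 1/).
    """
    dataset_dir = Path(dataset_dir)
    names = sorted(p.name for p in (dataset_dir / "imgs").glob("*.png"))
    random.Random(seed).shuffle(names)  # deterministic shuffle for reproducibility
    n_train = int(len(names) * train_frac)
    n_val = int(len(names) * val_frac)
    splits = {
        "train.txt": names[:n_train],
        "val.txt": names[n_train:n_train + n_val],
        "test.txt": names[n_train + n_val:],
    }
    for fname, lines in splits.items():
        (dataset_dir / fname).write_text("\n".join(lines) + "\n")
    return {k: len(v) for k, v in splits.items()}
```

Keeping the shuffle seeded means the same split files are regenerated on every run, which makes experiments comparable across machines.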
## Training & Testing

We use `torch.distributed` for multi-GPU training (single-GPU training is also supported). To train the model, run:

```bash
python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_train.py --output_dir exp_out/trial_1 --prompt
```

To test the model, run:

```bash
python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_test.py --output_dir exp_out/trial_1 --prompt
```
## Pretrained Checkpoints

- You can download the pre-trained checkpoints from BaiduYun or directly from the release page.
- To train your own model, download the Swin Transformer backbone weights and place the file in the `pretrained_ckpt/` directory. The folder structure should look like:

  ```
  pretrained_ckpt
  └── swin_tiny_patch4_window7_224.pth
  ```
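When initializing from a backbone-only checkpoint, a common pattern is to load with `strict=False` so that tensors present in the checkpoint but absent from (or shape-mismatched with) the model are simply skipped. The actual loading logic lives in the training code; the sketch below is only an illustration of that pattern, and the `"model"` nesting is an assumption based on how official Swin checkpoints are typically packaged:

```python
import torch
import torch.nn as nn


def load_backbone(model, ckpt_path):
    """Load pretrained weights into `model`, keeping only tensors whose
    name and shape match the model's own state dict.

    Returns the checkpoint keys that were skipped, which is useful for
    sanity-checking that the backbone actually matched.
    """
    state = torch.load(ckpt_path, map_location="cpu")
    # Official Swin checkpoints usually nest the weights under a "model" key.
    state = state.get("model", state)
    own = model.state_dict()
    matched = {k: v for k, v in state.items()
               if k in own and v.shape == own[k].shape}
    model.load_state_dict(matched, strict=False)
    return sorted(set(state) - set(matched))
```

Printing the returned skipped keys after loading is a quick way to confirm that only head or task-specific layers were left uninitialized.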
## Citation

If you find this work useful, please consider citing:

```bibtex
@inproceedings{lin2024uniusnet,
  title={UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation},
  author={Lin, Zehui and Zhang, Zhuoneng and Hu, Xindi and Gao, Zhifan and Yang, Xin and Sun, Yue and Ni, Dong and Tan, Tao},
  booktitle={2024 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)},
  pages={3501--3504},
  year={2024},
  organization={IEEE}
}
```
## Acknowledgements

This repository is based on the Swin-Unet repository. We thank the authors for their contributions.