Freehand ultrasound without external trackers
Apache License 2.0

This repository contains algorithms for training deep neural networks on scans of freehand ultrasound image frames, whose ground-truth frame locations were acquired with external spatial trackers. The aim is to reconstruct the spatial frame locations, or the relative transformations between frames, on newly acquired scans.
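As a small illustration of the prediction target described above, the relative transformation between two tracked frames can be computed from their tracker-reported poses, represented as 4x4 homogeneous matrices. This is a minimal sketch; the function and variable names are illustrative and are not part of this repository's API.

```python
import numpy as np

def relative_transform(T_i, T_j):
    """Transformation mapping frame j's coordinates into frame i's
    coordinate system, given tracker poses T_i and T_j (4x4 homogeneous)."""
    return np.linalg.inv(T_i) @ T_j

# Example: frame j is translated 5 mm along z relative to frame i.
T_i = np.eye(4)
T_j = np.eye(4)
T_j[2, 3] = 5.0
T_rel = relative_transform(T_i, T_j)
```

A network trained on such pairs predicts T_rel directly from the image content, so no tracker is needed at inference time.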

The data can be downloaded here. We have collected a new large freehand ultrasound dataset and are organising the MICCAI 2024 TUS-REC Challenge. See Part 1 and Part 2 of the training dataset.

Steps to run the code

1. Clone the repository.

git clone https://github.com/ucl-candi/freehand.git

2. Navigate to the root directory.

cd freehand

3. Create the conda environment and install dependencies.

conda create -n FUS python=3.9.13
conda activate FUS
pip install -r requirements.txt

4. Download the data and put Freehand_US_data.zip into the ./data directory (you may need to install zenodo_get).

pip3 install zenodo_get
zenodo_get 7740734
mv Freehand_US_data.zip ./data

5. Unzip Freehand_US_data.zip into the ./data/Freehand_US_data directory.

unzip data/Freehand_US_data.zip -d ./data

6. Make sure the data folder structure matches the following.

├── data/
│   ├── Freehand_US_data/
│   │   ├── 000/
│   │   │   ├── *.mha
│   │   │   ├── ...
│   │   ├── ...
│   │   ├── 018/
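A quick way to sanity-check this layout before processing is to list the scan folders and flag any that contain no .mha files. This is a hypothetical helper, not part of the repository; it is demonstrated here on a throwaway directory that mimics the expected structure.

```python
import tempfile
from pathlib import Path

def check_data_layout(root):
    """List scan subfolders under root and flag any without .mha files."""
    root = Path(root)
    subjects = sorted(p for p in root.iterdir() if p.is_dir())
    missing = [p.name for p in subjects if not any(p.glob("*.mha"))]
    return subjects, missing

# Demonstrate on a temporary directory mimicking ./data/Freehand_US_data.
demo = Path(tempfile.mkdtemp())
for name in ("000", "018"):
    (demo / name).mkdir()
    (demo / name / "frame.mha").touch()
subjects, missing = check_data_layout(demo)
```

To check the real download, call check_data_layout("data/Freehand_US_data") and confirm that missing is empty.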

7. Process the data (generate a single .h5 file from the downloaded .mha files).

python data/prep.py
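The repository's data/prep.py is the authoritative conversion script. As a rough, hypothetical sketch of the packing step it performs, per-scan frame stacks can be written into a single HDF5 file with h5py; the dataset names and layout below are assumptions for illustration, not the repository's actual format, and a synthetic array stands in for frames decoded from the .mha files.

```python
import numpy as np
import h5py

def pack_scans(scans, out_path):
    """Write a dict of scan_id -> (num_frames, H, W) arrays into one
    HDF5 file, with one compressed dataset per scan."""
    with h5py.File(out_path, "w") as f:
        for scan_id, frames in scans.items():
            f.create_dataset(scan_id, data=frames, compression="gzip")

# Synthetic stand-in for frames decoded from the .mha files.
scans = {"000": np.zeros((4, 8, 8), dtype=np.uint8)}
pack_scans(scans, "demo.h5")
with h5py.File("demo.h5", "r") as f:
    shape = f["000"].shape
```

Storing all scans in one file avoids repeatedly parsing individual .mha files during training.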

8. Train model

python scripts/train.py

9. Test model

python scripts/test.py

If you find this code or dataset useful for your research, please consider citing some of the following works: