ucl-candi/freehand: Freehand ultrasound without external trackers

This repository contains algorithms for training deep neural networks on freehand ultrasound scans whose frames were acquired with ground-truth locations from external spatial trackers. The aim is to reconstruct, for newly acquired scans, the spatial frame locations or the relative transformations between frames without a tracker.

The data can be downloaded here. We have collected a new large freehand ultrasound dataset and are organising the TUS-REC Challenges at MICCAI 2024 and 2025. See Part 1 and Part 2 of the training dataset for TUS-REC2024, and the Train Data for TUS-REC2025.

Steps to run the code

1. Clone the repository.

git clone https://github.com/ucl-candi/freehand.git

2. Navigate to the root directory.

3. Install the conda environment.

conda create -n FUS python=3.9.13
conda activate FUS
pip install -r requirements.txt

4. Download the data and put Freehand_US_data.zip into the ./data directory. (You may need to install zenodo_get.)

pip3 install zenodo_get
zenodo_get 7740734
mv Freehand_US_data.zip ./data

5. Unzip.

Unzip Freehand_US_data.zip into the ./data/Freehand_US_data directory.

unzip data/Freehand_US_data.zip -d ./data

6. Make sure the data folder structure matches the following.

├── data/
│   ├── Freehand_US_data/
│   │   ├── 000/
│   │   │   ├── *.mha
│   │   │   ├── ...
│   │   ├── ...
│   │   ├── 018/

7. Data processing: generate one .h5 file from the downloaded .mha files.
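As a rough illustration of what this step produces, the sketch below packs per-scan frame arrays into a single .h5 file. The group layout (`frames/<scan id>`) is an assumption for illustration, not the repository's actual format; in practice each array would be read from the .mha files (e.g. with SimpleITK's `GetArrayFromImage(ReadImage(path))`) rather than synthesised.

```python
# Hypothetical sketch of step 7: pack per-scan frame stacks into one .h5 file.
# The dataset/group layout below is an assumption, not the repo's actual format.
import numpy as np
import h5py

def pack_scans(scans, out_path):
    """scans: dict mapping scan id (e.g. '000') to a (num_frames, H, W) array.
    In practice each array would come from reading the .mha files with SimpleITK."""
    with h5py.File(out_path, "w") as f:
        for scan_id, frames in scans.items():
            f.create_dataset(f"frames/{scan_id}", data=frames, compression="gzip")

# Synthetic stand-in for real ultrasound frames:
demo = {"000": np.zeros((5, 480, 640), dtype=np.uint8)}
pack_scans(demo, "demo.h5")
```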

8. Train the model.
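To convey the idea behind training, here is a minimal sketch under assumed design choices: a network takes two consecutive frames and regresses the 6-DoF relative transform (3 translations + 3 rotations) between them, supervised by tracker-derived ground truth. The architecture, loss, and parameterisation are placeholders; the repository's actual model differs.

```python
# Minimal training sketch (assumption: the repo's actual model and loss differ).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(2, 8, 3, stride=2, padding=1),  # 2 channels: frame t and frame t+1
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 6),                          # 6 relative-transform parameters
)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

frames = torch.randn(4, 2, 64, 64)            # dummy batch of frame pairs
target = torch.randn(4, 6)                    # tracker-derived ground truth
pred = model(frames)
loss = nn.functional.mse_loss(pred, target)   # one optimisation step
opt.zero_grad()
loss.backward()
opt.step()
```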

9. Test the model.
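One common way to evaluate such a model (sketched here as an assumption, not the repository's actual metric) is to chain the predicted frame-to-frame transforms into absolute frame poses and measure the accumulated drift against the tracker trajectory:

```python
# Sketch of trajectory evaluation: compose 4x4 relative transforms into
# absolute poses (first frame = identity) and measure translation drift.
import numpy as np

def accumulate(rel_transforms):
    """Chain 4x4 homogeneous relative transforms into absolute frame poses."""
    pose = np.eye(4)
    poses = [pose]
    for T in rel_transforms:
        pose = pose @ T
        poses.append(pose)
    return poses

# Dummy relative transforms: 1 mm translation along x per frame.
T = np.eye(4)
T[0, 3] = 1.0
poses = accumulate([T] * 10)
drift = np.linalg.norm(poses[-1][:3, 3])  # distance of final frame from origin
```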

If you find this code or dataset useful for your research, please consider citing some of the following works: