release208:
docker pull ibeatgroup/ibeat_v2:release208
- Fixed an issue where the --skip_surface option may not work properly in some cases.
- --skull_prob_thresh has been removed. Instead, we provide two options, --t1_skull_prob_thresh and --t2_skull_prob_thresh, to make it more flexible for different modalities, especially when two modalities are input together.

release205:
docker pull ibeatgroup/ibeat_v2:release205

release200:
docker pull ibeatgroup/ibeat_v2:release200

release120:
docker pull ibeatgroup/ibeat_v2:release120

release110:
docker pull ibeatgroup/ibeat_v2:release110

This README illustrates how to install and run the Docker version of the iBEAT V2.0 pipeline, an infant-dedicated structural processing pipeline for infant brain MR images. More details of the pipeline can be found at iBEAT V2.0 Cloud.
Since this is a Linux-based container, please install the container on a Linux system. The supported systems include, but are not limited to, Ubuntu, Debian, CentOS, and Red Hat.
The pipeline is developed based on deep convolutional neural network techniques, so a GPU is required for the processing. Around 3 GB of GPU memory is needed while the pipeline runs.
Please refer to the official installation guideline Install Docker on Linux. You can follow the commands below to install Docker on the popular Ubuntu Linux system. If you have previously installed Docker on your computer, please skip this step.
sudo apt update
sudo apt install ca-certificates curl gnupg lsb-release
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
Since Docker needs sudo to run its general commands, to run it without sudo you need to add your user name ${USER} to the docker group, which can be done with:
sudo groupadd docker
sudo usermod -aG docker ${USER}
After running these commands, you may need to log out and back in (or restart your computer) for the configuration to take effect. One easy way to check whether you have successfully installed Docker is to run the command docker info, which does not require sudo once you have added your username to the docker group. Please refer to the Docker documentation for potential installation issues.
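As a quick sanity check, the snippet below (a sketch using only standard shell tools; the docker group name is the one created above) reports whether your current session already carries the docker group:

```shell
# Report whether the current login session already has the "docker" group.
# Group changes made by usermod only appear in sessions started afterwards.
if id -nG | grep -qw docker; then
    echo "docker group: ok"
else
    echo "docker group: not yet active - log out and back in"
fi
```

If the group is not yet active, logging out and back in (or rebooting) refreshes the session's group list.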
The Nvidia-docker
is required since our pipeline needs GPU support. Please refer to the official installation guideline Install Nvidia-docker on Linux. If you have previously installed Nvidia-docker, please skip this step. The following commands can be used to install the Nvidia-docker on the popular Ubuntu Linux system.
sudo apt update
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-docker.gpg
distribution=$(. /etc/os-release; echo ${ID}${VERSION_ID})
curl -sL https://nvidia.github.io/libnvidia-container/${distribution}/libnvidia-container.list | sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-docker.gpg] https://#g' | sudo tee /etc/apt/sources.list.d/nvidia-docker.list
sudo apt update
sudo apt install -y nvidia-docker2
After the installation, please restart the Docker daemon so that the new default runtime takes effect:
sudo systemctl restart docker
Finally, you can check whether you have successfully installed the Nvidia-docker
using the following command:
docker run --rm --gpus=all nvidia/cuda:9.0-base nvidia-smi
If it succeeds, the output should show the information of the GPU card(s) on your PC.
Run docker pull ibeatgroup/ibeat_v2:release_version_tag, where release_version_tag is the container tag. Currently, the latest container tag is release208, thus run docker pull ibeatgroup/ibeat_v2:release208 to download the pipeline.
After downloading, you can use docker images
to see the container images you have downloaded.
The container is totally free. Please first register on the iBEAT V2.0 Cloud to get a free license.
Before running the pipeline, please put the license you obtained into your data folder (e.g., your_data_folder), which will be mounted to the container. The container will load the data and license for processing from this directory and also save the processed results to this directory. After that, you can run the pipeline with the following example code:
docker run --gpus=all --rm -it -v /your_data_folder:/InfantData --user $(id -u):$(id -g) ibeatgroup/ibeat_v2:release100 --t1 <t1_file_path> --t2 <t2_file_path> --age <age_in_month> --out_dir <result_dir> --sub_name <subject_id>
In the above example, you can regard docker run --gpus=all --rm -it -v /your_data_folder:/InfantData --user $(id -u):$(id -g) ibeatgroup/ibeat_v2:release100 as a single Linux command (despite being long), where:
- docker run --gpus=all runs a Docker container with all GPUs. You can change --gpus=all to a specific GPU if you have multiple GPUs installed; for example, --gpus="device=0" will use the first GPU.
- --rm indicates that the container will be removed from memory once it has finished.
- -v mounts the input data folder: /your_data_folder is the directory where you put the data to process and the license, and /InfantData is the internal path inside the container used to locate the data and license.
- --user $(id -u):$(id -g) indicates the container will be run as the provided user (the current Linux user) and the provided group (the current user group).
- ibeatgroup/ibeat_v2:release100 is the container name of the pipeline.

The important parameters are:
- --t1 or --t2: the path of the T1w or T2w infant brain image (relative to your_data_folder, the data folder you plan to mount into the container). Images should be in NIfTI format (.nii). If you only have one modality, just input the available modality (T1w or T2w).
- --age [a]: the infant age at scan (in MONTHS).
- --out_dir [d]: the folder in which to save the results; it will be created if it does not exist. Of note, this folder is rooted in the input data folder (your_data_folder in the above example).
- --sub_name [n]: the subject name for the currently processed subject. If you do not assign one, the pipeline will infer it from your T1 and T2 image names.
You can get detailed parameter information and a simple example command by running the pipeline without any parameters. The user-intervention parameters are:
- --skull_type: how to do the skull stripping. 0: skip; 1: use our model; 2: provide a mask (the mask file is given by the following two parameters). (optional, default: 1)
- --t1_skull_mask: skull mask for T1. (optional, only meaningful when skull_type is set to 2)
- --t2_skull_mask: skull mask for T2. (optional, only meaningful when skull_type is set to 2)
- --cere_type: how to do the cerebellum removal. 0: skip; 1: use our model; 2: provide a mask (the mask file is given by the cere_mask parameter). (optional, default: 1)
- --cere_mask: cerebrum mask file. (optional, only meaningful when cere_type is set to 2)
- --tissue_type: how to do the tissue segmentation. 0: skip; 1: use our model. If 0, the pipeline will only do the surface reconstruction with the input tissue map specified by the tissue_in parameter. (optional, default: 1)
- --tissue_in: the provided tissue map. (optional, only meaningful when tissue_type is set to 0)
- --skip_surface: whether to do the surface reconstruction. 0: skip; 1: do the reconstruction. (optional, default: 1)
- --t1_skull_prob_thresh: the threshold for binarizing the skull-stripping probability map for the T1w modality. You can set this value between 0 and 1 to adjust the skull-stripping mask. (optional, default: 0.5)
- --t2_skull_prob_thresh: the threshold for binarizing the skull-stripping probability map for the T2w modality. You can set this value between 0 and 1 to adjust the skull-stripping mask. (optional, default: 0.9)
- --cerebrum_prob_thresh: the threshold for binarizing the cerebrum probability map. You can set this value between 0 and 1 to adjust the cerebrum mask. (optional, default: 0.5)
- --gpu_id: which GPU will be used: 0, 1, 2, ... (optional, default: 0)

Note: these parameters have been modified since version release200.
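As an illustration of these intervention parameters, the sketch below re-runs the pipeline with a looser T1w skull-stripping threshold. The file name (sub01_T1w.nii), age, and output folder are hypothetical, and run=echo makes it a dry run that only prints the command; remove the run variable to actually execute it:

```shell
# Dry run: "$run" expands to echo, so the command is printed, not executed.
run=echo
$run docker run --gpus=all --rm -it -v /your_data_folder:/InfantData \
    --user "$(id -u):$(id -g)" ibeatgroup/ibeat_v2:release100 \
    --t1 sub01_T1w.nii --age 6 --out_dir results --sub_name sub01 \
    --t1_skull_prob_thresh 0.3
```

Lowering the threshold keeps more voxels in the brain mask; raising it trims the mask, which can help when the default mask includes skull.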
- --skull_mask (removed from version release200): the path of the user-provided brain mask. If there is only one modality (either --t1 or --t2 is used), this is the brain mask for that modality. If two modalities are input (both --t1 and --t2 are used), this is the brain mask for the T1w modality; in that case, if you also want to provide a brain mask for the T2w modality, please use the additional parameter --exmod_skull_mask (see below).
- --exmod_skull_mask (removed from version release200): the path of the user-provided brain mask for the T2w image when both T1w and T2w images are input.
- --cere_mask (removed from version release200): the path of the user-provided cerebrum mask. If there is only one modality, this is the cerebrum mask for that modality; if two modalities are input, this is the cerebrum mask for the T1w modality. Since the T2w image has already been aligned with the T1w image at the cerebellum removal stage, a T2w cerebrum mask is not needed.

nvidia-docker run --rm -it -v /your_data_folder:/InfantData --user $(id -u):$(id -g) ibeatgroup/ibeat_v2:release100 --t1 <t1_file> --t2 <t2_file> --age <age_in_month> --out_dir <result_dir> --sub_name <subject_id>

The above command is a typical example of processing one subject with both T1w and T2w images. You can also input only a single T1w (or T2w) image if you have just one modality.
Since docker run --gpus=all --rm -it -v /your_data_folder:/InfantData --user $(id -u):$(id -g) ibeatgroup/ibeat_v2:release100 can be regarded as a single command, you can also write a script to process data in a batch by treating the pipeline command as a simple command inside a for or while loop. The following is a simple example using a bash script:
for t1_file in t1_file_pattern; do
nvidia-docker run --rm -it -v /your_data_folder:/InfantData --user $(id -u):$(id -g) ibeatgroup/ibeat_v2:release100 --t1 ${t1_file} --age <age_in_month> --out_dir <result_dir> --sub_name <subject_id>
done
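When subjects have different ages at scan, one way to drive such a loop is from a small CSV file. Everything below is a sketch: the ages.csv layout (one "t1_file,age_in_months" pair per line), the file names, and the sub01-style subject naming are all assumptions, and run=echo keeps it a dry run that only prints the commands:

```shell
# Build a hypothetical ages.csv: one "t1_file,age_in_months" pair per line.
printf '%s\n' 'sub01_T1w.nii,3' 'sub02_T1w.nii,6' > ages.csv

run=echo   # dry run: print each command; set run="" to actually execute
while IFS=, read -r t1_file age; do
    sub_name="${t1_file%%_*}"   # e.g. sub01_T1w.nii -> sub01
    $run docker run --gpus=all --rm -v /your_data_folder:/InfantData \
        --user "$(id -u):$(id -g)" ibeatgroup/ibeat_v2:release100 \
        --t1 "${t1_file}" --age "${age}" --out_dir results --sub_name "${sub_name}"
done < ages.csv
```

Keeping the age lookup in a file avoids hard-coding one age for a whole batch, which is easy to get wrong with longitudinal infant data.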
After the processing is finished, all the processing results will be generated in the "mounted" folder your_data_folder. The following explains what the results are:
Does the pipeline require a GPU?
Yes. In the current version, we do need GPU support to run the pipeline. In the future, we will also release a version of the pipeline that only needs a CPU for computation.

Has the pipeline been validated on data from different protocols and scanners?
Yes. We have successfully processed 16,000+ infant brain images with various imaging protocols and scanners from 100+ institutions. Please see https://ibeat.wildapricot.org/Feedbacks.

Is the iBEAT V2.0 Cloud more up to date than the Docker version?
Yes. The iBEAT V2.0 Cloud (http://www.ibeat.cloud) is updated promptly with our latest developments, while the Docker version may be slightly delayed in updating. For optimal performance, iBEAT V2.0 Cloud is highly recommended.
Please cite the following papers if you use the results provided by the iBEAT V2.0 pipeline:
The iBEAT V2.0 software is developed by the University of North Carolina at Chapel Hill:
For questions/bugs/feedback, please contact:
Zhengwang Wu, Ph.D., zhwwu@med.unc.edu
Li Wang, Ph.D., li_wang@med.unc.edu
Gang Li, Ph.D., gang_li@med.unc.edu
Department of Radiology and Biomedical Research Imaging Center
The University of North Carolina at Chapel Hill