TIO-IKIM / CellViT

CellViT: Vision Transformers for Precise Cell Segmentation and Classification
https://doi.org/10.1016/j.media.2024.103143

Training problem: "if scale_min > scale_max: TypeError: '>' not supported between instances of 'NoneType' and 'NoneType'" #67

Closed · LEZHOU26 closed 1 month ago

LEZHOU26 commented 1 month ago

Dear developer,

I was trying to use train_cellvit.yaml to train on the PanNuke dataset, but I ran into the following error:

```
2024-10-02 16:15:03,591 [INFO] - Loaded Adam Optimizer with following hyperparameters:
2024-10-02 16:15:03,591 [INFO] - {'lr': 0.001, 'betas': [0.85, 0.9]}
2024-10-02 16:15:03,591 [INFO] - Using early stopping with a range of 10 and maximize strategy
Traceback (most recent call last):
  File "/home/zhoule/workspace/CellViT/cell_segmentation/run_cellvit.py", line 85, in <module>
    outdir = experiment.run_experiment()
  File "/.../workspace/CellViT/cell_segmentation/experiments/experiment_cellvit_pannuke.py", line 182, in run_experiment
    train_transforms, val_transforms = self.get_transforms(
  File "/.../workspace/CellViT/cell_segmentation/experiments/experiment_cellvit_pannuke.py", line 710, in get_transforms
    A.Downscale(p=p, scale_max=scale, scale_min=scale)
  File "/.../anaconda3/envs/cellvit_env/lib/python3.9/site-packages/albumentations/augmentations/transforms.py", line 1571, in __init__
    if scale_min > scale_max:
TypeError: '>' not supported between instances of 'NoneType' and 'NoneType'
wandb: 🚀 View run 2024-10-02T161451_CellViT-SAM-H-Fold-1 at: https://wandb.ai/le-zhou-oklahoma-state-university/Cell-Segmentation/runs/1k8118j6
```
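The immediate mechanism is visible in the last two frames: `A.Downscale` is constructed with `scale_min=None` and `scale_max=None`, and comparing two `None` values with `>` raises exactly this `TypeError`. So the `scale` value read from the config never got set (PyYAML loads an empty value like `scale:` as `None`). A minimal sketch of a defensive lookup that would keep `None` from reaching the transform; the helper name and config keys here are hypothetical illustrations, not CellViT's actual code:

```python
def resolve_downscale_scale(transform_settings, default=0.5):
    """Fetch a downscale scale from a possibly sparse config dict.

    Returns the configured value, or `default` when the section or key
    is absent, or when the key is present but null in the YAML (which
    PyYAML loads as None).
    """
    downscale = (transform_settings or {}).get("downscale") or {}
    scale = downscale.get("scale")
    return default if scale is None else scale


print(resolve_downscale_scale({"downscale": {"scale": 0.3}}))   # 0.3
print(resolve_downscale_scale({"downscale": {"scale": None}}))  # 0.5
print(resolve_downscale_scale({}))                              # 0.5
```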

Here is the YAML file I used:

```yaml
logging:
  mode: online
  project: Cell-Segmentation
  notes: CellViT-SAM-H
  log_comment: CellViT-SAM-H-Fold-1
  tags:

# seeding
random_seed: 19

# hardware
gpu: 0

# setting paths and dataset
data:
  dataset: PanNuke
  dataset_path: /home/zhoule/workspace/CellViT/configs/datasets/PanNuke
  train_folds:

# model options
model:
  backbone: SAM-H
  pretrained_encoder: /home/zhoule/workspace/CellViT/models/pretrained/sam_vit_h.pth
  pretrained:
  embed_dim: 1280
  input_channels:
  depth: 32
  num_heads: 16
  extract_layers: 10
  shared_decoders:
  regression_loss:

loss:
  nuclei_binary_map:
    bce:
      loss_fn: xentropy_loss
      weight: 1
    dice:
      loss_fn: dice_loss
      weight: 1
    focaltverskyloss:
      loss_fn: FocalTverskyLoss
      weight: 1

# training options
training:
  batch_size: 16
  epochs: 30
  unfreeze_epoch: 25
  drop_rate: 0
  attn_drop_rate: 0.1
  drop_path_rate: 0.1
  optimizer: Adam
  optimizer_hyperparameter:
    lr: 0.001
    betas: [0.85, 0.9]
  early_stopping_patience: 10
  scheduler:
    scheduler_type: exponential
    hyperparameters:
      gamma: 0.85
  sampling_strategy: cell+tissue
  sampling_gamma: 1
  mixed_precision: true
  eval_every: 1
```
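One thing that stands out: the config above contains no augmentation section, yet the crash happens while `get_transforms` builds the augmentation pipeline, so the downscale scale it looks up comes back as `None`. If the training config supports a transformations block (as the `get_transforms` call suggests), a sketch of what supplying the downscale settings might look like follows; the section name, keys, and values below are assumptions inferred from the traceback (`A.Downscale` taking a probability `p` and a `scale`), so verify them against the repository's sample configs:

```yaml
# Hypothetical augmentation section; key names inferred from the traceback,
# not taken from the CellViT config schema.
transformations:
  downscale:
    p: 0.15
    scale: 0.5
```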

Could you please give some suggestions about the potential cause of this error? Thank you.