Closed · Transformer-man closed this issue 1 year ago
How can I run this with less GPU memory?
During training or inference? The code is optimized to run on a 48 GB NVIDIA device. For training, reduce the batch size. For inference, you need to reduce the inference batch size in cell_detection.py. However, we are currently improving inference performance and will also add a batch-size parameter for smaller GPUs.
```python
wsi_inference_dataloader = DataLoader(
    dataset=wsi_inference_dataset,
    batch_size=8,   # lower this value to reduce GPU memory usage
    num_workers=16,
    shuffle=False,
    collate_fn=wsi_inference_dataset.collate_batch,
    pin_memory=False,
)
```
The batch-size parameter has been added. Please check the new inference parameters in the CLI. Using a batch size of 2 or 4 should work on your device.
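As a rough rule of thumb for choosing that value, you can scale the reference configuration (batch size 8 on a 48 GB card) linearly with your card's memory. A minimal sketch; the helper name is hypothetical and this is only a heuristic, not a guarantee against out-of-memory errors:

```python
import math

def suggest_batch_size(gpu_mem_gb: float,
                       ref_mem_gb: float = 48.0,
                       ref_batch: int = 8) -> int:
    """Scale the reference batch size linearly with available GPU memory,
    rounded down to the nearest power of two (minimum 1).

    Hypothetical helper: the 48 GB / batch-size-8 reference comes from the
    DataLoader snippet above; actual memory usage depends on the model
    and input size.
    """
    scaled = ref_batch * gpu_mem_gb / ref_mem_gb
    if scaled < 1:
        return 1
    return 2 ** int(math.log2(scaled))

# A 12 GB card suggests batch size 2, a 24 GB card suggests 4,
# matching the sizes recommended above.
```

This is only a starting point; if inference still runs out of memory, drop to the next smaller power of two.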