Rapisurazurite / CamoDiffusion

This repository is the official implementation of CamoDiffusion: Camouflaged Object Detection via Conditional Diffusion Models.

Are the training sets in the five datasets COD10K, CAMO, CHAMELEON, NC4K, and CDS2K combined for training? #4

Closed FriedaSmith closed 7 months ago

FriedaSmith commented 7 months ago

Hi, according to the code of train.py and the file config/dataset_352x352.yaml, are the training sets in the five datasets COD10K, CAMO, CHAMELEON, NC4K, and CDS2K combined for training?

    def get_loader(cfg):
        train_dataset = instantiate_from_config(cfg.train_dataset)
        train_loader = DataLoader(
            train_dataset,
            batch_size=cfg.batch_size,
            shuffle=True,
            num_workers=cfg.num_workers)

        test_dataset = instantiate_from_config(cfg.test_dataset.CAMO)
        test_dataset_expand = SampleDataset(full_dataset=instantiate_from_config(cfg.test_dataset.COD10K), interval=10)
        test_dataset = torch.utils.data.ConcatDataset([test_dataset, test_dataset_expand])
        test_dataset_expand = SampleDataset(full_dataset=instantiate_from_config(cfg.test_dataset.NC4K), interval=30)
        test_dataset = torch.utils.data.ConcatDataset([test_dataset, test_dataset_expand])

        test_loader = DataLoader(
            test_dataset,
            batch_size=cfg.batch_size,
            collate_fn=collate
        )
        return train_loader, test_loader
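From the code above, `SampleDataset(full_dataset=..., interval=N)` appears to keep every N-th image of the wrapped dataset so the validation pass over COD10K and NC4K stays cheap. The real class lives in the CamoDiffusion repo; this is a minimal sketch of the assumed behavior, using a plain list as a stand-in dataset:

```python
class SampleDataset:
    """Assumed behavior: expose every `interval`-th item of `full_dataset`."""

    def __init__(self, full_dataset, interval):
        self.full_dataset = full_dataset
        # Precompute the indices that survive the subsampling.
        self.indices = list(range(0, len(full_dataset), interval))

    def __len__(self):
        return len(self.indices)

    def __getitem__(self, idx):
        return self.full_dataset[self.indices[idx]]


# Stand-in for the 2,026-image COD10K test split used at interval=10.
cod10k_te = list(range(2026))
subset = SampleDataset(cod10k_te, interval=10)
```

Under this assumption, `interval=10` reduces the 2,026 COD10K test images to 203 samples per validation run.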
Rapisurazurite commented 7 months ago

Like other COD models, the training dataset is a combination of COD10K and CAMO.

FriedaSmith commented 7 months ago

Is CAMO a combination of CAMO-COCO-V.1.0-CVIU2019/Camouflage and CAMO-V.1.0-CVIU2019? In addition, does COD10K only use 5,066 camouflaged images?

Rapisurazurite commented 7 months ago

The dataset used to train our model can be downloaded from https://github.com/GewelsJI/DGNet/tree/main/lib_pytorch. COD10K comprises COD10K-Tr (3,040 images) and COD10K-Te (2,026 images). We utilize a total of 4,040 images for training the model: 3,040 from COD10K-Tr and 1,000 from CAMO-Tr.
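The composition above (3,040 COD10K-Tr + 1,000 CAMO-Tr = 4,040 training images) can be sketched with a ConcatDataset-style wrapper. The file names below are illustrative placeholders, not the real directory layout, and the class mimics `torch.utils.data.ConcatDataset` with plain lists:

```python
class ConcatDataset:
    """Chain several datasets end to end, as torch.utils.data.ConcatDataset does."""

    def __init__(self, datasets):
        self.datasets = list(datasets)

    def __len__(self):
        return sum(len(d) for d in self.datasets)

    def __getitem__(self, idx):
        # Walk the datasets in order until idx falls inside one of them.
        for d in self.datasets:
            if idx < len(d):
                return d[idx]
            idx -= len(d)
        raise IndexError(idx)


# Placeholder entries standing in for the two training splits.
cod10k_tr = [f"COD10K-Tr/img_{i}.jpg" for i in range(3040)]
camo_tr = [f"CAMO-Tr/img_{i}.jpg" for i in range(1000)]
train_set = ConcatDataset([cod10k_tr, camo_tr])
```

Indexing past the first 3,040 items transparently falls through to the CAMO-Tr portion, which matches how a combined loader iterates both splits as one dataset of 4,040 images.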

FriedaSmith commented 7 months ago

Your help was very much appreciated.