NielsRogge / Transformers-Tutorials

This repository contains demos I made with the Transformers library by HuggingFace.

Fine-tuning OneFormer for Semantic Segmentation: how to change the model's number of classes #387

Closed eBeyzaG closed 4 months ago

eBeyzaG commented 5 months ago

I am fine-tuning OneFormer for semantic segmentation on a dataset with 5 classes, including background. During training I also compute mIoU, for which I extract segmentation maps. But since the model config has 150 classes, the model outputs class ids that do not exist in my dataset. I also tried changing the num_labels parameter as shown below:

from transformers import OneFormerForUniversalSegmentation

model = OneFormerForUniversalSegmentation.from_pretrained(
    "shi-labs/oneformer_ade20k_swin_tiny",
    num_labels=5,
    ignore_mismatched_sizes=True,
)

But the model raises the following error:

    446 # [batch_size, hidden_dim]
    447 image_queries = nn.functional.normalize(image_queries.flatten(1), dim=-1)
--> 448 text_queries = nn.functional.normalize(text_queries.flatten(1), dim=-1)
    450 logit_scale = torch.clamp(self.logit_scale.exp(), max=100)
    452 logits_per_text = torch.matmul(text_queries, image_queries.t()) * logit_scale

AttributeError: 'NoneType' object has no attribute 'flatten'

How can I change the class count while fine-tuning?
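
For reference, the maps I use for mIoU come from the processor's semantic post-processing, roughly like this (a sketch; the image path is a placeholder and the checkpoint is the one from the snippet above):

from PIL import Image
from transformers import OneFormerProcessor, OneFormerForUniversalSegmentation

checkpoint = "shi-labs/oneformer_ade20k_swin_tiny"
processor = OneFormerProcessor.from_pretrained(checkpoint)
model = OneFormerForUniversalSegmentation.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image path

# OneFormer is task-conditioned, so the task has to be passed explicitly.
inputs = processor(images=image, task_inputs=["semantic"], return_tensors="pt")
outputs = model(**inputs)

# Combine mask and class predictions into a (height, width) map of class ids.
# With the stock ADE20K checkpoint the ids range over all 150 classes,
# which is where the labels outside my 5-class dataset come from.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]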

BhavanaMP commented 4 months ago

@eBeyzaG I guess you have to pass your classes' id2label and label2id mappings to the model while instantiating it.

Something like this:

model = AutoModelForUniversalSegmentation.from_pretrained(
    model_ckpt,
    is_training=is_train,
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,
)
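
As a fuller sketch with placeholder class names for a 5-class dataset (passing id2label also sets num_labels, and ignore_mismatched_sizes lets the classification head be reinitialized at the new size):

from transformers import AutoModelForUniversalSegmentation

# Placeholder names for the 5 classes (background + 4 foreground classes).
classes = ["background", "class1", "class2", "class3", "class4"]
id2label = dict(enumerate(classes))
label2id = {name: i for i, name in id2label.items()}

model = AutoModelForUniversalSegmentation.from_pretrained(
    "shi-labs/oneformer_ade20k_swin_tiny",
    is_training=True,              # compute text queries during fine-tuning
    id2label=id2label,             # also sets num_labels to 5
    label2id=label2id,
    ignore_mismatched_sizes=True,  # reinitialize the 150-class head at 5 classes
)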

eBeyzaG commented 4 months ago

Thank you for answering. While trying your suggestion I realized the real problem: I hadn't specified is_training=True when instantiating the model, so the text queries were never computed. That was what caused the error.
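
In case it helps anyone else: once training is done, the text-query branch can be switched off again for inference. A sketch (the model.model.is_training attribute matches the current OneFormer modeling code, but verify against your installed transformers version):

import torch

# After fine-tuning, disable the contrastive text-query branch for inference.
# Assumption: model, processor, inputs, and image are the objects from the
# snippets above.
model.model.is_training = False
model.eval()

with torch.no_grad():
    outputs = model(**inputs)

# Post-processing now yields ids only in the 5-class label space.
pred_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]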