microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI
MIT License
19.65k stars · 2.51k forks

got an unexpected keyword argument 'distilled' #898

Open sssimpleboy opened 1 year ago

sssimpleboy commented 1 year ago

Describe the bug
Model I am using (UniLM, MiniLM, LayoutLM ...):

The problem arises when using:

Loading the model fails with "got an unexpected keyword argument 'distilled'" (full traceback shown in the attached screenshots).

To Reproduce
Steps to reproduce the behavior: (see screenshots)

Expected behavior
The model should load without raising the unexpected-keyword-argument error.

sssimpleboy commented 1 year ago

(additional screenshots of the error)

stevewillson commented 1 year ago

The parameters distilled and pretrained_cfg end up in the **kwargs dict but are not accepted by VisionTransformer's __init__, so they must be removed before kwargs is forwarded.

I added the following right before the super().__init__(*args, **kwargs) call, so the __init__ of class AdaptedVisionTransformer(VisionTransformer) now begins like this:

# Pop the unexpected keys before forwarding kwargs to the base class;
# the defaults avoid a KeyError when a key happens to be absent.
self.distilled = kwargs.pop('distilled', False)
self.pretrained_cfg = kwargs.pop('pretrained_cfg', None)
super().__init__(*args, **kwargs)
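To see the pattern in isolation, here is a minimal runnable sketch of the workaround. The VisionTransformer below is a stub standing in for the real class (its constructor signature is an assumption for illustration); only the pop-before-super pattern is the point.

```python
# Stub standing in for the real VisionTransformer: it rejects any
# keyword argument it does not declare, which reproduces the
# "unexpected keyword argument" TypeError from the issue.
class VisionTransformer:
    def __init__(self, img_size=224, embed_dim=768):
        self.img_size = img_size
        self.embed_dim = embed_dim


class AdaptedVisionTransformer(VisionTransformer):
    def __init__(self, *args, **kwargs):
        # Remove the keys the base __init__ does not accept before
        # forwarding kwargs; defaults avoid KeyError if a key is absent.
        self.distilled = kwargs.pop('distilled', False)
        self.pretrained_cfg = kwargs.pop('pretrained_cfg', None)
        super().__init__(*args, **kwargs)


# Without the pops, passing distilled/pretrained_cfg would raise
# TypeError; with them, construction succeeds and the values are kept.
model = AdaptedVisionTransformer(img_size=384,
                                 distilled=True,
                                 pretrained_cfg={'url': ''})
print(model.img_size, model.distilled)  # prints: 384 True
```

The same approach generalizes: any wrapper subclass can strip factory-injected keyword arguments it wants to keep for itself before delegating to a stricter parent constructor.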
sssimpleboy commented 1 year ago


Thank you.