chmxu / eTT_TMLR2022


Training commands #5

Closed ppalantir closed 1 year ago

ppalantir commented 1 year ago

Hello. Could you please provide the training commands for reproduction? Thanks a lot!

chmxu commented 1 year ago

Hi! Do you mean the code for the DINO pretraining phase?

ppalantir commented 1 year ago

Yes. If possible, I'd also like the test-time tuning command. Thanks!

chmxu commented 1 year ago

For DINO pretraining you can refer to pretrain_code_snippet and https://github.com/facebookresearch/dino. By replacing the corresponding dataset code and using main_dino_metadataset.py instead of main_dino.py in the original DINO repo, you can pretrain the DINO model required by our pipeline.
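For example, assuming main_dino_metadataset.py keeps the command-line interface of the original main_dino.py (the paths below are placeholders), the launch would look roughly like:

```shell
# Sketch only: flags follow the original DINO repo's README; adjust
# --nproc_per_node, --arch, and the placeholder paths to your setup.
python -m torch.distributed.launch --nproc_per_node=8 main_dino_metadataset.py \
  --arch vit_small \
  --data_path /path/to/metadataset/train \
  --output_dir /path/to/output_dir
```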

For testing, python test_extractor_pa_vit_prefix.py --data.test [DATASET_NAME] works. Hope this solves your problem.

ppalantir commented 1 year ago

Thanks! I downloaded the weights from the link and tried the command python test_extractor_pa_vit_prefix.py --data.test ilsvrc_2012 omniglot aircraft cu_birds dtd quickdraw fungi vgg_flower, but it reports the following errors:

_IncompatibleKeys(missing_keys=['blocks.0.adapter_1.offset', 'blocks.0.adapter_2.offset', 'blocks.1.adapter_1.offset', 'blocks.1.adapter_2.offset', 'blocks.2.adapter_1.offset', 'blocks.2.adapter_2.offset', 'blocks.3.adapter_1.offset', 'blocks.3.adapter_2.offset', 'blocks.4.adapter_1.offset', 'blocks.4.adapter_2.offset', 'blocks.5.adapter_1.offset', 'blocks.5.adapter_2.offset', 'blocks.6.adapter_1.offset', 'blocks.6.adapter_2.offset', 'blocks.7.adapter_1.offset', 'blocks.7.adapter_2.offset', 'blocks.8.adapter_1.offset', 'blocks.8.adapter_2.offset', 'blocks.9.adapter_1.offset', 'blocks.9.adapter_2.offset', 'blocks.10.adapter_1.offset', 'blocks.10.adapter_2.offset', 'blocks.11.adapter_1.offset', 'blocks.11.adapter_2.offset'], unexpected_keys=[])

and

File "mtrand.pyx", line 920, in numpy.random.mtrand.RandomState.choice ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (15,) + inhomogeneous part.

Have you encountered such a problem before?

ppalantir commented 1 year ago

I searched and found that the error File "mtrand.pyx", line 920, in numpy.random.mtrand.RandomState.choice results from the numpy version. I don't know whether the _IncompatibleKeys(missing_keys=['blocks.0.adapter_1.offset', ...]) message is a problem.
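For reference, here is a hypothetical minimal reproduction of the numpy error (not code from this repo; it assumes the episode sampler passes a ragged list of per-class indices to choice):

```python
import numpy as np

# Sublists of different lengths, like per-class candidate indices.
ragged = [[0, 1], [2, 3, 4]]

# numpy >= 1.24 raises "ValueError: setting an array element with a
# sequence ... inhomogeneous shape" because implicit ragged-array
# creation was removed; numpy <= 1.23 builds a 1-D object array
# (with a deprecation warning) and samples one sublist.
np.random.RandomState(0).choice(ragged)
```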

It takes 8-10 s per task during the test, so 600 tasks take more than one hour. Is this expected?

chmxu commented 1 year ago

Have you modified the code? I assume this message is caused by the additional DRA parameters missing from the initial DINO pretrained weights. But the original version of the weight-loading code sets strict=False (link), so the missing keys should be tolerated.
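As a toy illustration of that behaviour (not the repo's exact code): with strict=False, missing keys are reported in the returned _IncompatibleKeys instead of raising, and the absent parameters simply keep their fresh initialization.

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)
        # Stand-in for a DRA adapter offset missing from the checkpoint.
        self.adapter_offset = nn.Parameter(torch.zeros(4))

# A "pretrained" state dict that lacks the adapter parameter.
pretrained = {"fc.weight": torch.zeros(4, 4), "fc.bias": torch.zeros(4)}
result = Toy().load_state_dict(pretrained, strict=False)
print(result.missing_keys)     # ['adapter_offset'] -- reported, not raised
print(result.unexpected_keys)  # []
```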

ppalantir commented 1 year ago

I didn't modify the code. Is the version of the weights in the link right? I downloaded the weights from this link. I checked the weights and couldn't find the DRA parameters.

model['student'].keys() odict_keys(['module.backbone.cls_token', 'module.backbone.pos_embed', 'module.backbone.patch_embed.proj.weight', 'module.backbone.patch_embed.proj.bias', 'module.backbone.blocks.0.norm1.weight', 'module.backbone.blocks.0.norm1.bias', 'module.backbone.blocks.0.attn.qkv.weight', 'module.backbone.blocks.0.attn.qkv.bias', 'module.backbone.blocks.0.attn.proj.weight', 'module.backbone.blocks.0.attn.proj.bias', 'module.backbone.blocks.0.norm2.weight', 'module.backbone.blocks.0.norm2.bias', 'module.backbone.blocks.0.mlp.fc1.weight', 'module.backbone.blocks.0.mlp.fc1.bias', 'module.backbone.blocks.0.mlp.fc2.weight', 'module.backbone.blocks.0.mlp.fc2.bias', 'module.backbone.blocks.1.norm1.weight', 'module.backbone.blocks.1.norm1.bias', 'module.backbone.blocks.1.attn.qkv.weight', 'module.backbone.blocks.1.attn.qkv.bias', 'module.backbone.blocks.1.attn.proj.weight', 'module.backbone.blocks.1.attn.proj.bias', 'module.backbone.blocks.1.norm2.weight', 'module.backbone.blocks.1.norm2.bias', 'module.backbone.blocks.1.mlp.fc1.weight', 'module.backbone.blocks.1.mlp.fc1.bias', 'module.backbone.blocks.1.mlp.fc2.weight', 'module.backbone.blocks.1.mlp.fc2.bias', 'module.backbone.blocks.2.norm1.weight', 'module.backbone.blocks.2.norm1.bias', 'module.backbone.blocks.2.attn.qkv.weight', 'module.backbone.blocks.2.attn.qkv.bias', 'module.backbone.blocks.2.attn.proj.weight', 'module.backbone.blocks.2.attn.proj.bias', 'module.backbone.blocks.2.norm2.weight', 'module.backbone.blocks.2.norm2.bias', 'module.backbone.blocks.2.mlp.fc1.weight', 'module.backbone.blocks.2.mlp.fc1.bias', 'module.backbone.blocks.2.mlp.fc2.weight', 'module.backbone.blocks.2.mlp.fc2.bias', 'module.backbone.blocks.3.norm1.weight', 'module.backbone.blocks.3.norm1.bias', 'module.backbone.blocks.3.attn.qkv.weight', 'module.backbone.blocks.3.attn.qkv.bias', 'module.backbone.blocks.3.attn.proj.weight', 'module.backbone.blocks.3.attn.proj.bias', 'module.backbone.blocks.3.norm2.weight', 'module.backbone.blocks.3.norm2.bias', 'module.backbone.blocks.3.mlp.fc1.weight', 'module.backbone.blocks.3.mlp.fc1.bias', 'module.backbone.blocks.3.mlp.fc2.weight', 'module.backbone.blocks.3.mlp.fc2.bias', 'module.backbone.blocks.4.norm1.weight', 'module.backbone.blocks.4.norm1.bias', 'module.backbone.blocks.4.attn.qkv.weight', 'module.backbone.blocks.4.attn.qkv.bias', 'module.backbone.blocks.4.attn.proj.weight', 'module.backbone.blocks.4.attn.proj.bias', 'module.backbone.blocks.4.norm2.weight', 'module.backbone.blocks.4.norm2.bias', 'module.backbone.blocks.4.mlp.fc1.weight', 'module.backbone.blocks.4.mlp.fc1.bias', 'module.backbone.blocks.4.mlp.fc2.weight', 'module.backbone.blocks.4.mlp.fc2.bias', 'module.backbone.blocks.5.norm1.weight', 'module.backbone.blocks.5.norm1.bias', 'module.backbone.blocks.5.attn.qkv.weight', 'module.backbone.blocks.5.attn.qkv.bias', 'module.backbone.blocks.5.attn.proj.weight', 'module.backbone.blocks.5.attn.proj.bias', 'module.backbone.blocks.5.norm2.weight', 'module.backbone.blocks.5.norm2.bias', 'module.backbone.blocks.5.mlp.fc1.weight', 'module.backbone.blocks.5.mlp.fc1.bias', 'module.backbone.blocks.5.mlp.fc2.weight', 'module.backbone.blocks.5.mlp.fc2.bias', 'module.backbone.blocks.6.norm1.weight', 'module.backbone.blocks.6.norm1.bias', 'module.backbone.blocks.6.attn.qkv.weight', 'module.backbone.blocks.6.attn.qkv.bias', 'module.backbone.blocks.6.attn.proj.weight', 'module.backbone.blocks.6.attn.proj.bias', 'module.backbone.blocks.6.norm2.weight', 'module.backbone.blocks.6.norm2.bias', 
'module.backbone.blocks.6.mlp.fc1.weight', 'module.backbone.blocks.6.mlp.fc1.bias', 'module.backbone.blocks.6.mlp.fc2.weight', 'module.backbone.blocks.6.mlp.fc2.bias', 'module.backbone.blocks.7.norm1.weight', 'module.backbone.blocks.7.norm1.bias', 'module.backbone.blocks.7.attn.qkv.weight', 'module.backbone.blocks.7.attn.qkv.bias', 'module.backbone.blocks.7.attn.proj.weight', 'module.backbone.blocks.7.attn.proj.bias', 'module.backbone.blocks.7.norm2.weight', 'module.backbone.blocks.7.norm2.bias', 'module.backbone.blocks.7.mlp.fc1.weight', 'module.backbone.blocks.7.mlp.fc1.bias', 'module.backbone.blocks.7.mlp.fc2.weight', 'module.backbone.blocks.7.mlp.fc2.bias', 'module.backbone.blocks.8.norm1.weight', 'module.backbone.blocks.8.norm1.bias', 'module.backbone.blocks.8.attn.qkv.weight', 'module.backbone.blocks.8.attn.qkv.bias', 'module.backbone.blocks.8.attn.proj.weight', 'module.backbone.blocks.8.attn.proj.bias', 'module.backbone.blocks.8.norm2.weight', 'module.backbone.blocks.8.norm2.bias', 'module.backbone.blocks.8.mlp.fc1.weight', 'module.backbone.blocks.8.mlp.fc1.bias', 'module.backbone.blocks.8.mlp.fc2.weight', 'module.backbone.blocks.8.mlp.fc2.bias', 'module.backbone.blocks.9.norm1.weight', 'module.backbone.blocks.9.norm1.bias', 'module.backbone.blocks.9.attn.qkv.weight', 'module.backbone.blocks.9.attn.qkv.bias', 'module.backbone.blocks.9.attn.proj.weight', 'module.backbone.blocks.9.attn.proj.bias', 'module.backbone.blocks.9.norm2.weight', 'module.backbone.blocks.9.norm2.bias', 'module.backbone.blocks.9.mlp.fc1.weight', 'module.backbone.blocks.9.mlp.fc1.bias', 'module.backbone.blocks.9.mlp.fc2.weight', 'module.backbone.blocks.9.mlp.fc2.bias', 'module.backbone.blocks.10.norm1.weight', 'module.backbone.blocks.10.norm1.bias', 'module.backbone.blocks.10.attn.qkv.weight', 'module.backbone.blocks.10.attn.qkv.bias', 'module.backbone.blocks.10.attn.proj.weight', 'module.backbone.blocks.10.attn.proj.bias', 'module.backbone.blocks.10.norm2.weight', 'module.backbone.blocks.10.norm2.bias', 'module.backbone.blocks.10.mlp.fc1.weight', 'module.backbone.blocks.10.mlp.fc1.bias', 'module.backbone.blocks.10.mlp.fc2.weight', 'module.backbone.blocks.10.mlp.fc2.bias', 'module.backbone.blocks.11.norm1.weight', 'module.backbone.blocks.11.norm1.bias', 'module.backbone.blocks.11.attn.qkv.weight', 'module.backbone.blocks.11.attn.qkv.bias', 'module.backbone.blocks.11.attn.proj.weight', 'module.backbone.blocks.11.attn.proj.bias', 'module.backbone.blocks.11.norm2.weight', 'module.backbone.blocks.11.norm2.bias', 'module.backbone.blocks.11.mlp.fc1.weight', 'module.backbone.blocks.11.mlp.fc1.bias', 'module.backbone.blocks.11.mlp.fc2.weight', 'module.backbone.blocks.11.mlp.fc2.bias', 'module.backbone.norm.weight', 'module.backbone.norm.bias', 'module.head.mlp.0.weight', 'module.head.mlp.0.bias', 'module.head.mlp.2.weight', 'module.head.mlp.2.bias', 'module.head.mlp.4.weight', 'module.head.mlp.4.bias', 'module.head.last_layer.weight_g', 'module.head.last_layer.weight_v'])
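For reference, I inspected the checkpoint roughly like this (the path is a placeholder for the downloaded file):

```python
import torch

ckpt = torch.load("checkpoint.pth", map_location="cpu")  # placeholder path
student = ckpt["student"]
print(len(student))  # number of weights in the student branch
# No APT/DRA parameters show up in the pretrained weights:
print([k for k in student if "adapter" in k or "prompt" in k])  # -> []
```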

chmxu commented 1 year ago

It's true that the pretrained weights don't contain the additional learnable parameters introduced by APT and DRA, but that can't cause the error. Please provide the complete error log so that I can check the problem.

ppalantir commented 1 year ago

Sorry, it starts running after I downgraded the numpy version. It takes around 1 hour per dataset, and the current results are: ilsvrc_2012: test_acc 71.93%, omniglot: test_acc 80.32%, aircraft: test_acc 81.73%.

chmxu commented 1 year ago

Cool, these are even better than the reported ones. Would you mind sharing the numpy version you are using so that I can update the requirements in the README?

ppalantir commented 1 year ago

Sure, I use numpy 1.23.0.
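In case anyone else hits the same error, pinning the version I have works for me, e.g.:

```shell
# Pin the numpy version reported above (newer numpy, 1.24+, turns the
# ragged-array creation in the sampler into a hard ValueError).
pip install "numpy==1.23.0"
```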

chmxu commented 1 year ago

OK. If you don't have further problems, I'll close this thread.