kerrj / lerf

Code for LERF: Language Embedded Radiance Fields
https://www.lerf.io/
MIT License

How to load the torch models from the internet when the network connection is bad? #47

Open elenacliu opened 1 year ago

elenacliu commented 1 year ago

Thank you for your great work!

When trying to extract DINO features, we need to load the model from GitHub through this code:

https://github.com/kerrj/lerf/blob/3b2cb902ea348cb6abf0cc02511ec0f4a0e38c09/lerf/data/utils/dino_extractor.py#L67-L82

but I cannot find a way to do that without network access. Do you have any alternative methods?
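
As far as I understand, torch.hub keeps cloned repos and downloaded weights in a local cache ($TORCH_HOME/hub, by default ~/.cache/torch/hub). One idea I am considering (not verified) is to pre-populate that cache on a machine with network access and copy it to the server; a rough sketch, assuming the default cache layout, a recent torch version that supports skip_validation, and dino_vitb16 just as an example entry point:

```python
import torch

# Sketch only (not verified): torch.hub caches cloned repos and downloaded
# weights under $TORCH_HOME/hub (default: ~/.cache/torch/hub).

# On a machine WITH network access, this clones facebookresearch/dino into the
# hub cache and downloads the pretrained weights into hub/checkpoints/.
model = torch.hub.load('facebookresearch/dino:main', 'dino_vitb16')

# After copying ~/.cache/torch/hub to the offline server (e.g. with scp -r),
# the same call may be able to reuse the cached clone and weights without
# touching the network; skip_validation avoids the GitHub API check.
model = torch.hub.load('facebookresearch/dino:main', 'dino_vitb16',
                       force_reload=False, skip_validation=True)
```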

elenacliu commented 1 year ago

I tried downloading the corresponding weights from GitHub to my local machine, copying them to the server with scp, and modifying your code to:

```python
import os
import torch
from torchvision.models import vit_b_16

if 'dino' in model_type:
    model_path = '/path/to/ckpt/dino_vitbase16_pretrain_full_checkpoint.pth'
    if os.path.exists(model_path):
        model = vit_b_16(pretrained=True)
        state_dict = torch.load(model_path)
        model.load_state_dict(state_dict)
        model.eval()
    else:
        model = torch.hub.load('facebookresearch/dino:main', model_type)
```

But the backbone does not seem to match the weights; some keys are missing:

[screenshot of the load_state_dict missing-keys error]
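
Possibly relevant to the mismatch: my understanding is that the *_full_checkpoint.pth file stores training state (teacher/student/optimizer) rather than a plain backbone state_dict, and torchvision's vit_b_16 uses different parameter names than DINO's own ViT, so load_state_dict cannot match the keys either way. A rough sketch for inspecting the checkpoint and pulling out the backbone weights (the 'teacher' key and the prefixes are assumptions about the checkpoint layout, not verified):

```python
import torch

ckpt_path = '/path/to/ckpt/dino_vitbase16_pretrain_full_checkpoint.pth'
ckpt = torch.load(ckpt_path, map_location='cpu')

# A *_full_checkpoint.pth file typically holds training state rather than a
# plain state_dict; print the top-level keys to see what is actually inside.
print(ckpt.keys())  # e.g. 'teacher', 'student', 'optimizer', ... (assumption)

# Assumption: the teacher weights are the ones wanted for feature extraction,
# and their keys carry 'module.'/'backbone.' prefixes that need stripping
# before loading into a bare DINO ViT.
teacher = ckpt.get('teacher', ckpt)
teacher = {k.replace('module.', '').replace('backbone.', ''): v
           for k, v in teacher.items()}
```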
elenacliu commented 1 year ago

I also tried torch.hub.load with source='local':

```python
import torch

# I have a hubconf.py and the weights dino_vitbase16_pretrain_full_checkpoint.pth
# under this directory.
model_dir = '~/.cache/torch/checkpoints'
model_name = 'vit_b_16'
model = torch.hub.load(model_dir, model=model_name, source='local')
```

which also ran into an error in dino_extractor.py:

https://github.com/kerrj/lerf/blob/3b2cb902ea348cb6abf0cc02511ec0f4a0e38c09/lerf/data/utils/dino_extractor.py#L128

```
AttributeError: 'VisionTransformer' object has no attribute 'patch_embed'
```
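
My guess (not verified) is that this happens because source='local' here ends up building torchvision's vit_b_16, whose VisionTransformer has no patch_embed attribute, while dino_extractor.py expects DINO's own VisionTransformer. One thing that might work instead is pointing source='local' at a local clone of the facebookresearch/dino repo (its hubconf.py defines the dino_* entry points), building the model with pretrained=False, and loading the backbone-only weights file (dino_vitbase16_pretrain.pth rather than the full checkpoint). The paths below are placeholders:

```python
import torch

# Assumptions: /path/to/dino is a local clone of
# https://github.com/facebookresearch/dino (it ships a hubconf.py with entry
# points such as dino_vitb16), and the backbone-only weights were downloaded
# separately on a machine with network access.
repo_dir = '/path/to/dino'
weights_path = '/path/to/ckpt/dino_vitbase16_pretrain.pth'

# pretrained=False so the hubconf entry point does not try to download weights.
model = torch.hub.load(repo_dir, 'dino_vitb16', source='local', pretrained=False)
model.load_state_dict(torch.load(weights_path, map_location='cpu'))
model.eval()

# DINO's VisionTransformer exposes patch_embed, which dino_extractor.py uses.
print(hasattr(model, 'patch_embed'))  # expected: True
```
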
mini-full commented 9 months ago

Hi, I am having the same problem. Have you fixed it?

elenacliu commented 9 months ago

Sorry, but I haven't fixed it yet.