yuqunw / monopatch_nerf


About processing custom datasets #2

Closed: ywaison closed this issue 1 week ago

ywaison commented 3 months ago

Thank you for sharing the great work, but I ran into the following difficulty when working with a custom dataset. The command line is: python scripts/preprocess_eth3d.py -i data/horse -o output/horse -s data/horse/sparse/0

[screenshot of the resulting error message]

I hope to get your prompt reply, thank you!

yuqunw commented 3 months ago

Hi, thank you for your interest in the work. Based on the error message, it most likely fails to download the pretrained weights of the preprocessing modules. Could you try a different network connection, or download the weights manually and place them locally?
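For the manual route, here is a minimal sketch, assuming the failing module fetches its checkpoint through torch.hub; the URL and cache path below are placeholders, not the actual values used by the preprocessing modules, so substitute the ones from the module's installation file:

```python
import os
import torch

# Placeholder URL and cache path: replace with the actual checkpoint URL and the
# directory that the failing module checks (see the module's installation file).
CKPT_URL = "https://example.com/path/to/pretrained_checkpoint.pth"
CKPT_DIR = os.path.expanduser("~/.cache/torch/hub/checkpoints")
CKPT_PATH = os.path.join(CKPT_DIR, "pretrained_checkpoint.pth")

os.makedirs(CKPT_DIR, exist_ok=True)
if not os.path.exists(CKPT_PATH):
    # Download the file once; afterwards the module should pick it up locally
    # instead of reaching out to the network at preprocessing time.
    torch.hub.download_url_to_file(CKPT_URL, CKPT_PATH)

# Sanity check that the downloaded checkpoint loads.
state = torch.load(CKPT_PATH, map_location="cpu")
print(type(state))
```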

ywaison commented 3 months ago


Thank you for your timely reply. Could you tell me where I can download the pretrained weights, or could you provide a link? Thank you again!

yuqunw commented 3 months ago

Hi, I am not entirely sure which model fails to download, but you can locate it and find the link to the model weights in the corresponding installation file, e.g., the Google Drive id in https://github.com/leejaeyong7/OmnidataModels/blob/main/omnidata/dpt_depth.py or the download link in https://github.com/leejaeyong7/ADE20KSegmenter/blob/main/ade20k_segmenter/segmentation_module.py. Thanks!
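For instance, a hedged sketch of pre-fetching a Google-Drive-hosted checkpoint with gdown (assuming gdown >= 4, which supports the id= argument); the id and output path below are hypothetical, so copy the real ones from the dpt_depth.py file linked above:

```python
import gdown

# Hypothetical values: take the real Google Drive id from
# OmnidataModels/omnidata/dpt_depth.py and the path where that module
# expects to find its checkpoint.
GDRIVE_ID = "REPLACE_WITH_ID_FROM_dpt_depth.py"
OUTPUT_PATH = "checkpoints/omnidata_dpt_depth.pt"

# gdown resolves the Google Drive file id and saves it to OUTPUT_PATH,
# so the preprocessing script no longer needs to download it at runtime.
gdown.download(id=GDRIVE_ID, output=OUTPUT_PATH, quiet=False)
```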