chenttt2001 opened 2 days ago
When publishing the code to GitHub, I deleted some unused parts. That is why the parameters do not match.
Okay, I have adjusted the parameters in base.yaml so that most of the model's structure matches the parameter file, but there are still some MLP parameters that I couldn't find in the pretrained weights, as shown in the image below. Could you please share the pretrained weights that include these parameters?
In the fine-tuning stage of downstream tasks, only the plain transformer structure is used, so the pre-trained weights of those parameters can be ignored during fine-tuning.
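In PyTorch this kind of partial loading is usually done with `load_state_dict(strict=False)`, which keeps the matching transformer weights and simply reports the extra pre-training heads instead of raising an error. A minimal sketch, assuming a PyTorch checkpoint (the module names `encoder` and `img_proj` here are illustrative stand-ins, not the repo's actual names):

```python
import torch.nn as nn

# Toy stand-ins: the "pre-trained" model has an extra projection MLP
# that the fine-tuning model does not use (names are illustrative).
class PretrainModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(16, 16)   # shared transformer part
        self.img_proj = nn.Linear(16, 8)   # extra head, unused downstream

class FinetuneModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(16, 16)   # plain transformer only

ckpt = PretrainModel().state_dict()
model = FinetuneModel()

# strict=False loads every key whose name and shape match, and returns
# the leftovers instead of raising an error on the unused parameters.
result = model.load_state_dict(ckpt, strict=False)
print("ignored checkpoint keys:", result.unexpected_keys)
```

The returned `unexpected_keys` are exactly the checkpoint entries (the extra MLP heads) that the fine-tuning model has no slot for, so they can be safely discarded.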
Well, thank you for your help
I'm sorry to bother you again. Could you share the weights that include these parameters? My research direction may need them for some experiments.
Those parameters are not trained by the loss, so they are just random numbers.
Does that include img_proj?
And why are the 3dter parameters I downloaded under "3. Bridge3D Weights" the same as the pointmae weights?
I have corrected the link in the README file. Now you can download the pre-trained 3dter parameters and the correct pre-training weights. Sorry about that.
You can now check the parameters in the pre-trained encoder. I only realized after the last update that the link in the README file was incorrect... Thanks for pointing it out!
You're welcome. Glad I could help
Hello, author! I am encountering an issue where the pretrained model parameters I downloaded do not match the model constructed using the base.yaml configuration file.
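When a checkpoint and a model built from a config disagree like this, a quick way to see exactly which parameters differ is to diff the two state-dict key sets. A generic PyTorch sketch (the toy modules below are placeholders, not the structure defined by base.yaml):

```python
import torch.nn as nn

def diff_state_dicts(model_sd, ckpt_sd):
    """Report keys present on only one side, plus shape mismatches."""
    model_keys, ckpt_keys = set(model_sd), set(ckpt_sd)
    missing    = sorted(model_keys - ckpt_keys)   # in model, not in checkpoint
    unexpected = sorted(ckpt_keys - model_keys)   # in checkpoint, not in model
    mismatched = sorted(k for k in model_keys & ckpt_keys
                        if model_sd[k].shape != ckpt_sd[k].shape)
    return missing, unexpected, mismatched

# Toy demonstration: the "checkpoint" model has one extra layer.
model = nn.Sequential(nn.Linear(4, 4))
ckpt  = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2)).state_dict()
missing, unexpected, mismatched = diff_state_dicts(model.state_dict(), ckpt)
print("only in checkpoint:", unexpected)
```

Running this against the real model and downloaded weights would list exactly which parameter names (e.g. the extra MLP heads) fail to match, which makes it easier to tell a wrong download from a config mismatch.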