-
![image](https://github.com/WennyXY/DINO-MC/assets/83155460/0b7b5699-51b4-4e7c-bd9c-3b8d3bb1fa8f)
Hello author, the pre-trained model you provided seems to be vit_small_8 rather than vit_small_16.
-
Thank you for your work! The pre-trained model link you provided has expired; could you update it? Thank you very much!
-
Thank you for the work! When will pre-trained models for the other datasets be available?
-
Hi,
I briefly looked through your train.py and c3d.py files. I want to know why the accuracy is so high without a pre-trained model; I'm confused. If I switch to the HMDB51 dataset (only 51 classes), ho…
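For context on the HMDB51 part of the question, switching datasets usually only requires resizing the final classification layer. Below is a minimal, hedged sketch in PyTorch; the layer names `fc7`/`fc8` follow the original C3D paper's convention and are assumptions, not necessarily what this repo's c3d.py uses.

```python
# Hedged sketch: adapting a C3D-style classifier head to HMDB51 (51 classes).
# `fc7`/`fc8` are assumed names; check c3d.py for the actual attributes.
import torch
import torch.nn as nn


class C3DHead(nn.Module):
    """Minimal stand-in for the final fully connected layers of a C3D network."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.fc7 = nn.Linear(4096, 4096)
        self.fc8 = nn.Linear(4096, num_classes)  # one logit per action class

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc8(torch.relu(self.fc7(x)))


head = C3DHead(num_classes=51)  # HMDB51 has 51 action classes
logits = head(torch.randn(2, 4096))
print(logits.shape)  # torch.Size([2, 51])
```

When loading a checkpoint trained on a different dataset, the mismatched final layer is typically dropped and re-initialized before fine-tuning.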
-
There is no way to download the pre-trained model.
-
Hello, thank you for your excellent work!
Can you provide the pre-training code?
-
## Description
This issue summarizes some early experiments with supervised pre-training for SC segmentation.
WIP branch: [nk/jv_vit_unetr_ssl](https://github.com/ivadomed/model-seg-dcm/tree/nk/jv_…
-
We would like to evaluate model performance for various LLM fine-tuning approaches and compare them against standard benchmarks. One experiment we would like to try is:
- **Compare the full car…
-
Hi,
I want to pre-train bge-large-en on my own data. Is there a requirement on the length of each {"text": str} entry in the [pre-training](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/…
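For reference, a minimal sketch of producing data in the {"text": str} JSONL shape mentioned above. The file name and the length check are illustrative assumptions, not FlagEmbedding's documented requirements; whether the toolkit enforces a length limit is exactly the open question here.

```python
# Hedged sketch: writing pre-training data as JSON Lines, one {"text": str}
# object per line. The 512-character sanity check below is an assumption for
# illustration, not a documented FlagEmbedding constraint.
import json

docs = [
    "First raw document for domain-adaptive pre-training.",
    "Second document; each line of the JSONL file is one JSON object.",
]

with open("pretrain_data.jsonl", "w", encoding="utf-8") as f:
    for text in docs:
        f.write(json.dumps({"text": text}, ensure_ascii=False) + "\n")

# Sanity check: every line parses and has a non-empty string "text" field.
with open("pretrain_data.jsonl", encoding="utf-8") as f:
    rows = [json.loads(line) for line in f]
assert all(isinstance(r["text"], str) and r["text"] for r in rows)
print(len(rows))  # 2
```

Tokenizers for BERT-style encoders typically truncate long inputs anyway, so very long entries may simply be cut rather than rejected, but that behavior should be confirmed against the repo's own pre-training README.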