-
I can see that in the config file we need to set the path to the pre-trained weights, but I'm not sure where to get them.
Model -> sent1floods11_config.py
Need to set the path to the pretrained backbone weights:
pre…
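For reference, a config of this kind usually just points a string field at a checkpoint on disk; the backbone weights themselves are downloaded separately from the project's release page. A minimal sketch (the key name and file name below are assumptions, not the repository's actual values):

```python
# Hypothetical fragment of a segmentation config: the pretrained
# backbone checkpoint is downloaded separately and referenced here by
# an absolute path. Check the repository's README / model release page
# for the real key name and checkpoint file.
pretrained_weights_path = "/home/user/checkpoints/backbone_pretrained.pt"
```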
-
### Is this a new feature, an improvement, or a change to existing functionality?
Improvement
### How would you describe the priority of this feature request
High
### Please provide a clear descri…
-
## ❓ General Questions
Hello, in the Phi model, the attention and MLP blocks can be executed in parallel because they have no dependency on each other. In the following code, self.mixer and self.mlp can be executed …
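To make the question concrete, here is a toy NumPy sketch of a "parallel" transformer block of the kind GPT-J/Phi-style models use: both branches read the same normalized input, so neither depends on the other's output. The `toy_attention` and `toy_mlp` functions are placeholders, not the real Phi modules:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Standard layer norm over the last axis.
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def toy_attention(h):
    # Placeholder for self.mixer(...); any function of h works here.
    return 0.5 * h

def toy_mlp(h):
    # Placeholder for self.mlp(...).
    return 0.25 * h

def parallel_block(x):
    h = layer_norm(x)
    # The two branches below both consume `h` and are independent of
    # each other, so a runtime could in principle dispatch them
    # concurrently before the single residual sum.
    attn_out = toy_attention(h)
    mlp_out = toy_mlp(h)
    return x + attn_out + mlp_out
```

Because the branches share the input and only meet at the final addition, evaluating them in either order (or concurrently) gives the same result.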
-
I decided to compare nnx vs. equinox for performance and am seeing significant differences (roughly 3x slower for nnx). It could be that I wrote a poor MLP implementation or made a colossal profiling m…
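One common source of skewed JAX benchmarks is timing the first (compiling) call or not forcing asynchronous results to complete. A generic stdlib-only timing harness that avoids both pitfalls might look like this (the `sync` hook and the commented usage names are assumptions for illustration):

```python
import time

def benchmark(fn, *args, warmup=3, iters=100, sync=lambda r: r):
    """Return mean seconds per call of fn(*args) after warmup.

    With JAX-based models (flax nnx or equinox), the first call pays
    jit-compilation cost and results are returned asynchronously, so a
    fair comparison needs (a) warmup iterations and (b) a `sync` hook
    such as `lambda r: r.block_until_ready()` to force completion
    before the clock stops.
    """
    for _ in range(warmup):
        sync(fn(*args))
    start = time.perf_counter()
    for _ in range(iters):
        sync(fn(*args))
    return (time.perf_counter() - start) / iters

# Usage sketch (function names hypothetical):
#   t_nnx = benchmark(nnx_mlp_jit, x, sync=lambda r: r.block_until_ready())
#   t_eqx = benchmark(eqx_mlp_jit, x, sync=lambda r: r.block_until_ready())
```

If the 3x gap survives warmup and `block_until_ready`, the difference is more likely in the model code than in the measurement.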
-
Hello, thanks for sharing this great work!
I want to train my own network following the tutorial in ./training/README.md, but I ran into some problems:
1. Could not download the pretrained checkpo…
-
Mamba2 is provided in the classification folder; how can it be applied to VMamba?
-
Hello,
Could you provide the weights needed to try SimpleYOLOWorldDetector with an image embedding as input instead of text? When I load from "yolow-v8_l_clipv2_frozen_t2iv2_bn_o365_goldg_pretrain.pt…
-
Hello Xingyu, may I ask how to extract the mask of the entire image using the pre-trained model you provided?
-
Is there any plan to share the implementation of the transformer or MLP-like backbone?
-
Platforms: rocm
This test was disabled because it is failing on main branch ([recent examples](https://torch-ci.com/failure?failureCaptures=%5B%22test%2Fdistributed%2F_composable%2Ffsdp%2Ftest_full…