-
Hi, when I tried to run the test, I got this error.
```bash
Unexpected key(s) in state_dict: "backbone.patch_embed.backbone.layers.3.downsample.norm.weight", "backbone.patch_embed.backbone.layers.3.down…
```
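For context, this is the error PyTorch raises when `load_state_dict` runs with `strict=True` and the checkpoint contains keys the model does not define. A minimal sketch of how it is usually narrowed down; the checkpoint path and `build_model` below are placeholders, not taken from this repo:

```python
import torch

ckpt = torch.load("checkpoints/model.pth", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)  # some checkpoints nest weights under "state_dict"

model = build_model()  # placeholder for the repo's actual model constructor

model_keys = set(model.state_dict().keys())
ckpt_keys = set(state_dict.keys())
print("unexpected keys:", sorted(ckpt_keys - model_keys)[:10])
print("missing keys:   ", sorted(model_keys - ckpt_keys)[:10])

# strict=False skips the extra keys; this is only safe when they belong to parts
# of the network (e.g. a different backbone config) the current model never uses.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
```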
-
```text
Some weights of the model checkpoint were not used when initializing CLIPTextModel:
['text_model.embeddings.position_ids']
Loading pipeline components...: 100%|█████████████████████████████████████…
```
-
### Checklist
- [x] The issue exists after disabling all extensions
- [x] The issue exists on a clean installation of webui
- [ ] The issue is caused by an extension, but I believe it is caused b…
-
Repost from the [PyTorch forum](https://discuss.pytorch.org/t/flex-attention-gaps-in-profiler/211917/1)
I have recently been playing with FlexAttention, trying to replace some of my custom Triton …
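For reference, a minimal sketch of the kind of setup the question describes: `flex_attention` compiled and run under the PyTorch profiler (PyTorch ≥ 2.5). The shapes and the trivial `score_mod` are made up for illustration, not taken from the post:

```python
import torch
from torch.nn.attention.flex_attention import flex_attention
from torch.profiler import ProfilerActivity, profile

# Toy tensors: (batch, heads, seq_len, head_dim); shapes are arbitrary.
q, k, v = (torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.bfloat16)
           for _ in range(3))

def rel_bias(score, b, h, q_idx, kv_idx):
    # Simple relative-position bias, just to exercise score_mod.
    return score + 0.01 * (kv_idx - q_idx)

flex = torch.compile(flex_attention)

with profile(activities=[ProfilerActivity.CPU, ProfilerActivity.CUDA]) as prof:
    out = flex(q, k, v, score_mod=rel_bias)

print(prof.key_averages().table(sort_by="cuda_time_total", row_limit=10))
```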
-
I downloaded cotracker2.pth locally and moved it into /.cache/torch/hub/checkpoints/ so it can be loaded. But when I run the demo "bash configs/examples/constant_motion/head6.sh", some errors occur.
Runtim…
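For reference, a small sketch of the two usual ways to make a manually downloaded CoTracker checkpoint visible; the hub entry point and paths are assumptions based on CoTracker's README, not taken from head6.sh:

```python
import os
import torch

# Option 1: let torch.hub resolve the weights; it checks
# ~/.cache/torch/hub/checkpoints/ before downloading anything.
cotracker = torch.hub.load("facebookresearch/co-tracker", "cotracker2")

# Option 2: load the local file explicitly and pass the weights to the model yourself.
ckpt_path = os.path.expanduser("~/.cache/torch/hub/checkpoints/cotracker2.pth")
state_dict = torch.load(ckpt_path, map_location="cpu")
```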
-
Hi,
Thank you for your open source work!
I have a question about loading checkpoints. I want to reproduce the code in "downstream_finetune_example". When I tried to load checkpoints from Visuali…
-
### What is your question?
## How to Pass Weights as Parameters in Flower?
I’m trying to use the Flower framework to train a YOLO model in a federated learning setting. I’m having trouble figuring…
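In case it helps frame the question: the common Flower pattern is to ship weights between server and client as a list of NumPy arrays, converting to and from the model's `state_dict` on the client. A minimal sketch with a placeholder model; the names and the elided training loop are illustrative, not from the asker's code:

```python
from collections import OrderedDict

import flwr as fl
import torch

class YoloClient(fl.client.NumPyClient):
    def __init__(self, model):
        self.model = model  # e.g. a torch.nn.Module wrapping the YOLO network

    def get_parameters(self, config):
        # Weights travel between server and clients as a list of ndarrays.
        return [v.cpu().numpy() for v in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        keys = self.model.state_dict().keys()
        state_dict = OrderedDict(
            (k, torch.tensor(v)) for k, v in zip(keys, parameters)
        )
        self.model.load_state_dict(state_dict, strict=True)

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        # ... local training loop goes here ...
        return self.get_parameters(config), 1, {}
```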
-
The error is shown below. I am using the pretrained model Apolloscape/model.ckpt provided in the README; it looks like a model size mismatch.
```shell
> python3 scripts/benchmark_val.py …
```
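A small sketch of how a size mismatch like this is usually confirmed first: compare tensor shapes between the checkpoint and a freshly built model. The model constructor and config below are placeholders, not from this repo:

```python
import torch

ckpt = torch.load("Apolloscape/model.ckpt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)

model = build_model(cfg)  # placeholder for the repo's model/config

# Print every parameter whose shape disagrees between checkpoint and model.
for name, param in model.state_dict().items():
    if name in state_dict and state_dict[name].shape != param.shape:
        print(name, tuple(state_dict[name].shape), "vs", tuple(param.shape))
```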
-
```text
No operator found for `memory_efficient_attention_forward` with inputs:
query : shape=(1, 577, 16, 64) (torch.bfloat16)
key : shape=(1, 577, 16, 64) (torch.bfloat16)
value : shape=(1, 577, 16, 64) …
```
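A small sketch of the usual workaround when no xFormers kernel matches the dtype/shape combination: fall back to PyTorch's built-in `scaled_dot_product_attention`, which takes the same tensors after a transpose. The shapes below mirror the ones in the error message:

```python
import torch
import torch.nn.functional as F

# Same layout as in the error: (batch, seq_len, num_heads, head_dim).
q = torch.randn(1, 577, 16, 64, dtype=torch.bfloat16, device="cuda")
k = torch.randn(1, 577, 16, 64, dtype=torch.bfloat16, device="cuda")
v = torch.randn(1, 577, 16, 64, dtype=torch.bfloat16, device="cuda")

# F.scaled_dot_product_attention expects (batch, heads, seq_len, head_dim),
# so transpose in, then transpose the result back.
out = F.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
).transpose(1, 2)
```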
-
```python
def build_cotracker(checkpoint=None, offline=True, window_len=16, v2=False):
    if offline:
        cotracker = CoTrackerThreeOffline(
            stride=4, corr_radius=3, window_len=window_len
…
```
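A short usage sketch for the builder above; the import path and checkpoint filename are guesses for illustration, not taken from the snippet:

```python
# Import path and checkpoint name are assumptions; adjust to the actual repo layout.
from cotracker.models.build_cotracker import build_cotracker

model = build_cotracker(checkpoint="checkpoints/cotracker_offline.pth",
                        offline=True, window_len=16)
model.eval()
```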