-
def forward(self, v, b, q, labels):
"""Forward
v: [batch, num_objs, obj_dim]
b: [batch, num_objs, b_dim]
q: [batch, seq_length]
return: logits, n…
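For reference, a minimal shape sketch of the inputs this signature expects (the concrete dimension values below are placeholders, not taken from the original code):

```python
import numpy as np

# placeholder dimensions -- the real values depend on the model config
batch, num_objs, obj_dim, b_dim, seq_length = 2, 36, 2048, 6, 14

v = np.zeros((batch, num_objs, obj_dim))           # visual features per object
b = np.zeros((batch, num_objs, b_dim))             # bounding-box features per object
q = np.zeros((batch, seq_length), dtype=np.int64)  # question token ids

print(v.shape, b.shape, q.shape)
```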
-
This is either a GLM bug or a documentation bug.
For the `link` argument with `family = "binomial"`, the R help says
"binomial": "logit", "log"
But if you try `"log"`, you get:
Incompatible link function for s…
-
### Bug description
`trainer.test(model=model, ckpt_path='best')` works after `trainer.fit` but not otherwise
We get:
```
ValueError: `.test(ckpt_path="best")` is set but `ModelCheckpoint` is not …
```
-
### System Info
- `transformers` version: 4.29.2
- Platform: Linux-5.15.107+-x86_64-with-glibc2.31
- Python version: 3.10.11
- Huggingface_hub version: 0.14.1
- Safetensors version: not installed…
-
https://github.com/Kyubyong/transformer/blob/master/model.py
In this code, from lines 176 ~ 181, you are using `==` inside the TensorFlow model, which won't work as intended.
`for _ in tqdm(range(self.hp.m…
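For context, in TF1-style graph code a plain `==` between a graph tensor and a Python value falls back to object identity instead of building an elementwise comparison op. A toy stand-in class (hypothetical, just to illustrate the pitfall) shows why the condition silently never fires:

```python
class SymbolicTensor:
    """Toy stand-in for a TF1 graph tensor: it does not override
    __eq__, so `==` falls back to default identity comparison."""
    def __init__(self, name):
        self.name = name

t = SymbolicTensor("decoder_step")

# Always False unless comparing the same object, so a branch like
# `if t == 0:` inside the model never executes.
print(t == 0)  # -> False
print(t == t)  # -> True (same object)
```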
-
Hi @jotaf98,
how can we get the indices of the max scores in the logits? I want to calculate a verification loss for a classified person.
For example, let `scr` be the logits of the network for a person, and `fea_vect` is multip…
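If the goal is just the index of the maximum score per sample, `argmax` over the class dimension does it. A minimal sketch with made-up logits (NumPy for illustration; `torch.argmax(scr, dim=1)` is the PyTorch equivalent):

```python
import numpy as np

# hypothetical logits for a batch of 2 samples over 3 identity classes
scr = np.array([[0.1, 2.5, -0.3],
                [1.2, 0.0,  3.1]])

pred = np.argmax(scr, axis=1)  # index of the max score per sample
print(pred)  # -> [1 2]
```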
-
Thank you very much for your implementation of CTC variants. To be frank, I think that is the main value of this repo, and I would change its name to pytorch ctc variants or something of the like becaus…
-
Dear author,
In the `eval_foreard` function below, it does not seem to be real autoregressive decoding, since you concatenate the input and `answer_ids` together to form the new `input_ids`; it performs decoding in th…
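For contrast, true autoregressive decoding feeds back only the tokens generated so far, one step at a time. A minimal greedy sketch (the `next_token_fn` callback here is a hypothetical stand-in for the model's next-token prediction):

```python
def greedy_decode(next_token_fn, input_ids, max_new_tokens, eos_id):
    """Generate tokens one by one; the model never sees the answer."""
    ids = list(input_ids)
    for _ in range(max_new_tokens):
        tok = next_token_fn(ids)  # predict from tokens generated so far
        ids.append(tok)
        if tok == eos_id:
            break
    return ids

# toy "model": always predicts (last token + 1), stopping at 5
out = greedy_decode(lambda ids: ids[-1] + 1, [1, 2], max_new_tokens=10, eos_id=5)
print(out)  # -> [1, 2, 3, 4, 5]
```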
-
The input generation, inference, and embeddings/logits extraction functions (as appropriate) in `tfsemb_main.py` should be moved into separate scripts for `causal`, `mlm`, and `seq2seq` models.
-
I got confused with:
```python
# compute logits
anchor_dot_contrast = torch.div(
torch.matmul(anchor_feature, contrast_feature.T),
self.temperature)
…