-
I'm just trying to train the model on my own dataset and I keep getting this error:
Traceback (most recent call last):
File "train.py", line 95, in
main(args.mode)
File "train.py", line…
-
Creating a placeholder issue to integrate open-domain long-form question answering (LFQA) with Haystack.
I feel it is very relevant to Haystack.
Hopefully we will soon see a good implementation in …
-
**System information**
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 1…
-
I am trying to fine-tune the ernie-gen base model for the squad-qg task, but I get the following error.
----------- Configuration Arguments -----------
current_node_ip: 11.111.0.8
log_prefix:
node_…
-
dec_cell = tf.contrib.seq2seq.DynamicAttentionWrapper(dec_cell,
AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'DynamicAttentionWrapper'
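For reference, newer TensorFlow 1.x releases renamed `DynamicAttentionWrapper` to `AttentionWrapper` in `tf.contrib.seq2seq`, which is what produces this AttributeError. A minimal sketch of the updated call, assuming TF 1.x graph mode; the attention mechanism, unit sizes, and `encoder_outputs` placeholder are illustrative, not taken from the original script:

```python
import tensorflow as tf

# Illustrative encoder output of shape (batch, time, units).
encoder_outputs = tf.placeholder(tf.float32, [None, None, 512])

# The old DynamicAttentionWrapper is now AttentionWrapper, which takes
# a cell plus an explicit attention mechanism.
attention = tf.contrib.seq2seq.LuongAttention(
    num_units=512, memory=encoder_outputs)
dec_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(512), attention)
```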
-
```
WARNING!!! Argument "--load_fn" is not found in saved model. Use current value: ckpts/transformer_model_test.05.4.78-119.61.4.75-115.20.pth
WARNING!!! You changed value for argument "--model_fn"…
```
-
Hi,
Lately I've been working on an implementation of Relative Position Representations (RPR), proposed by [Shaw et al. (2018)](https://arxiv.org/pdf/1803.02155.pdf), for the Transformer model. By d…
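For context, a minimal single-head sketch of the RPR mechanism from the paper (learned embeddings for clipped relative distances added to the attention logits, eq. 5); this is not the issue author's implementation, and the class and parameter names are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativeSelfAttention(nn.Module):
    """Single-head self-attention with relative position
    representations (Shaw et al., 2018) added to the keys."""
    def __init__(self, d_model, max_relative_position=16):
        super().__init__()
        self.d = d_model
        self.k = max_relative_position
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One embedding per clipped relative distance in [-k, k].
        self.rel_k = nn.Embedding(2 * self.k + 1, d_model)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        b, n, _ = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Clipped relative distances, shifted to non-negative indices.
        pos = torch.arange(n, device=x.device)
        rel = (pos[None, :] - pos[:, None]).clamp(-self.k, self.k) + self.k
        a_k = self.rel_k(rel)  # (n, n, d_model)
        # Content term plus relative-position term.
        logits = q @ k.transpose(-2, -1)                   # (b, n, n)
        logits = logits + torch.einsum('bnd,nmd->bnm', q, a_k)
        attn = F.softmax(logits / self.d ** 0.5, dim=-1)
        return attn @ v
```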
-
Hi Ben,
As you advised, I tried creating a seq2seq model using a pretrained BERT model, following your tutorials:
https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/6%20-%20Transfor…
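As a point of comparison, one common way to warm-start a seq2seq model from pretrained BERT is Hugging Face's `EncoderDecoderModel`. A minimal sketch, not the tutorial's code; the checkpoint name is just an example, and the cross-attention layers start untrained, so the model needs fine-tuning before it produces sensible output:

```python
from transformers import BertTokenizer, EncoderDecoderModel

tok = BertTokenizer.from_pretrained("bert-base-uncased")
# Warm-start both encoder and decoder from BERT; the decoder's
# cross-attention weights are freshly initialized.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased")
model.config.decoder_start_token_id = tok.cls_token_id
model.config.pad_token_id = tok.pad_token_id

inputs = tok("a source sentence to transform", return_tensors="pt")
out = model.generate(inputs.input_ids, max_length=20)
print(tok.decode(out[0], skip_special_tokens=True))
```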
-
Hi,
I tried creating a seq2seq model using a pretrained BERT model, following your tutorials:
https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/6%20-%20Transformers%20for%20Sentime…
-
# 🚀 Feature request
Seq2Seq models that make use of `generate()` usually allow `past_key_values` to be cached for both the cross-attention layer and the uni-directional decoder self-attenti…
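To illustrate the caching being discussed, a minimal greedy-decoding loop that reuses `past_key_values` across steps so each step only processes the newest token; `facebook/bart-base` and the loop details are illustrative, not part of the request:

```python
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tok = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
model.eval()

enc = tok("Caching past keys and values speeds up decoding.",
          return_tensors="pt")
# Run the encoder once; its output is reused at every decoding step.
encoder_outputs = model.get_encoder()(
    input_ids=enc.input_ids, attention_mask=enc.attention_mask)

decoder_ids = torch.tensor([[model.config.decoder_start_token_id]])
past = None
with torch.no_grad():
    for _ in range(20):
        out = model(encoder_outputs=encoder_outputs,
                    attention_mask=enc.attention_mask,
                    # With a cache, feed only the most recent token.
                    decoder_input_ids=decoder_ids[:, -1:]
                                      if past is not None else decoder_ids,
                    past_key_values=past,
                    use_cache=True)
        past = out.past_key_values  # cached self- and cross-attention states
        next_id = out.logits[:, -1].argmax(-1, keepdim=True)
        decoder_ids = torch.cat([decoder_ids, next_id], dim=-1)
        if next_id.item() == model.config.eos_token_id:
            break

print(tok.decode(decoder_ids[0], skip_special_tokens=True))
```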