-
As the title says, I don't see the ground truth for text recognition in train.pt. If there are no text labels, how is the text-recognition part trained?
-
https://www.google.co.kr/url?sa=t&rct=j&q=&esrc=s&source=web&cd=2&ved=0ahUKEwiOl5Pj19LUAhVKvLwKHVpoDdcQFggvMAE&url=https%3A%2F%2Farxiv.org%2Fpdf%2F1611.00471&usg=AFQjCNEkNnTcTYyq7AI9uFuQKDHom0ai1w
CV…
-
![图片](https://user-images.githubusercontent.com/40679769/71860103-910b9980-312c-11ea-89b1-e14eef877021.png)
Location:
tree_evaluator.h, line 122
Compile error:
tree_evaluator.h(123,1): erro…
-
I'm currently writing a recurrent reinforcement learning library, with LSTMs, linear attention, etc., that I would like to add S4 to.
Unfortunately, I find S4D unable to learn even simple RL tasks (e.g. outp…
-
**There is a bug in line 70 of the test_seq2seq.py file.** The weights you save after training and the weights you try to load for testing are different.
**Please replace it with this line:**
self.model.…
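The elided fix above isn't recoverable from the excerpt, but the pattern it targets can be sketched. Below is a hypothetical, torch-free illustration (using plain `pickle` and an invented `Model` class) of the principle involved: the weights you save must be the same object you later restore into.

```python
# Hypothetical sketch (not the elided fix itself): the save path and the
# load path must reference the same weights attribute. In PyTorch-style
# code that would look like:
#   torch.save(self.model.state_dict(), path)       # save after training
#   self.model.load_state_dict(torch.load(path))    # load before testing
# Here the same pattern is shown with plain pickle so it runs anywhere.
import os
import pickle
import tempfile

class Model:
    def __init__(self):
        self.weights = {"w": 0.0}

model = Model()
model.weights["w"] = 1.5  # "training" updates the weights

# Save exactly what testing will later load: this model's weights dict.
path = os.path.join(tempfile.mkdtemp(), "ckpt.pkl")
with open(path, "wb") as f:
    pickle.dump(model.weights, f)

# At test time, restore into the same attribute that was saved.
restored = Model()
with open(path, "rb") as f:
    restored.weights = pickle.load(f)

print(restored.weights["w"])  # matches the trained value
```

The reported bug is exactly a mismatch between these two halves: saving one set of weights and loading a different one.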
-
## In a nutshell
A study that improves memory networks and builds them into an LSTM. The memory is represented as a matrix; for each row (including the input), a query (Q), key (K), and value (V) are computed, a weight (= attention) is derived from the query-key similarity and applied to the values to update the memory (softmax(QK^T)V). Its effectiveness is confirmed on supervised tasks such as language modeling and on reinforcement learning.
![image](https://user-images.gi…
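The softmax(QK^T)V update described above can be sketched in a few lines of NumPy. The shapes and the random projection matrices below are stand-ins for the paper's learned parameters, not its exact parameterization:

```python
# Minimal sketch of the described memory update: compute Q, K, V per memory
# row, weight the values by query-key similarity, softmax(Q K^T) V.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
rows, d = 4, 8                     # memory rows (input appended as a row), width
M = rng.normal(size=(rows, d))     # memory matrix

# Linear projections for query, key, value (random stand-ins for learned weights).
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = M @ Wq, M @ Wk, M @ Wv

A = softmax(Q @ K.T / np.sqrt(d))  # attention: each row attends over all rows
M_new = A @ V                      # weighted values form the updated memory

print(M_new.shape)  # (4, 8)
```

Each row of `A` sums to 1, so every memory row is updated as a convex combination of the value vectors.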
-
## In a nutshell
A study that tackles text adventure games with reinforcement learning plus a knowledge graph. Since the game branches through choices, an internal graph is updated with the observed states. Action values are produced from a graph representation (Graph Convolution + Attention) combined with a text representation (a Bi-LSTM over a fixed window); the set of candidate actions is narrowed down using the graph.
### Paper link
https://arxiv…
-
Ping me
-
Hello,
I'm using the latest release of OpenNMT, and I'm hitting a crash with no error message. I have no idea what could be wrong.
I launched it with this command:
```
th train.lua -data …
```
-
Seq2Seq(Attention)\Seq2Seq(Attention)-Tensor.py
The shape of the input should be [max_time, batch_size, ...]. The line `input = tf.transpose(dec_inputs, [1, 0, 2])` has already transposed it. In tf.e…
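The shape convention at issue can be illustrated directly. The dimensions below are arbitrary examples; NumPy's `transpose` with the permutation `(1, 0, 2)` behaves the same way as `tf.transpose(dec_inputs, [1, 0, 2])`:

```python
# Batch-major inputs are [batch_size, max_time, dim]; time-major RNN APIs
# expect [max_time, batch_size, dim], hence the permutation [1, 0, 2],
# which swaps the first two axes and leaves the feature axis in place.
import numpy as np

batch_size, max_time, dim = 2, 5, 3
dec_inputs = np.zeros((batch_size, max_time, dim))

time_major = np.transpose(dec_inputs, (1, 0, 2))
print(time_major.shape)  # (5, 2, 3)
```

Applying the transpose a second time would swap the axes back, which is why transposing an already-transposed input produces the wrong shape.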