-
- https://arxiv.org/abs/1609.05473
- 2017 AAAI
Generative Adversarial Nets (GANs), which train a generative model with the help of a discriminative model, have achieved great success as a new way of training generative models for real-valued data.
However, they have limitations when the goal is to generate sequences of discrete tokens.
The main reason is…
e4exp updated 3 years ago
-
I'm trying to reproduce the Poem BLEU-2 result in the SeqGAN paper, but I couldn't find the vocabulary size used in the paper. The RankGAN paper uses a different dataset with a size of 13,123…
zl1zl updated 5 years ago
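The exact reference set, tokenization, and vocabulary for the Poem result aren't specified in this thread, but BLEU-2 is commonly computed with NLTK using uniform weights over 1-gram and 2-gram precision. A minimal sketch with placeholder data:

```python
# Minimal sketch of a BLEU-2 computation with NLTK; the reference set,
# tokenization, and smoothing used for the paper's Poem result are not
# specified here, so these inputs are placeholders.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Each hypothesis is a token list; each entry in `references` is a list of
# reference token lists for the corresponding hypothesis.
references = [[["床", "前", "明", "月", "光"]]]
hypotheses = [["床", "前", "看", "月", "光"]]

smoothie = SmoothingFunction().method1
bleu2 = corpus_bleu(
    references,
    hypotheses,
    weights=(0.5, 0.5),           # uniform weights over 1-gram and 2-gram precision
    smoothing_function=smoothie,  # avoids zero scores on very short sequences
)
print(f"BLEU-2: {bleu2:.4f}")
```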
-
https://arxiv.org/abs/1611.01626
-
## Abstract
#### Problem
- GAN has had considerable success in generating real-valued data. **However, it has limitations when the goal is to generate sequences of discrete tokens.**
1. the discre…
hon9g updated 5 years ago
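The workaround proposed in the paper is to treat the generator as a stochastic policy and update it with policy gradients, using the discriminator's judgment as the reward, which sidesteps differentiating through discrete samples. A minimal sketch of that style of generator update (tensor names are placeholders, not the paper's code):

```python
# Minimal sketch of a REINFORCE-style generator update in the spirit of SeqGAN:
# sampled discrete tokens cannot be backpropagated through directly, so the
# generator is treated as a policy and its log-probabilities are weighted by a
# reward derived from the discriminator. All names are placeholders.
import torch
import torch.nn.functional as F

def policy_gradient_loss(logits, sampled_tokens, rewards):
    """logits: (batch, seq_len, vocab); sampled_tokens: (batch, seq_len);
    rewards: (batch, seq_len) per-token rewards, e.g. from Monte Carlo rollouts."""
    log_probs = F.log_softmax(logits, dim=-1)
    # log-probability of the tokens that were actually sampled
    token_log_probs = log_probs.gather(-1, sampled_tokens.unsqueeze(-1)).squeeze(-1)
    # REINFORCE: maximize expected reward == minimize negative reward-weighted log-prob
    return -(token_log_probs * rewards).mean()
```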
-
I am new to SeqGAN. What if I want to use my own training data, such as Chinese poems, for practice? How can I change the code? Thanks a lot ~
dieey updated 7 months ago
-
I'd like to generate my own custom data with SeqGAN. Could you give more information about the `real.data` file needed for that?
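The original implementation appears to expect the real-data file to contain one sequence per line, written as space-separated integer token ids of a fixed length. The sketch below shows one hedged way to produce such a file from raw text; the vocabulary handling is illustrative, not the repo's actual code.

```python
# Hedged sketch: assuming `real.data` follows the original SeqGAN convention of
# one sequence per line, written as space-separated integer token ids padded to
# a fixed length. Vocabulary construction here is illustrative only.
SEQ_LEN = 20
PAD_ID = 0

def build_vocab(lines):
    vocab = {"<pad>": PAD_ID}
    for line in lines:
        for tok in line.split():
            vocab.setdefault(tok, len(vocab))
    return vocab

def write_real_data(lines, vocab, path="real.data"):
    with open(path, "w") as f:
        for line in lines:
            ids = [vocab.get(tok, PAD_ID) for tok in line.split()][:SEQ_LEN]
            ids += [PAD_ID] * (SEQ_LEN - len(ids))  # pad to fixed length
            f.write(" ".join(map(str, ids)) + "\n")

corpus = ["two households both alike in dignity", "in fair verona where we lay our scene"]
vocab = build_vocab(corpus)
write_real_data(corpus, vocab)
```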
-
https://github.com/ZiJianZhao/SeqGAN-PyTorch/blob/master/main.py#L87
Why is `exp` used here? In the original implementation by LantaoYu, `exp` is not applied to the loss values.
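One possible explanation (not verified against the linked line): if the discriminator's forward pass ends in `log_softmax`, its outputs are log-probabilities, so an `exp` is needed to map them back to probabilities before they are used as rewards, whereas an implementation whose discriminator already returns probabilities would skip it. A hedged illustration of that relationship:

```python
# Hedged illustration (not the code at the linked line): if a discriminator's
# forward pass ends in log_softmax, its outputs are log-probabilities, and
# torch.exp recovers probabilities before they are used as rewards.
import torch
import torch.nn.functional as F

logits = torch.randn(4, 2)                  # (batch, 2 classes: fake/real)
log_probs = F.log_softmax(logits, dim=-1)   # what a log_softmax head returns
probs = torch.exp(log_probs)                # back to probabilities in [0, 1]
reward = probs[:, 1]                        # probability of the "real" class
print(reward)
```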
-
Hi @Anjaney1999,
I was looking at your code and trying to find the adversarial loss in the generator training scheme:
https://github.com/Anjaney1999/image-captioning-seqgan/blob/10e60ad272070dd90f29004…
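For context, in SeqGAN-style training the adversarial signal usually does not appear as a separate differentiable loss term in the generator step; the discriminator enters through the reward of a policy-gradient update (as sketched above), with per-token rewards often estimated by Monte Carlo rollouts. A hedged sketch of that reward estimation, with hypothetical `generator.rollout` and `discriminator` interfaces:

```python
# Hedged sketch of Monte Carlo rollout rewards in the spirit of SeqGAN: each
# prefix of a generated sequence is completed several times by the generator,
# and the discriminator's score on the completions is averaged into a per-token
# reward. `generator.rollout` and `discriminator` are hypothetical interfaces.
import torch

def rollout_rewards(generator, discriminator, sequences, num_rollouts=16):
    batch_size, seq_len = sequences.size()
    rewards = torch.zeros(batch_size, seq_len)
    for t in range(1, seq_len + 1):
        for _ in range(num_rollouts):
            # complete each length-t prefix to a full sequence with the generator
            completed = generator.rollout(sequences[:, :t], seq_len)
            # discriminator returns P(real) per sequence, shape (batch,)
            rewards[:, t - 1] += discriminator(completed)
    return rewards / num_rollouts
```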
-
Based on my understanding, GPT and GPT-2 are trained with a language-model loss to generate text, which does not involve a GAN.
So which is better: GPT, or RelGAN/LeakGAN/SeqGAN/TextGAN?
I am so conf…
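For contrast with the policy-gradient sketches above: GPT-style models are indeed trained purely by maximum likelihood, i.e. teacher-forced next-token cross-entropy, with no discriminator in the loop. A minimal sketch of that objective:

```python
# Minimal sketch of the maximum-likelihood (language-model) objective GPT-style
# models are trained with: next-token cross-entropy under teacher forcing,
# no discriminator or adversarial reward involved.
import torch
import torch.nn.functional as F

def lm_loss(logits, tokens):
    """logits: (batch, seq_len, vocab); tokens: (batch, seq_len) ground truth."""
    shifted_logits = logits[:, :-1, :]   # predict token t+1 from positions <= t
    targets = tokens[:, 1:]
    return F.cross_entropy(
        shifted_logits.reshape(-1, shifted_logits.size(-1)),
        targets.reshape(-1),
    )
```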
-
Hello, and thank you for open-sourcing your code!
I'd like to ask: does the discriminator in SeqGAN use an RNN? Why not a CNN?
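For reference, the discriminator described in the original SeqGAN paper is a CNN text classifier: convolutions of several widths over embedded token sequences, followed by max-over-time pooling and a binary real/fake output (the paper also adds a highway layer, omitted here). A minimal sketch with illustrative sizes:

```python
# Minimal sketch of a CNN text discriminator of the kind described in the
# SeqGAN paper: convolutions of several widths over embedded token sequences,
# max-over-time pooling, and a binary real/fake output. Sizes are illustrative.
import torch
import torch.nn as nn

class CNNDiscriminator(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, filter_sizes=(2, 3, 4), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.convs = nn.ModuleList(
            [nn.Conv2d(1, num_filters, (fs, emb_dim)) for fs in filter_sizes]
        )
        self.fc = nn.Linear(num_filters * len(filter_sizes), 2)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        x = self.embedding(tokens).unsqueeze(1)     # (batch, 1, seq_len, emb_dim)
        pooled = [
            torch.relu(conv(x)).squeeze(-1).max(dim=-1).values  # max over time
            for conv in self.convs
        ]
        return self.fc(torch.cat(pooled, dim=-1))   # (batch, 2) real/fake logits
```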