-
Based on my understanding, GPT and GPT-2 are trained with a language-model loss to generate text, which does not involve a GAN.
So which is better: GPT, or RelGAN/LeakGAN/SeqGAN/TextGAN?
I am so conf…
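For context on the question above, the "language model loss" GPT-style models are trained with is just the average negative log-likelihood of each next token. A minimal sketch, using a fixed toy probability table in place of a real model (the names here are illustrative, not from any library):

```python
import numpy as np

def lm_loss(probs, target_ids):
    """Average negative log-likelihood of the true next tokens.

    probs: (T, V) array of next-token distributions, one row per position.
    target_ids: (T,) array of the tokens that actually came next.
    """
    picked = probs[np.arange(len(target_ids)), target_ids]
    return -np.mean(np.log(picked))

# Toy 3-word vocabulary; these rows stand in for a model's softmax outputs.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
targets = np.array([0, 1])
loss = lm_loss(probs, targets)  # -(log 0.7 + log 0.8) / 2
```

There is no discriminator anywhere in this objective, which is the point of the comparison: the GAN variants add an adversarial reward on top of (or in place of) this likelihood term.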
-
Ideas:
- SeqGAN treats its RNN generator as a reinforcement-learning agent and uses Monte Carlo search to score candidate next words.
- RelGAN: ?
- Treat the output as a single 1D chara…
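The SeqGAN idea in the first bullet can be sketched concretely: to reward a partial sequence, finish it several times with the generator policy and average the discriminator's scores on the completions. The generator and discriminator below are toy stand-ins, not the paper's networks:

```python
import random

VOCAB = [0, 1, 2]
MAX_LEN = 5

def toy_generator_step(prefix):
    # Stand-in for sampling the next token from the generator policy G.
    return random.choice(VOCAB)

def toy_discriminator(seq):
    # Stand-in for D's probability that the sequence is real; in [0, 1].
    return sum(seq) / (2 * len(seq))

def rollout_reward(prefix, n_rollouts=16):
    """Monte Carlo estimate of the reward for a partial sequence."""
    total = 0.0
    for _ in range(n_rollouts):
        seq = list(prefix)
        while len(seq) < MAX_LEN:
            seq.append(toy_generator_step(seq))
        total += toy_discriminator(seq)
    return total / n_rollouts

random.seed(0)
r = rollout_reward([1, 2])  # averaged discriminator score for this prefix
```

Repeating this for each candidate next word gives the per-word rewards that SeqGAN feeds into its policy-gradient update.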
-
Dear Dev Team,
I am using the seq_gan model to generate text sequences. I am trying to use a list of valid "subdomains" as input and generate new "text" as output. I am making changes in run/run_seqga…
-
Excuse me, I have some questions. If I want to train SeqGAN on real data like natural text (poems, articles), I cannot pre-train the generator because I do not know the distribution of the text rig…
-
Hello everyone,
I have learned that, to reduce the variance of the gradient estimator,
we usually apply the "reward baseline" technique in the gradient optimization objective, like
![image](http…
-
The figure of SentiGAN shows that its discriminator is a multi-class classifier, but in your code I think it is a binary classifier. So the difference between SentiGAN and SeqGAN is ju…
-
Hello!
This is a great effort and has been very useful for my work.
I am trying to change the number of records being generated (and saved to test_file.txt) by changing the generate_num parameter…
-
When I use my own dataset, this happens.
Could you help me?
training arguments:
>>> if_test: False
>>> run_model: seqgan
>>> k_label: 2
>>> dataset: obama
>>> model_type: vanilla
>>> loss_type: rsga…
-
Generative adversarial networks description
mhrmm updated
4 years ago
-
I checked that files named samples_MLE are being produced in the sample folder, but they consist of numbers. I guess these are the tokens of the individual words; if this is the case, can you tell me how …
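Regarding the question above: sample files like samples_MLE typically hold token ids, one sequence per line, and mapping them back to words only needs the id-to-word vocabulary the run was built with. A minimal sketch, where `id_to_word` is a made-up toy vocabulary rather than the repository's actual dictionary object:

```python
# Hypothetical id -> word mapping; a real run would load the vocabulary
# that was saved alongside the samples.
id_to_word = {0: "<pad>", 1: "the", 2: "quick", 3: "fox"}

def decode_line(line, vocab):
    """Turn a line of space-separated token ids back into words."""
    return " ".join(vocab.get(int(tok), "<unk>") for tok in line.split())

decoded = decode_line("1 2 3", id_to_word)  # -> "the quick fox"
```

Applying `decode_line` to each line of the sample file recovers readable text, with `<unk>` marking any id missing from the vocabulary.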
ghost updated
2 years ago