CR-Gjx / LeakGAN

The code for the paper "Long Text Generation via Adversarial Training with Leaked Information" (AAAI 2018). Text generation using GANs and hierarchical reinforcement learning.
https://arxiv.org/abs/1709.08624

A little confused about BLEU #6

Closed: hscspring closed this issue 6 years ago

hscspring commented 6 years ago

I am a little confused about the BLEU calculation, nltk.translate.bleu_score.sentence_bleu(reference, h, weight), in which reference is save/realtest_coco.txt and h is save/generator_sample.txt.
Does it place any requirements on realtest_coco.txt, such as its size and contents?
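For context, here is a minimal sketch of how I understand that call is used over the two files (assuming whitespace-tokenized lines; the BLEU-3 weights and the averaging at the end are my assumptions, not necessarily what the repo does):

```python
# Minimal sketch of the BLEU evaluation as I understand it
# (file names are from the repo; tokenization and weights are assumptions).
import nltk

# Each line of realtest_coco.txt is treated as one reference sentence.
with open("save/realtest_coco.txt") as f:
    references = [line.strip().split() for line in f]

# Each line of generator_sample.txt is one generated hypothesis.
with open("save/generator_sample.txt") as f:
    hypotheses = [line.strip().split() for line in f]

# BLEU-3 weights as an example; the repo may use a different n-gram order.
weight = (1.0 / 3, 1.0 / 3, 1.0 / 3)

scores = [
    nltk.translate.bleu_score.sentence_bleu(references, h, weight)
    for h in hypotheses
]
print("average BLEU:", sum(scores) / len(scores))
```

In this reading, every line of realtest_coco.txt acts as a reference for each generated sentence, which is why I wonder whether its size and contents matter.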

Also, about this sentence in the paper:

In each step, it receives generator D’s high-level feature representation, e.g., the feature map of the CNN, and uses it to form the guiding goal for the WORKER module in that timestep.

I am not sure whether it should be "generator" or "discriminator".

By the way, it's really nice work, so many thanks to you.

CR-Gjx commented 6 years ago
  1. The details can be found in the "Image COCO" subsection of the paper: we randomly choose 80,000 sentences from the COCO dataset as the training set, and another 5,000 randomly chosen sentences serve as the test set (save/realtest_coco.txt). A rough sketch of such a split is shown after this list.
  2. The CNN is the discriminator, and the WORKER module belongs to the generator. Thanks for your attention; I hope the above answers your questions.
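In case it helps, here is a minimal sketch of that kind of random split. The sizes mirror point 1 above; the input file coco_captions.txt and the training output path are placeholders, not necessarily what the repo's actual preprocessing uses:

```python
# Sketch of randomly splitting COCO captions into train / test files.
# Sizes follow the paper (80,000 train, 5,000 test); file names other than
# save/realtest_coco.txt are placeholders.
import random

with open("coco_captions.txt") as f:
    sentences = [line.strip() for line in f if line.strip()]

random.shuffle(sentences)
train = sentences[:80000]
test = sentences[80000:85000]

with open("save/realtrain_coco.txt", "w") as f:
    f.write("\n".join(train) + "\n")
with open("save/realtest_coco.txt", "w") as f:
    f.write("\n".join(test) + "\n")
```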
hscspring commented 6 years ago

I get it.
For '1', I had seen the details in the paper; as a beginner I was just a little confused, no doubts :).
Thanks for your answer, and thank you so much.