-
When I run the training command,
$ bazel-bin/textsum/seq2seq_attention \
--mode=train \
--article_key=article \
--abstract_key=abstract \
--data_path=data/training-* \
--voca…
-
## Interpreting textsum decode params & results
I ran textsum decode with the following parameters:
`beamsize=2`
`max_decode_steps=100`
`decode_batches_per_ckpt=8`
I ran the textsum decoder with the …
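
For readers puzzling over what these flags actually control, here is a minimal, schematic beam-search sketch (my own illustration, not the textsum decoder): `beam_size` bounds how many hypotheses survive each step, and `max_decode_steps` caps the number of decoding steps. `decode_batches_per_ckpt` is simply how many input batches are decoded for each checkpoint the decoder loads, so it does not appear below; `step_fn`, `start_token`, and `end_token` are placeholder names.

```python
def beam_search_sketch(step_fn, start_token, end_token,
                       beam_size=2, max_decode_steps=100):
  """Schematic beam search, not the textsum implementation.

  step_fn(tokens) is assumed to return (next_token, log_prob) pairs.
  beam_size bounds how many hypotheses survive each step;
  max_decode_steps caps how many decoding steps are taken.
  """
  beams = [([start_token], 0.0)]   # (token sequence, cumulative log-prob)
  finished = []
  for _ in range(max_decode_steps):
    candidates = []
    for tokens, score in beams:
      for tok, logp in step_fn(tokens):
        candidates.append((tokens + [tok], score + logp))
    # Keep only the beam_size best partial hypotheses.
    candidates.sort(key=lambda c: c[1], reverse=True)
    beams = []
    for tokens, score in candidates[:beam_size]:
      (finished if tokens[-1] == end_token else beams).append((tokens, score))
    if not beams:
      break
  return max(finished or beams, key=lambda c: c[1])
```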
-
-
## Please let us know which model this issue is about (specify the top-level directory)
I've trained the model using the command provided, but I don't see any 'train' folder in the 'textsum/log_root/' di…
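
One quick way to check whether the trainer has actually written checkpoints is to query the directory that was passed as `--train_dir`. The path below assumes the README's `textsum/log_root/train`; substitute whatever your run was given.

```python
import tensorflow as tf

# Path assumes the README's --train_dir=textsum/log_root/train;
# replace it with whatever --train_dir your training run was given.
ckpt = tf.train.latest_checkpoint('textsum/log_root/train')
print(ckpt)  # None means no checkpoint has been written there yet.
```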
-
In `textsum/batch_reader.py`, each thread creates its own `input_gen` like this:
`input_gen = self._TextGenerator(data.ExampleGen(self._data_path))`
In my experiment, this could lead to dupli…
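
A minimal illustration of the concern, with stand-in generators rather than the textsum code: when every thread constructs its own generator over the same `data_path`, each thread re-reads the same example stream. Sharing a single generator behind a lock is one possible way (an assumption, not the repo's fix) to hand each example to exactly one thread.

```python
import itertools
import threading

def example_gen():
  # Stand-in for data.ExampleGen(self._data_path): every call builds a
  # fresh iterator over the same example stream.
  return iter(range(6))

# Pattern as in batch_reader.py: each thread gets its own generator,
# so both "threads" here see identical examples -> duplicates downstream.
per_thread = [list(itertools.islice(example_gen(), 3)) for _ in range(2)]
print(per_thread)   # [[0, 1, 2], [0, 1, 2]]

# One possible alternative: share a single generator and guard it with a
# lock so each example is consumed by exactly one thread.
shared = example_gen()
lock = threading.Lock()
consumed = [[], []]

def worker(i):
  while True:
    with lock:
      ex = next(shared, None)
    if ex is None:
      return
    consumed[i].append(ex)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
  t.start()
for t in threads:
  t.join()
print(consumed)     # every example appears in exactly one of the two lists
```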
-
## Please let us know which model this issue is about (specify the top-level directory)
The standard recordio record is: the first 8 bytes are the message length, then 4 bytes for a CRC, and the rest is the message body.
bu…
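
For concreteness, a small sketch of the framing described above (my own illustration, not code from the repo). The CRC details here are a placeholder assumption: a plain crc32 of the length header, whereas the TFRecord format proper uses a masked CRC32C and also appends a CRC of the body.

```python
import binascii
import struct

def write_record(f, body):
  # 8-byte little-endian length, 4-byte CRC, then the message body.
  # The CRC variant is an assumption for illustration only.
  header = struct.pack('<Q', len(body))
  f.write(header)
  f.write(struct.pack('<I', binascii.crc32(header) & 0xffffffff))
  f.write(body)

def read_record(f):
  header = f.read(8)
  if len(header) < 8:
    return None                      # end of file
  (length,) = struct.unpack('<Q', header)
  f.read(4)                          # CRC, not verified in this sketch
  return f.read(length)
```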
-
Hello,
I've successfully trained and tested the textsum model on the toy data (data/data) and would like to test the model on my own data. There is a very brief comment in the DataSet section about …
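
A minimal sketch of one way to write such data, assuming each record is a serialized `tf.Example` with `article` and `abstract` bytes features prefixed by its 8-byte length. The exact framing, the output file name, and the sentence tags are assumptions here; double-check them against `data_convert_example.py` and the toy data before relying on this.

```python
import struct
import tensorflow as tf

def write_textsum_example(f, article, abstract):
  # Assumed framing: a tf.Example with 'article' and 'abstract' bytes
  # features, written as an 8-byte length followed by the serialized proto.
  ex = tf.train.Example(features=tf.train.Features(feature={
      'article': tf.train.Feature(bytes_list=tf.train.BytesList(value=[article])),
      'abstract': tf.train.Feature(bytes_list=tf.train.BytesList(value=[abstract])),
  }))
  serialized = ex.SerializeToString()
  f.write(struct.pack('q', len(serialized)))
  f.write(serialized)

# Hypothetical output path; the <s>...</s> sentence tags mirror what the
# toy data appears to use (see SENTENCE_START / SENTENCE_END in data.py).
with open('data/my-training-0', 'wb') as f:
  write_textsum_example(f,
                        b'<s> some article text . </s>',
                        b'<s> some abstract text . </s>')
```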
-
When I try running textsum in decode mode, it prints out some vectors rather than doing the work. I traced the problem and found that in the seq2seq_attention_model.py file, under the decode_topk funct…
-
I have found a bug in seq2seq_attention_model.py: the fourth parameter of the seq2seq_lib.sampled_sequence_loss function is a boolean variable, so the correct call is sampled_sequence_loss(
…
-
In the textsum model, the README says
`# Run the eval. Try to avoid running on the same machine as training.`
Why is this?
Also, are questions allowed in the issues?