Closed sl403 closed 4 years ago
Sorry, another question: if I want to use a trained model (models/110518_multi) to make predictions, how should I do that? Thanks!
Hi, thanks for your interest in this work!
The current model uses word+POS-tag features by combining them into joint "word_POS" tokens (see here). You can do the same for prosody or change the architecture so that there's separate LSTM encoders for each signal.
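The joint-token idea described above can be sketched in a few lines. This is a hedged illustration, not the repository's actual code; the function name and separator are assumptions. The same pattern could fold a discretized prosody label into the token instead of (or alongside) the POS tag.

```python
# Sketch (hypothetical helper, not the repo's code): merge each word with its
# POS tag into a single joint "word_POS" token, as described above.
def make_joint_tokens(words, pos_tags, sep="_"):
    """Merge parallel word and POS sequences into joint tokens."""
    assert len(words) == len(pos_tags), "sequences must be aligned"
    return [w + sep + p for w, p in zip(words, pos_tags)]

tokens = make_joint_tokens(["i", "uh", "mean"], ["PRP", "UH", "VBP"])
# tokens == ["i_PRP", "uh_UH", "mean_VBP"]
```

An alternative to joint tokens is the separate-encoder architecture mentioned above, where each signal gets its own LSTM and the hidden states are concatenated.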
You can interact with the model using this script, running predictions for a dataset is done using this script.
Please let me know if there's anything else you want to know.
Thanks for your reply.
I tried to use your predict.py script to run predictions with the trained model, but it doesn't work:
#> python predict.py swbd/testset.json models/110518_multi result
...
Traceback (most recent call last):
File "predict.py", line 42, in <module>
main(args.dataset_folder, args.model_folder, args.result_file)
File "predict.py", line 21, in main
model, vocab, char_vocab, label_vocab = load(in_model_folder, sess)
ValueError: too many values to unpack
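For context on this ValueError: Python raises it when the number of names on the left of an assignment doesn't match the number of values being unpacked. A minimal, hypothetical illustration (`load_five` is invented for the example, not the repository's `load`):

```python
# Hypothetical illustration of the error class above: unpacking four names
# from a function that returns five values.
def load_five():
    # pretend load() started returning an extra item the caller doesn't expect
    return "model", "vocab", "char_vocab", "label_vocab", "extra"

try:
    model, vocab, char_vocab, label_vocab = load_five()
except ValueError as err:
    print(err)  # too many values to unpack (expected 4)
```

So the likely cause is that `load` in the repo changed its return signature without the call site in predict.py being updated.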
Hey, thanks for the bug report!
I fixed some bugs - could you please git pull and try again? Also reinstall the requirements, in case of any tensorflow/tensorboard import errors.
Hi, thank you for fixing the bugs.
I rebuilt the conda environment with the new requirements.txt, but the following error occurred:
#>python predict.py swbd/testset.json models/110518_multi result
...
Done loading
Processed 0 out of 1455 batches
Processed 1000 out of 1455 batches
Traceback (most recent call last):
File "predict.py", line 43, in <module>
main(args.dataset_folder, args.model_folder, args.result_file)
File "predict.py", line 27, in main
y_pred = predict(model, (X, y), rev_label_vocab, sess)
File "/home/user/multitask_disfluency_detection/dialogue_denoiser_lstm.py", line 287, in predict
predictions = map(rev_label_vocab_main_task.get, y_pred_main_task)
AttributeError: 'unicode' object has no attribute 'get'
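This AttributeError suggests that, at this call site, the variable expected to hold the reverse label vocabulary (a dict) actually holds a string, for example because return values were unpacked in the wrong order. A hypothetical illustration of the failure mode:

```python
# map(d.get, seq) works when d is a dict, but fails with AttributeError
# if d is actually a string (hypothetical values for illustration).
rev_label_vocab = {0: "O", 1: "<e/>"}
y_pred = [0, 1, 0]

labels = list(map(rev_label_vocab.get, y_pred))  # ['O', '<e/>', 'O']

bad = "not_a_dict"
try:
    list(map(bad.get, y_pred))
except AttributeError as err:
    print(err)  # 'str' object has no attribute 'get'
```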
Please try now, it might finally work :) thank you for your patience!
Thank you for your quick response.
The train.py and predict.py scripts now run perfectly, but evaluate.py still seems unable to run normally:
#> python evaluate.py models/110518_multi deep_disfluency/deep_disfluency/data/disfluency_detection/switchboard/swbd_disf_heldout_data_timings.csv deep_disfluency
...
input file has timings
loading data deep_disfluency/deep_disfluency/data/disfluency_detection/switchboard/swbd_disf_heldout_data_timings.csv
loaded 102 sequences
Processed 0 out of 1487 batches
Processed 1000 out of 1487 batches
Traceback (most recent call last):
File "evaluate.py", line 62, in <module>
main(args.dataset, args.model_folder, args.mode)
File "evaluate.py", line 45, in main
sess)
File "/home/user/multitask_disfluency_detection/dialogue_denoiser_lstm.py", line 442, in eval_deep_disfluency
target_file_path=increco_file)
File "/home/user/multitask_disfluency_detection/dialogue_denoiser_lstm.py", line 380, in predict_increco_file
in_session)
File "/home/user/multitask_disfluency_detection/dialogue_denoiser_lstm.py", line 287, in predict
predictions = map(rev_label_vocab_main_task.get, y_pred_main_task)
AttributeError: 'tuple' object has no attribute 'get'
Hi, this is fixed now - could you please try again?
Hi, I have another question, about post_train_lm.py. How is post_train_lm.py used in your program? In lines 70-72 of the code, LMs are required as input data - how should they be generated? Thank you.
Hi, I don't think post_train_lm.py was used in the paper - I did some follow-up experiments with this technique, but that's about it. Therefore, no guarantees on it being in a working state :)
Okay, I see.
Is it possible to evaluate the quality of the LM from your experiments? Thank you.
In the existing code, I believe we only report the combined loss (tagging + LM + L2), and after training we disregard the LM head altogether, so you'd have to write this logic yourself.
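The combined objective described above could be sketched roughly as follows. This is a hedged sketch only: the function name, the unweighted sum, and the `l2_lambda` value are assumptions for illustration, not the repository's actual training code.

```python
import numpy as np

def combined_loss(tagging_loss, lm_loss, weights, l2_lambda=1e-4):
    """Sum the tagging and LM losses with an L2 penalty on the weights
    (hypothetical sketch of a multitask objective)."""
    l2_penalty = l2_lambda * sum(float(np.sum(w ** 2)) for w in weights)
    return tagging_loss + lm_loss + l2_penalty

w = [np.ones((2, 2))]
loss = combined_loss(0.5, 1.2, w)  # 0.5 + 1.2 + 1e-4 * 4 = 1.7004
```

To evaluate the LM head on its own, one would report the `lm_loss` term (e.g. as perplexity) separately instead of only logging the combined sum.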
ok, thanks for your reply.
Hi,
Is it possible to add other input features (such as prosodic features) to your multitask disfluency detection experiment? Thank you.