Open cryptexcode opened 6 years ago
Can you provide the full code? More than just `sess.run(metrics, compiler.build_feed_dict(test_trees))`.
I want to see your code.
I didn't code anything. I actually don't work with Tensorflow. I just wanted to get the predictions using this system.
Okay, I hope this comment can help. Let me know if it doesn't.
I will be using some of the images from my blog post: https://towardsdatascience.com/iclr-2015-striving-for-simplicity-the-all-convolutional-net-with-interactive-code-manual-b4976e206760
We can see that we have a simple graph of nine convolutional layers, right?
Now, inside the code, my best guess is that your `metrics` variable is something like accuracy.
And what you want (again, my best guess) would be the predictions themselves, i.e. the output after the softmax layer operation. In that case, it is easy to just fetch `final_soft` rather than the accuracy.
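To make the distinction concrete, here is a rough NumPy sketch (the name `final_soft`, the class count, and all the numbers are made-up assumptions from my blog-post example, not from Fold itself): fetching the metric collapses everything to a single accuracy number, while fetching the softmax output keeps the per-class probabilities for every example.

```python
import numpy as np

# Assumed: logits for 4 examples over 5 sentiment classes (made-up numbers)
logits = np.array([[2.0, 1.0, 0.1, -1.0, 0.5],
                   [0.2, 0.1, 3.0,  0.0, 0.3],
                   [1.5, 2.5, 0.0,  0.1, 0.2],
                   [0.0, 0.0, 0.0,  4.0, 0.1]])
labels = np.array([0, 2, 1, 3])  # assumed ground-truth classes

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

probs = softmax(logits)              # what fetching 'final_soft' would give you
preds = probs.argmax(axis=1)         # one predicted class per example
accuracy = (preds == labels).mean()  # what the metric fetch collapses this to

print(preds)     # per-example predictions: [0 2 1 3]
print(accuracy)  # a single number: 1.0
```

So the information you want (the per-example predictions) exists right before the metric is computed; the metric just throws it away.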
I know this is not much help, so let me know.
@cryptexcode let me know if this helped.
Hi @JaeDukSeo, thank you so much for your guidance, but I am actually clueless. Maybe if you look at this notebook, you will get the scenario: https://github.com/tensorflow/fold/blob/master/tensorflow_fold/g3doc/sentiment.ipynb As I am not very familiar with TF, I am having trouble understanding it. Here the metric thing is a function, which is very confusing.
Thank you again.
Here, that is the variable you want. Rather than calling the metrics, we can fetch that variable directly and get the softmax values. I know it is very complicated, so I'll try to learn the code as well; NLP is another area I want to get into.
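In `sess.run` terms, the change is just swapping what you fetch. Here is a toy stand-in (plain Python, since I can't run Fold here; the node names `metrics` and `final_soft` are guesses, and `FakeSession` is obviously not the real `tf.Session`):

```python
import numpy as np

def softmax(x):
    # Row-wise softmax with the usual max-subtraction for stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class FakeSession:
    """Toy stand-in for tf.Session: run(fetch, feed) evaluates one named node."""
    def run(self, fetch, feed):
        probs = softmax(feed["logits"])
        preds = probs.argmax(axis=1)
        nodes = {
            "metrics": (preds == feed["labels"]).mean(),  # accuracy only
            "final_soft": probs,                          # full softmax output
        }
        return nodes[fetch]

sess = FakeSession()
feed = {"logits": np.array([[0.1, 2.0], [1.5, 0.2]]),
        "labels": np.array([1, 0])}

# The notebook does the equivalent of this -> one accuracy number:
print(sess.run("metrics", feed))      # 1.0

# Fetch the softmax node instead -> per-example probabilities:
print(sess.run("final_soft", feed))
```

The point is only the last two calls: same feed dict, different fetch, and the second one gives you the predictions instead of the summary metric.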
Anyways keep in touch soon.
@cryptexcode one problem: I am on a Windows system, so this might take a while. I can't install TF Fold here, and the only way I can think of to make this work is to understand the code and rewrite everything.
How can I print the fine-grained predictions for the test/dev set after the training is done?
`sess.run(metrics, compiler.build_feed_dict(test_trees))` prints the accuracies, but I want the actual predictions. How can I get them? Thanks