Closed kuntzer closed 7 years ago
Nice!
Done. This is now part of the autoplots with, if available, the correct labelling of the inputs and outputs. Will merge soon.
Done!
Just thinking (not urgent -- once all the rest is working nicely): could be interesting to
Further points:
A netviz animation (this one does not yet start at iteration 0, and simultaneously optimizes the Mult and the Sum layer): https://uni-bonn.sciebo.de/index.php/s/DqbiMgTByiEkFcy
A summary of the above points and a few others:
Nice -- I really like this plot, I guess we'll also use it in papers!
I would say that
With this option, I guess that multnets don't even need special handling. They are just normal nets which happen to have a mult layer specially initialized.
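A minimal sketch of that idea, assuming hypothetical names (the `Layer` class, the `mode` keyword, and the initialization scheme below are illustrative assumptions, not Tenbilac's actual API): a "multnet" is just an ordinary stack of layers where one layer happens to start in multiplicative mode.

```python
import numpy as np

class Layer:
    """Toy layer that either sums its weighted inputs or multiplies
    inputs raised to the weights. Illustrative only -- the names and
    the initialization are assumptions, not Tenbilac's actual code."""
    def __init__(self, nin, nout, mode="sum"):
        self.mode = mode
        # A mult layer is "specially initialized": weights start at 1
        # (identity for products), while sum layers start at 0.
        self.weights = np.ones((nout, nin)) if mode == "mult" else np.zeros((nout, nin))

    def run(self, x):
        if self.mode == "sum":
            return self.weights @ x
        # mult mode: product over inputs of x_j ** w_ij
        return np.prod(x[np.newaxis, :] ** self.weights, axis=1)

# No special "multnet" class needed: just a normal net whose first
# layer was initialized in mult mode.
net = [Layer(2, 3, mode="mult"), Layer(3, 1, mode="sum")]
```

The point being that everything downstream (training, plotting) can treat such a net exactly like any other.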
working on this in branch viznet
Should be taking a net rather than a Training
mmmh, that would be a big change: a net does not know the error function or the in-/output names. The plot would be less interesting. I could imagine a very adaptive plot that could take a train or a net, but the polymorphism in Python is a bit messy.
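One way such an adaptive plot could be sketched is with duck typing, unwrapping a training object if one is passed. All names here (`Net`, `Training`, the `net` and `opterrs` attributes) are assumptions for illustration, not Tenbilac's actual attribute names.

```python
class Net:
    """Stand-in for a net object (illustrative only)."""
    def __init__(self, layers):
        self.layers = layers

class Training:
    """Stand-in for a training object wrapping a net (illustrative only)."""
    def __init__(self, net, opterrs):
        self.net = net
        self.opterrs = opterrs  # hypothetical history of optimization errors

def unwrap(obj):
    """Accept either a Training or a bare Net; return (net, extra_textline).

    Duck-typed: anything with a .net attribute is treated as a training,
    and we can then also report training-only info such as the last error.
    """
    if hasattr(obj, "net"):
        return obj.net, "err = {:.3e}".format(obj.opterrs[-1])
    return obj, None  # assume it is already a net

net = Net(layers=["sum", "sum"])
train = Training(net, opterrs=[0.5, 0.01])
```

The plotting function itself would then only ever deal with a net, plus an optional text line.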
The plot looks like this now:
if the net contains only summation layers:
if the net contains both modes:
The title is an argument that is by default "default". This will generate the string shown in the plots, but any other string can be passed to title.
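A minimal sketch of that sentinel behaviour (the `make_title` helper, the `Net` stub, and the `inames`/`onames` attributes are assumptions for illustration; the source confirms a net knows its inames and onames, but not this exact code):

```python
class Net:
    """Stand-in for a net that knows its input and output names."""
    def __init__(self, inames, onames):
        self.inames, self.onames = inames, onames

def make_title(net, title="default"):
    """If title is the sentinel string "default", build an automatic
    title from the net's input/output names; any other string is used
    verbatim. Sketch only -- not Tenbilac's actual function."""
    if title == "default":
        return "Tenbilac net: {} -> {}".format(
            ", ".join(net.inames), ", ".join(net.onames))
    return title
```

So callers get a sensible automatic title for free, but can override it with any string.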
Other suggestions?
But this plot is really about a net, and not about a training! Ah yes, a net knows and nicely handles its inames and onames! Information about the cost function or iteration number should be passed in via a "title" or "textlines" argument. Note that the first part of the current title is in fact what is shown in the plot.
Okay, done. The changes were easy enough.
Let's close this. (at least for now)
I wrote a script that produces TensorFlow Playground-like plots for Tenbilac. I will add this to tenbilac.plot.py in a new branch (#16).