Closed: xlfe closed this 1 year ago
Both cases are covered (and I'm interested in feedback and ideas on how to make it even better): use `map-channel`, and then `(transfer! my-network that-channel)` to save, and `(transfer! that-channel my-network)` to load. This works with raw bytes, so it assumes that you'll only transfer data that fits your network's structure.

The `Parameters`
and `ParameterSeq` protocols in `...internal.protocols`, with methods `weights`, `bias`, and `parameters`. It's not public, because why would anyone want to look at millions of floating-point numbers, other than for debugging (and even then, it's difficult for a human to do)? Mind you, the saving/loading works on the network as a whole.

thanks @blueberry - I will have a go and provide some feedback
@blueberry thank you for all your work on this library, and the other uncomplicate projects!
I'm wondering if it is missing, or if I have missed how, but once I have a trained network, there doesn't seem to be a way to save the trained network and load it back later?
Also, for a word2vec-style use case, one needs access to the trained weights (rather than the trained network). But short of poking around the `uncomplicate.diamond.internal` ns, you don't seem to expose access to those either?
Appreciate your thoughts!
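For the word2vec-style use case above, reading the trained weights through the internal protocols might look like the sketch below. The protocol methods `parameters`, `weights`, and `bias` are named in the answer; the full namespace path is elided there ("...internal.protocols"), so the `require` below is an assumption, as is `trained-net` standing in for your trained network:

```clojure
;; Hedged sketch: the namespace path is an assumption pieced together
;; from "uncomplicate.diamond.internal" and "...internal.protocols"
;; mentioned in this thread; verify it against your version's source.
(require '[uncomplicate.diamond.internal.protocols :as p])

;; `parameters` yields the parameter-carrying layers of a network;
;; `weights` and `bias` expose each layer's underlying tensors.
(doseq [layer (p/parameters trained-net)]
  (println (p/weights layer))
  (println (p/bias layer)))
```

Since these are internal, non-public protocols, expect them to change between releases; for an embedding use case, transferring the weights tensor out (e.g. with `transfer!`) rather than holding onto the layer objects is likely the safer pattern.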