Closed: leondavi closed this issue 12 months ago
Hi @leondavi ,
OpenNN neural network models support serialisation. This is done through the XML file format, using the following methods:
/// Saves to an XML file the members of a neural network object.
/// @param file_name Name of neural network XML file.
void NeuralNetwork::save(const string& file_name) const
/// Loads from an XML file the members for this neural network object.
/// Please mind about the file format, which is specified in the User's Guide.
/// @param file_name Name of neural network XML file.
void NeuralNetwork::load(const string& file_name)
You can also load and save the parameters via a binary file with the methods NeuralNetwork::save_parameters(const string& file_name) and NeuralNetwork::load_parameters_binary(const string& file_name).
Another way to export or save models is via the following methods:
string NeuralNetwork::write_expression() const;
string NeuralNetwork::write_expression_python() const;
string NeuralNetwork::write_expression_c() const;
string NeuralNetwork::write_expression_api() const;
string NeuralNetwork::write_expression_javascript() const;
void NeuralNetwork::save_expression_c(const string&) const;
void NeuralNetwork::save_expression_python(const string&) const;
void NeuralNetwork::save_expression_api(const string&) const;
void NeuralNetwork::save_expression_javascript(const string&) const;
However, loading these exported expressions back into OpenNN is not implemented.
I hope all this solves your question, thanks for your comment and interest in the OpenNN library.
Does OpenNN support saving and loading of models, and if so, which formats are supported?