innat opened 10 months ago
```python
import scipy.sparse

# get_weights() returns a list of numpy arrays (one per variable),
# so convert each 2D array individually rather than the whole list at once
sparse_weights = [scipy.sparse.csr_matrix(w) for w in my_sum.get_weights()]
print(sparse_weights)
```

What about obtaining the weights?
This might look invalid, since I'm using the torch backend and yet calling tf.saved_model.save
to get a SavedModel. But I was hoping Keras would do some magic here :D
@innat So do I. 👍
My understanding is that if you want to "switch backends" like this, the only way is to save the model as .keras and reload it after enabling the other backend. This assumes that all custom layers are implemented with Keras ops, not directly in one of the backends.
+1 to what lbortolotti said above: use the .keras format to switch backends.
Exporting from PyTorch to SavedModel has been done elsewhere though: https://github.com/pytorch/xla/blob/r2.1/docs/stablehlo.md#convert-saved-stablehlo-for-serving
We might want to explore that to implement the model.export() functionality for the PyTorch backend.
With the torch backend, a Keras model can't be saved in the SavedModel format. Is that expected? If so, and I develop my code with the torch backend but later want to convert to the SavedModel format, I have to make sure the code is runnable on both backends (an extra cost). The same problem occurs in reverse: trying to load a SavedModel with any backend other than tensorflow won't work either.