Closed cui-jia-hua closed 3 years ago
Corresponding Java code:
@cui-jia-hua Your model was saved with NumpyMode.OFF, which was the default behavior in MXNet 1.5.0; we expect new models to all be in numpy mode. MXNet is supposed to handle the conversion on model loading, so it seems there is a bug on the MXNet side.
DJL by default sets NumpyMode.GLOBAL (this should work for all models saved with MXNet 1.7.0+).
However, I was able to verify this by adding the following at the beginning of the code:
Engine engine = Engine.getInstance(); // Make sure the engine is loaded first, otherwise the flag will be lost
JnaUtils.setNumpyMode(JnaUtils.NumpyMode.OFF); // Match the NumpyMode the model was saved with
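For context, a minimal sketch of how the two lines above fit into a DJL program (the class name and the placement are illustrative assumptions; only the two workaround lines come from this thread):

```java
import ai.djl.engine.Engine;
import ai.djl.mxnet.jna.JnaUtils;

public class NumpyModeWorkaround {
    public static void main(String[] args) {
        // Load the MXNet engine FIRST. If the numpy-mode flag is set before
        // the native library is initialized, engine startup overwrites it
        // and the workaround is silently lost.
        Engine engine = Engine.getInstance();

        // The model in this thread was saved with NumpyMode.OFF (the MXNet
        // 1.5.0 default), so switch DJL's default (NumpyMode.GLOBAL) off
        // to match before loading the model.
        JnaUtils.setNumpyMode(JnaUtils.NumpyMode.OFF);

        // ... load the model and run inference as usual ...
    }
}
```

The key point is ordering: `Engine.getInstance()` must run before `JnaUtils.setNumpyMode(...)`, as the comment in the original snippet notes.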
Thank you! I added this code and got the correct results. This really helped me a lot.
I want to train a time series forecasting model with gluonts and run inference in djl, but I get errors like:
Here is my code, model, and data: https://github.com/cui-jia-hua/djlTest The model (DeepAR) was trained in Python with gluonts. It seems that some shape is incompatible, but when I load this model and run inference in Gluon, it works without any error message.
By the way, there are six NDArrays in the testdata file; their shapes are (32,1), (32,1), (32,745,5), (32,745), (32,745), and (32,24,5). I checked my MXNet version and it is 1.7.0, so I chose the 1.7.0-b version of mxnet-native-auto.
Is there anything I missed when loading the model with djl?