GantMan / nsfw_model

Keras model of NSFW detector

Trying to cache the loaded model in Django #139

Open had6r3 opened 1 year ago

had6r3 commented 1 year ago

In Django, if I set the loaded model in cache:

```python
model = predict.load_model('path\model.h5')
cache.set('model', model)
```

When I try to retrieve it:

```python
cache.get('model')
```

I get this error:

```
Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://cf101767-7409-47ba-8f98-0921cc47a20a/variables/variables
You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.
```

Is it possible to load the model once and keep using it in the Django webapp?
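A likely cause: Django's cache backends serialize values (typically with pickle), and a loaded Keras model pickles as a reference to an in-memory `ram://…` SavedModel that no longer exists when the value is read back, which would explain the `TensorSliceReader` error. One common workaround is to skip the Django cache entirely and keep the model in process memory with a lazy, load-once accessor. A minimal sketch (the `load_model` call and path are stand-ins for the `predict.load_model('path\model.h5')` call from the issue):

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def get_model(path='path/model.h5'):
    """Load the model once per process; later calls return the cached object."""
    # In the real app this would be something like:
    #     from nsfw_detector import predict   # adjust to your actual import
    #     return predict.load_model(path)
    # Stand-in loader so this sketch is self-contained and runnable:
    return {"loaded_from": path}

m1 = get_model()
m2 = get_model()
assert m1 is m2  # the expensive load happened only once in this process
```

Views would call `get_model()` instead of `cache.get('model')`. Note that each worker process (e.g. each Gunicorn worker) loads its own copy, since the model lives in process memory rather than a shared cache.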