niederle opened this issue 2 years ago
I'm not sure what's going on here, I haven't seen this error before. You're running the same notebook in the repo?
FileNotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://3bbf5d07-6468-48ed-be1d-e056e006589b/variables/variables
You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.
It sounds like a low-level tensorflow issue with finding a variable that is supposed to be stored in RAM. Do you get the same error when you try saving other tensorflow models?
Yes, I ran the notebook from the repo.
If I run the example code for saving a TensorFlow model, I do not get any error. There are also no problems if I run the example code to save a Keras model. Is that the way to test?
@jonathan-conder-sm Thanks! I could save the model with a pkl file. However, a warning was raised: `WARNING:tensorflow:No training configuration found in save file, so the model was not compiled. Compile it manually.`
Then I load the file:

```python
load_embedder = load_ParametricUMAP("load file name")
additional_embedding = load_embedder.transform(test_images)
```
raise ValueError(f'Input {input_index} of layer "{layer_name}" is ' ValueError: Input 0 of layer "sequential" is incompatible with the layer: expected shape=(None, 28, 28, 1), found shape=(1000, 784)
How can I solve this?
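The error says the reloaded network expects image-shaped input `(None, 28, 28, 1)`, while the data being passed in is flat `(1000, 784)`. A likely fix (a sketch, assuming the standard MNIST layout; `test_images` and `load_embedder` are stand-ins for the variables above) is to reshape before calling `transform`:

```python
import numpy as np

# Stand-in for the flattened MNIST test set from the notebook.
test_images = np.random.rand(1000, 784).astype("float32")

# Restore the (height, width, channels) layout the network was trained on.
test_images_4d = test_images.reshape(-1, 28, 28, 1)
print(test_images_4d.shape)  # (1000, 28, 28, 1)

# additional_embedding = load_embedder.transform(test_images_4d)  # then transform as before
```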
Does anyone know how to save and load the base model?
The ParametricUMAP module is useless without this functionality.
I found a quick fix for this issue if you're experiencing the "FileNotFoundError: Unsuccessful TensorSliceReader constructor: ... ".
The error indicates that Keras models shouldn't be pickled and instead should be saved in a supported format like .keras, .h5, etc. The issue arises when it's trying to pickle the whole ParametricUMAP (embedder) object. However, the encoder/decoder and the parametric umap models should be saved without issues.
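As an illustration of that point (a sketch, not code from the thread — the real encoder comes from the ParametricUMAP object), a Keras network can be saved and reloaded directly in a supported native format instead of being pickled:

```python
import tensorflow as tf

# Stand-in for ParametricUMAP's encoder: any Keras model works the same way.
encoder = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2),
])

# Save in a supported format (.keras) rather than pickling the Keras object.
encoder.save("encoder.keras")
reloaded = tf.keras.models.load_model("encoder.keras")

pred = reloaded(tf.zeros((1, 28, 28, 1)))
print(pred.shape)  # (1, 2)
```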
The `should_pickle` function doesn't account for the `FileNotFoundError` exception, which in turn results in the script crashing in the try block without the exception being caught. Here is a slightly modified version of the function:
```python
import umap
import pickle, codecs
import tensorflow as tf
from numba import TypingError
from warnings import warn


def should_pickle(key, val):
    """
    Checks if a dictionary item can be pickled

    Parameters
    ----------
    key : str
        key for dictionary element
    val : object
        element of dictionary

    Returns
    -------
    picklable : bool
        whether the dictionary item can be pickled
    """
    print(f"KEY TO PICKLE: {key}")
    try:
        ## make sure object can be pickled and then re-read
        # pickle object
        pickled = codecs.encode(pickle.dumps(val), "base64").decode()
        # unpickle object
        unpickled = pickle.loads(codecs.decode(pickled.encode(), "base64"))
    except (
        pickle.PicklingError,
        tf.errors.InvalidArgumentError,
        TypeError,
        tf.errors.InternalError,
        tf.errors.NotFoundError,
        OverflowError,
        TypingError,
        AttributeError,
    ) as e:
        warn("Did not pickle {}: {}".format(key, e))
        return False
    except ValueError as e:
        warn(f"Failed at pickling {key}:{val} due to {e}")
        return False
    except FileNotFoundError as e:
        warn(f"Failed at pickling {key}:{val} due to {e}")
        return False
    return True
```
Then override the function as follows:
```python
umap.parametric_umap.should_pickle = should_pickle
```
Now you should be able to save the model as follows:
```python
embedder.save(save_location=filePath)
```
When you load the model back using `umap.parametric_umap.load_ParametricUMAP()`, you should be able to transform new samples. One thing that did not get pickled is the weights of the parametric UMAP's optimizer. However, you don't need them for transforming new samples, as the embedding has already been fit.
Hello,
I played around with the parametric umap going through the mnist notebook.
When running

```python
embedder.save('/path/to/my/model')
```

I get this output, followed by the following error message:
I am running the code on a Windows 10 machine in a conda environment consisting of the following packages:
I have no idea what is going wrong or whether I can do anything to solve the issue. I would be happy about any feedback, and I hope this is the right place to ask for help.