Closed: dmpetrov closed this issue 4 months ago.
CC @amritghimire
Note: this code uses a dynamic schema.
At this point, with all these issues, we should probably plan to move away from Pydantic if possible, in my opinion. I am not sure how to identify, during the parsing stage, whether a dynamic object is a Feature class or not.
cc @dtulga Were you able to figure out any alternatives to make these pickle-friendly?
If that helps - all dynamic Feature classes are registered in the cache in feature_utils.py:

feature_cache: dict[type[BaseModel], type[Feature]] = {}
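If it helps to see the shape of the problem, here is a rough sketch (my assumption, not the actual feature_utils.py code, assuming Pydantic v2's create_model) of how a dynamic class might land in that cache. The point is that the class object only exists at runtime and is not importable by module/qualname, which is exactly what pickle relies on:

```python
# Illustrative sketch only -- not the real feature_utils.py implementation.
from pydantic import BaseModel, create_model


class Feature(BaseModel):  # stand-in for the real Feature base class
    pass


feature_cache: dict[type[BaseModel], type[Feature]] = {}


def pydantic_to_feature(model: type[BaseModel]) -> type[Feature]:
    """Create (or reuse) a Feature subclass mirroring a user's Pydantic model."""
    if model in feature_cache:
        return feature_cache[model]

    fields = {
        name: (f.annotation, ... if f.is_required() else f.default)
        for name, f in model.model_fields.items()
    }
    # The class object is created at runtime, so pickle cannot look it up
    # by module/qualname -- this is what makes it pickle-unfriendly.
    feature_cls = create_model(model.__name__, __base__=Feature, **fields)
    feature_cache[model] = feature_cls
    return feature_cls
```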
> @dtulga Were you able to figure out any alternatives to make these pickle-friendly?

That would be great if we could find a solution.
Also, I was thinking... we could try re-creating all dynamic Feature classes in each thread/process. I'll try this over the weekend.
> we could try re-creating all dynamic Feature classes in each thread/process. I'll try this over the weekend.
Well... re-creating the objects should not be a problem. However, it's not clear how to delete the objects before deserialization 😅 Cleaning up the cache is not enough - the objects can be created in user code.
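One possible shape of the re-creation idea, as a minimal sketch (the helper names are mine, assuming Pydantic v2): never pickle the class object itself, only a plain description of it, and rebuild the class inside each worker before the UDF runs. Nested dynamic Features in the annotations would need the same treatment recursively.

```python
# Hypothetical helpers sketching the "re-create per process" idea;
# these are not existing APIs.
from typing import Any

from pydantic import create_model


def describe_feature(feature_cls: type) -> tuple[str, dict[str, Any]]:
    """Reduce a dynamic Feature class to picklable data: its name and fields."""
    fields = {
        name: (f.annotation, ... if f.is_required() else f.default)
        for name, f in feature_cls.model_fields.items()
    }
    # Caveat: annotations that are themselves dynamic Feature classes would
    # still be unpicklable and need the same treatment.
    return feature_cls.__name__, fields


def rebuild_feature(name: str, fields: dict[str, Any], base: type) -> type:
    """Called inside the worker process to re-create the class before use."""
    return create_model(name, __base__=base, **fields)
```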
Is there a way to tell pickle/dill not to serialize particular types of objects (subclasses of Feature)?

> Is there a way to tell pickle/dill not to serialize particular types of objects (subclasses of Feature)?
I don't think it is possible implicitly, but we can detect Feature subclasses ourselves and make sure they are not pickled when forming the udf_info that is passed around in pickled form.
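A minimal sketch of that approach using the stdlib pickle API (dill may offer other hooks): a custom Pickler whose reducer_override intercepts Feature subclasses in udf_info and replaces them with a rebuild instruction. Here Feature stands for the project's base class, and describe_feature / rebuild_feature are the hypothetical helpers from the sketch above.

```python
import io
import pickle


class UDFInfoPickler(pickle.Pickler):
    """Pickler that never serializes dynamic Feature classes by reference."""

    def reducer_override(self, obj):
        # Intercept only Feature subclasses; everything else falls back to
        # normal pickling.
        if isinstance(obj, type) and issubclass(obj, Feature) and obj is not Feature:
            name, fields = describe_feature(obj)
            # On unpickling, rebuild_feature(name, fields, Feature) is called
            # instead of looking the class up by module/qualname.
            return rebuild_feature, (name, fields, Feature)
        return NotImplemented


def dumps_udf_info(udf_info) -> bytes:
    buf = io.BytesIO()
    UDFInfoPickler(buf, protocol=pickle.HIGHEST_PROTOCOL).dump(udf_info)
    return buf.getvalue()
```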
Closed by #45
Description
The code below fails in parallel mode (it works with no parallelism).
We had this issue several times 🙁 - iterative/dvcx#1620
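For illustration only (the original repro and traceback are not reproduced here): the underlying problem is that classes created at runtime cannot be pickled by reference, which is what passing them to worker processes requires.

```python
import pickle

from pydantic import create_model


def make_feature_class(name: str):
    # Created at runtime, like the dynamic Feature classes.
    return create_model(name, title=(str, ...), value=(int, 0))


cls = make_feature_class("UserRow")

try:
    pickle.dumps(cls)  # pickle tries to find __main__.UserRow and fails
except pickle.PicklingError as exc:
    print(exc)
```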
Error:
Version Info