Hi! I want to use a PyTorch model in a distributed environment, but many classes in DJL do not implement the Serializable interface.
Subclassing them and implementing Serializable pulls in many other classes that still cannot be serialized, so the whole process is very cumbersome. Is there an easier way to do this?
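For context, the usual workaround for non-serializable objects in Spark closures (not DJL-specific; all names below are illustrative stand-ins, not real DJL classes) is to wrap the object in a Serializable holder whose field is `transient` and lazily re-created after deserialization, so the model itself never crosses the wire. A minimal sketch of the pattern:

```java
import java.io.*;

public class TransientHolderDemo {
    // Stand-in for a non-serializable class such as ai.djl.pytorch.engine.PtModel (illustrative)
    static class FakeModel {
        String predict(String input) { return "prediction for " + input; }
    }

    // Serializable wrapper: only the wrapper is serialized, never the model
    static class ModelHolder implements Serializable {
        private static final long serialVersionUID = 1L;
        private transient FakeModel model; // skipped by Java serialization

        FakeModel get() {
            if (model == null) {
                model = new FakeModel(); // rebuilt lazily on the worker after deserialization
            }
            return model;
        }
    }

    public static void main(String[] args) throws Exception {
        ModelHolder holder = new ModelHolder();
        holder.get(); // initialize on the "driver" side

        // Round-trip through Java serialization, as Spark's closure serializer would
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(holder);
        }
        ModelHolder copy;
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            copy = (ModelHolder) ois.readObject();
        }
        // The transient model is re-created lazily on the deserialized copy
        System.out.println(copy.get().predict("x"));
    }
}
```

In Spark this holder would typically be referenced from `mapPartitions`, so each executor loads the model once per partition instead of the driver trying to ship it.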
Error example:

```
Caused by: java.io.NotSerializableException: ai.djl.pytorch.engine.PtModel
Serialization stack:
	- writeReplace data (class: java.lang.invoke.SerializedLambda)
	- object (class org.apache.spark.graphx.impl.VertexRDDImpl$$Lambda$1501/1769042905, org.apache.spark.graphx.impl.VertexRDDImpl$$Lambda$1501/1769042905@7fb53256)
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:41)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
	at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:413)
	... 13 more
```
Thanks!