Open-EO / openeo-geotrellis-extensions

Java/Scala extensions for Geotrellis, for use with the OpenEO GeoPySpark backend.
Apache License 2.0

exception: saving GTiff with colormap on datacube with time dimension #103

Closed. jdries closed this issue 1 year ago.

jdries commented 1 year ago
Traceback (most recent call last):
  File "/opt/openeo/lib64/python3.8/site-packages/openeogeotrellis/deploy/batch_job.py", line 354, in main
    run_driver()
  File "/opt/openeo/lib64/python3.8/site-packages/openeogeotrellis/deploy/batch_job.py", line 325, in run_driver
    run_job(
  File "/opt/openeo/lib/python3.8/site-packages/openeogeotrellis/utils.py", line 48, in memory_logging_wrapper
    return function(*args, **kwargs)
  File "/opt/openeo/lib64/python3.8/site-packages/openeogeotrellis/deploy/batch_job.py", line 437, in run_job
    assets_metadata = result.write_assets(str(output_file))
  File "/opt/openeo/lib/python3.8/site-packages/openeo_driver/save_result.py", line 111, in write_assets
    return self.cube.write_assets(filename=directory, format=self.format, format_options=self.options)
  File "/opt/openeo/lib/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py", line 1592, in write_assets
    timestamped_paths = get_jvm().org.openeo.geotrellis.geotiff.package.saveRDDTemporal(
  File "/usr/local/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__
    return_value = get_return_value(
  File "/usr/local/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.openeo.geotrellis.geotiff.package.saveRDDTemporal.
: org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:416)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:406)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:2477)
        at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:422)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
        at org.apache.spark.rdd.RDD.map(RDD.scala:421)
        at org.openeo.geotrellis.geotiff.package$.saveRDDTemporal(package.scala:137)
        at org.openeo.geotrellis.geotiff.package.saveRDDTemporal(package.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:282)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
        at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
        at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.io.NotSerializableException: scala.collection.immutable.MapLike$$anon$2
Serialization stack:
        - object not serializable (class: scala.collection.immutable.MapLike$$anon$2, value: Map(0.0 -> -1362630401, 5.0 -> 2139062271, 10.0 -> -1805140481, 14.0 -> -996371201, 1.0 -> -702076673, 6.0 -> -8450305, 9.0 -> -1128455425, 13.0 -> 748694783, 2.0 -> -139013377, 17.0 -> -6777089, 12.0 -> -4491009, 7.0 -> 398381055, 3.0 -> -606368257, 18.0 -> 527938815, 16.0 -> -1940501505, 11.0 -> -478690561, 8.0 -> -1730180353, 4.0 -> -943208449, 15.0 -> -978266625))
        - field (class: geotrellis.raster.render.BreakMap, name: breakMap, type: interface scala.collection.immutable.Map)
        - object (class geotrellis.raster.render.BreakMap$mcDI$sp, <function1>)
        - field (class: geotrellis.raster.render.DoubleColorMap, name: breakMap, type: class geotrellis.raster.render.BreakMap)
        - object (class geotrellis.raster.render.DoubleColorMap, geotrellis.raster.render.DoubleColorMap@786fca4a)
        - field (class: scala.Some, name: value, type: class java.lang.Object)
        - object (class scala.Some, Some(geotrellis.raster.render.DoubleColorMap@786fca4a))
        - field (class: org.openeo.geotrellis.geotiff.GTiffOptions, name: colorMap, type: class scala.Option)
        - object (class org.openeo.geotrellis.geotiff.GTiffOptions, org.openeo.geotrellis.geotiff.GTiffOptions@66be9aa5)
        - element of array (index: 7)
        - array (class [Ljava.lang.Object;, size 8)
        - field (class: java.lang.invoke.SerializedLambda, name: capturedArgs, type: class [Ljava.lang.Object;)
        - object (class java.lang.invoke.SerializedLambda, SerializedLambda[capturingClass=class org.openeo.geotrellis.geotiff.package$, functionalInterfaceMethod=scala/Function1.apply:(Ljava/lang/Object;)Ljava/lang/Object;, implementation=invokeStatic org/openeo/geotrellis/geotiff/package$.$anonfun$saveRDDTemporal$4:(ILjava/lang/String;Lgeotrellis/raster/GridBounds;Lgeotrellis/vector/Extent;Lorg/apache/spark/rdd/RDD;Lgeotrellis/raster/TileLayout;Lgeotrellis/raster/io/geotiff/compression/DeflateCompression;Lorg/openeo/geotrellis/geotiff/GTiffOptions;Lscala/Tuple2;)Lscala/Tuple3;, instantiatedMethodType=(Lscala/Tuple2;)Lscala/Tuple3;, numCaptured=8])
        - writeReplace data (class: java.lang.invoke.SerializedLambda)
        - object (class org.openeo.geotrellis.geotiff.package$$$Lambda$3358/0x0000000841363840, org.openeo.geotrellis.geotiff.package$$$Lambda$3358/0x0000000841363840@67ce2ff0)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:41)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
        at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:413)
        ... 22 more
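
For context: the serialization stack above shows that the closure built in saveRDDTemporal captures a GTiffOptions whose colorMap ultimately wraps a scala.collection.immutable.MapLike$$anon$2. That anonymous class is the lazy view that Scala 2.12's mapValues (or a similar view such as filterKeys) returns, and it does not implement java.io.Serializable even when the underlying Map does, which is exactly what makes the Spark task closure fail. A common remedy is to force such a view into a concrete Map with .toMap before it is captured in a closure. The snippet below is a minimal, self-contained sketch of that failure mode, not code from this repository; the object name, the isJavaSerializable helper and the sample break values are purely illustrative.

    import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

    object MapValuesSerializationSketch {
      // Java serialization is what Spark's ClosureCleaner / JavaSerializer uses
      // when it checks a task closure, so this mimics the failing code path.
      def isJavaSerializable(obj: AnyRef): Boolean =
        try {
          val out = new ObjectOutputStream(new ByteArrayOutputStream())
          out.writeObject(obj)
          out.close()
          true
        } catch {
          case _: NotSerializableException => false
        }

      def main(args: Array[String]): Unit = {
        // Illustrative value -> RGBA-int breaks, in the spirit of the Map shown in the trace.
        val breaks: Map[Double, Int] = Map(0.0 -> -1362630401, 5.0 -> 2139062271)

        // In Scala 2.12, mapValues returns a lazy view over the original Map that
        // is NOT java.io.Serializable, so capturing it in a Spark closure fails
        // with "Task not serializable".
        val lazyView = breaks.mapValues(_ | 0xFF)
        println(s"mapValues view serializable: ${isJavaSerializable(lazyView)}")       // false

        // Forcing the view into a concrete immutable Map restores serializability.
        val materialised = breaks.mapValues(_ | 0xFF).toMap
        println(s"materialised map serializable: ${isJavaSerializable(materialised)}") // true
      }
    }

The sketch targets Scala 2.12 (the version this class name points at); in Scala 2.13 mapValues on Map is deprecated in favour of an explicit .view.mapValues, so the behaviour differs there.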
jdries commented 1 year ago

I think this was a duplicate