DHI-GRAS / worldwater-toolbox

WorldWater Toolbox
GNU General Public License v3.0

3_WorldWaterToolbox_BU.ipynb #7

Closed: AlessandroScremin closed this issue 1 year ago

AlessandroScremin commented 1 year ago

I tried running 3_WorldWaterToolbox_BU.ipynb on the openEO JupyterLab.

Unfortunately, it fails at the cell "C) Mask Terrain Shadow In Progress".

The hillshade package (an open-source package maintained by DHI) is installed on the platform.

I received the following error. Could you please point me to a possible solution?

Your batch job 'j-78fe0c18db55451fa6fe03c642d2643b' failed. Error logs:

```
2023-05-08T14:38:29.353Z  error  Error communicating with MapOutputTracker
2023-05-08T14:38:30.284Z  error  Error communicating with MapOutputTracker
2023-05-08T14:38:30.334Z  error  Error communicating with MapOutputTracker
2023-05-08T14:39:40.646Z  error  Task 14 in stage 58.0 failed 4 times; aborting job
2023-05-08T14:39:40.652Z  error  Stage error: Job aborted due to stage failure: Task 14 in stage 58.0 failed 4 times,
most recent failure: Lost task 14.3 in stage 58.0 (TID 666) (epod166.vgt.vito.be executor 37):
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/opt/spark3_4_0/python/lib/pyspark.zip/pyspark/worker.py", line 830, in main
    process()
  File "/opt/spark3_4_0/python/lib/pyspark.zip/pyspark/worker.py", line 822, in process
    serializer.dump_stream(out_iter, outfile)
  File "/opt/spark3_4_0/python/lib/pyspark.zip/pyspark/serializers.py", line 146, in dump_stream
    for obj in iterator:
  File "/opt/spark3_4_0/python/lib/pyspark.zip/pyspark/util.py", line 81, in wrapper
    return f(*args, **kwargs)
  File "/opt/venv/lib64/python3.8/site-packages/openeogeotrellis/utils.py", line 50, in memory_logging_wrapper
    return function(*args, **kwargs)
  File "/opt/venv/lib64/python3.8/site-packages/epsel.py", line 44, in wrapper
    return _FUNCTION_POINTERS[key](*args, **kwargs)
  File "/opt/venv/lib64/python3.8/site-packages/epsel.py", line 37, in first_time
    return f(*args, **kwargs)
  File "/opt/venv/lib64/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py", line 707, in tile_function
    result_data = run_udf_code(code=udf_code, data=data)
  File "/opt/venv/lib64/python3.8/site-packages/openeogeotrellis/udf.py", line 20, in run_udf_code
    return openeo.udf.run_udf_code(code=code, data=data)
  File "/opt/venv/lib64/python3.8/site-packages/openeo/udf/run_code.py", line 175, in run_udf_code
    result_cube = func(data.get_datacube_list()[0], data.user_context)
  File "<string>", line 59, in apply_datacube
  File "<string>", line 40, in _run_shader
  File "/opt/venv/lib64/python3.8/site-packages/hillshade/hillshade.py", line 45, in hillshade
    raise ValueError("xy-direction is not rasterized.")
ValueError: xy-direction is not rasterized.

	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:561)
	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:767)
	at org.apache.spark.api.python.PythonRunner$$anon$3.read(PythonRunner.scala:749)
	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:514)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:513)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:101)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
	at org.apache.spark.scheduler.Task.run(Task.scala:139)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)

Driver stacktrace:

2023-05-08T14:39:41.004Z  error  OpenEO batch job failed: UDF Exception during Spark execution:
  File "/opt/venv/lib64/python3.8/site-packages/openeo/udf/run_code.py", line 175, in run_udf_code
    result_cube = func(data.get_datacube_list()[0], data.user_context)
  File "<string>", line 59, in apply_datacube
  File "<string>", line 40, in _run_shader
  File "/opt/venv/lib64/python3.8/site-packages/hillshade/hillshade.py", line 45, in hillshade
    raise ValueError("xy-direction is not rasterized.")
ValueError: xy-direction is not rasterized.

2023-05-08T14:40:19.984Z  error  YARN application status reports error diagnostics: User application exited with status 1
```
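Once the Spark and YARN noise is stripped away, the underlying failure is the `ValueError: xy-direction is not rasterized.` raised at hillshade/hillshade.py line 45, inside the terrain-shadow UDF (apply_datacube -> _run_shader -> hillshade). As a minimal sketch of how I isolated that message from the job logs with the openEO Python client (the backend URL below is an assumption, not taken from the notebook):

```python
# Sketch only, not part of the notebook: fetch the logs of the failed batch job
# and print the error entries that contain the underlying Python exception.
import openeo

# Assumed backend URL; replace with the backend the notebook actually connects to.
connection = openeo.connect("https://openeo.cloud").authenticate_oidc()

# Job id taken from the failure report above.
job = connection.job("j-78fe0c18db55451fa6fe03c642d2643b")

for entry in job.logs():
    if entry.get("level") == "error" and "ValueError" in entry.get("message", ""):
        # The relevant line in the output is:
        #   ValueError: xy-direction is not rasterized.
        # raised from hillshade/hillshade.py inside the terrain-shadow UDF.
        print(entry.get("time"), entry["message"])
```

This at least confirms the failure comes from the hillshade UDF itself rather than from the openEO back-end.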

sulova commented 1 year ago

Now, it should work 👍