JenniferYingyiWu2020 closed this issue 3 years ago
It's hard to do anything with this without a repeatable test case. If you are able to create an integration test or some other automated mechanism to reproduce this, reopen this and associate with a PR exemplifying the problem.
Hi @metasim, I have resolved the above issue. Thanks!
Hi, I have tried to execute the commands `spark.read.raster('/vsihdfs/hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif')` and `rf.select(rf_crs("proj_raster").alias("value")).first()`; however, the errors below appear:

```
[1 of 1000] FAILURE(3) CPLE_OpenFailed(4) "Open failed."
/vsihdfs/hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif: No such file or directory
[2 of 1000] FAILURE(3) CPLE_OpenFailed(4) "Open failed."
/vsihdfs/hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif: No such file or directory
21/03/24 09:26:38 ERROR Executor: Exception in task 61.0 in stage 9.0 (TID 201)
java.lang.IllegalArgumentException: Error fetching data for one of: GDALRasterSource(/vsihdfs/hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif)
Caused by: geotrellis.raster.gdal.MalformedDataException: Unable to construct a RasterExtent from the Transformation given. GDAL Error Code: 4
```
The file does exist on HDFS:

```
(base) hduser_@jenniferwu-OptiPlex-7070:~$ hdfs dfs -ls hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif
-rw-r--r--   1 geotrellis supergroup    6712825 2021-03-23 14:17 hdfs://192.168.101.201:9000/Jennifer_hadoop/Yunyao_Data_Set/split_20200613clip/B1.tif
```

Lastly, my Python code to set HADOOP_USER is the following:
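(The original snippet was not included in the thread. A minimal sketch of one common way to do this, assuming the standard `HADOOP_USER_NAME` environment variable is what is meant by HADOOP_USER, and assuming the user name `geotrellis` from the HDFS listing above:)

```python
import os

# When Kerberos is not enabled, Hadoop's client libraries take the
# effective user from the HADOOP_USER_NAME environment variable.
# It must be set before the SparkSession (and its JVM) is started,
# since the value is read once at client initialization.
os.environ["HADOOP_USER_NAME"] = "geotrellis"  # assumed user, matching the file owner

print(os.environ["HADOOP_USER_NAME"])
```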