OSGeo / gdal

GDAL is an open source MIT licensed translator library for raster and vector geospatial data formats.
https://gdal.org

Reading HDFS files results in JVM fatal error #1333

Closed qfjiang1993 closed 5 years ago

qfjiang1993 commented 5 years ago

Using GDAL to read local files works fine

I built GDAL 2.4.0 with HDFS support and the Java bindings. When I use the GDAL Java API to read a GeoTIFF file on the local filesystem, /home/jiang/sampleRGB.tif, it works fine, and I get the file's attributes as shown below:

GeoTIFF format

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /home/jiang/sampleRGB.tif
Driver: GTiff/GeoTIFF
/home/jiang/sampleRGB.tif
Size: 4550, 3886
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433],AUTHORITY["EPSG","4326"]]
Origin = (114.151033, 30.102782)
PixelSize = (0.000038, -0.000038)
Metadata:
        AREA_OR_POINT=Area
        TIFFTAG_XRESOLUTION=1
        TIFFTAG_YRESOLUTION=1
IMAGE_STRUCTURE Metadata:
        INTERLEAVE=PIXEL
Band 1 Block=4550x1 Type=1, ColorInterpretation=3
        Description =
        NoData Value=null
Band 2 Block=4550x1 Type=1, ColorInterpretation=4
        Description =
        NoData Value=null
Band 3 Block=4550x1 Type=1, ColorInterpretation=5
        Description =
        NoData Value=null

AAIGrid & HFA formats

Then I tested an Arc/Info ASCII Grid (.asc) file and an Erdas Imagine Images (.img) file; both work fine. The output was:

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /home/jiang/srtm_59_06.asc
Driver: AAIGrid/Arc/Info ASCII Grid
/home/jiang/srtm_59_06.asc
/home/jiang/srtm_59_06.prj
Size: 6001, 6001
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]
Origin = (109.999584, 35.000417)
PixelSize = (0.000833, -0.000833)
Band 1 Block=6001x1 Type=5, ColorInterpretation=0
        Description =
        NoData Value=-9999.0

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /home/jiang/srtm_59_06.img
Driver: HFA/Erdas Imagine Images (.img)
/home/jiang/srtm_59_06.img
Size: 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],TOWGS84[0,0,0,-0,-0,-0,0],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]
Origin = (110.000000, 35.000000)
PixelSize = (0.000833, -0.000833)
Metadata:
        AREA_OR_POINT=Area
Band 1 Block=64x64 Type=3, ColorInterpretation=0
        Description = Layer_1
        NoData Value=-32768.0
        IMAGE_STRUCTURE Metadata:
                COMPRESSION=RLE
        Metadata:
                LAYER_TYPE=athematic

Reading HDFS files results in a JVM fatal error

Here is the problem: when I read a file from HDFS, the program still reads the file's attributes properly, but when it exits, a JVM fatal error occurs:

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Driver: GTiff/GeoTIFF
/vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Size: 4550, 3886
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433],AUTHORITY["EPSG","4326"]]
Origin = (114.151033, 30.102782)
PixelSize = (0.000038, -0.000038)
Metadata:
        AREA_OR_POINT=Area
        TIFFTAG_XRESOLUTION=1
        TIFFTAG_YRESOLUTION=1
IMAGE_STRUCTURE Metadata:
        INTERLEAVE=PIXEL
Band 1 Block=4550x1 Type=1, ColorInterpretation=3
        Description =
        NoData Value=null
Band 2 Block=4550x1 Type=1, ColorInterpretation=4
        Description =
        NoData Value=null
Band 3 Block=4550x1 Type=1, ColorInterpretation=5
        Description =
        NoData Value=null
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007ff933cd2f26, pid=11566, tid=0x00007ff935269700
#
# JRE version: OpenJDK Runtime Environment (8.0_191-b12) (build 1.8.0_191-b12)
# Java VM: OpenJDK 64-Bit Server VM (25.191-b12 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# V  [libjvm.so+0x88af26]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/jiang/demo/hs_err_pid11566.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
#
Aborted (core dumped)

I also tried the .asc and .img formats. The .asc format shows the same problem as GeoTIFF, and the .img format throws ERROR 3: VSIFReadL() failed in HFAEntry::LoadData(). Output:

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.asc
Driver: AAIGrid/Arc/Info ASCII Grid
/vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.asc
/vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.prj
Size: 6001, 6001
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]
Origin = (109.999584, 35.000417)
PixelSize = (0.000833, -0.000833)
DERIVED_SUBDATASETS Metadata:
        DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:/vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.asc
        DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from /vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.asc
Band 1 Block=6001x1 Type=5, ColorInterpretation=0
        Description =
        NoData Value=-9999.0
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007ff38b6bff26, pid=15085, tid=0x00007ff38cc56700
#
# JRE version: OpenJDK Runtime Environment (8.0_191-b12) (build 1.8.0_191-b12)
# Java VM: OpenJDK 64-Bit Server VM (25.191-b12 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# V  [libjvm.so+0x88af26]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/jiang/demo/hs_err_pid15085.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
#
Aborted (core dumped)

[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /vsihdfs/hdfs://lake-geo:8020/raster/srtm_59_06.img
ERROR 3: VSIFReadL() failed in HFAEntry::LoadData().
GDAL open failed - 3: VSIFReadL() failed in HFAEntry::LoadData().

My Java Program

Here is my RasterDemo code:

import org.gdal.gdal.Band;
import org.gdal.gdal.Dataset;
import org.gdal.gdal.Driver;
import org.gdal.gdal.gdal;
import org.gdal.gdalconst.gdalconstConstants;

import java.util.Vector;

/**
 * @author QFJiang on 2018/10/25 10:22
 */
public class RasterDemo {

    public static void main(String[] args) {

        gdal.AllRegister();
        Dataset dataset = gdal.Open(args[0], gdalconstConstants.GA_ReadOnly);
        if (dataset == null) {
            System.err.println(String.format("GDAL open failed - %s: %s", gdal.GetLastErrorNo(), gdal.GetLastErrorMsg()));
            System.exit(1);
        }
        Driver driver = dataset.GetDriver();
        System.out.println("Driver: " + driver.getShortName() + "/" + driver.getLongName());
        dataset.GetFileList().forEach(System.out::println);
        int xSize = dataset.getRasterXSize();
        int ySize = dataset.getRasterYSize();
        System.out.println(String.format("Size: %d, %d", xSize, ySize));

        System.out.println("Coordinate System is:");
        System.out.println(dataset.GetProjection());
//        System.out.println(dataset.GetProjectionRef());

        double[] geoTrans = dataset.GetGeoTransform();
        System.out.println(String.format("Origin = (%f, %f)", geoTrans[0], geoTrans[3]));
        System.out.println(String.format("PixelSize = (%f, %f)", geoTrans[1], geoTrans[5]));

        // Metadata
        Vector<String> domainList = dataset.GetMetadataDomainList();
        for (String domain : domainList) {
            if ("".equals(domain))
                System.out.println("Metadata:");
            else
                System.out.println(domain + " Metadata:");
            dataset.GetMetadata_List(domain).forEach(v -> System.out.println("\t" + v));
        }

        int count = dataset.getRasterCount();
        for (int i = 0; i < count; i++) {
            Band band = dataset.GetRasterBand(i + 1);
            System.out.println(String.format("Band %d Block=%dx%d Type=%d, ColorInterpretation=%d",
                    i + 1, band.GetBlockXSize(), band.GetBlockYSize(), band.getDataType(), band.GetColorInterpretation()));
            System.out.println("\tDescription = " + band.GetDescription());
            Double[] nodataValue = new Double[1];
            band.GetNoDataValue(nodataValue);
            System.out.println(String.format("\tNoData Value=%s", nodataValue[0]));

            Vector<String> domains = band.GetMetadataDomainList();
            for (String domain : domains) {
                if ("".equals(domain))
                    System.out.println("\tMetadata:");
                else
                    System.out.println("\t" + domain + " Metadata:");
                band.GetMetadata_List(domain).forEach(v -> System.out.println("\t\t" + v));
            }
        }
    }
}

Operating system & GDAL version

[jiang@lake-geo demo]$ gdalinfo --version
GDAL 2.4.0, released 2018/12/14

[jiang@lake-geo demo]$ hadoop version
Hadoop 2.7.7
Subversion Unknown -r c1aad84bd27cd79c3d1a7dd58202a8c3ee1ed3ac
Compiled by stevel on 2018-07-18T22:47Z
Compiled with protoc 2.5.0
From source with checksum 792e15d20b12c74bd6f19a1fb886490
This command was run using /opt/hadoop-2.7.7/share/hadoop/common/hadoop-common-2.7.7.jar

[jiang@lake-geo demo]$ java -version
openjdk version "1.8.0_191"
OpenJDK Runtime Environment (build 1.8.0_191-b12)
OpenJDK 64-Bit Server VM (build 25.191-b12, mixed mode)

[jiang@lake-geo demo]$ cat /etc/redhat-release
CentOS Linux release 7.6.1810 (Core)

More questions

Oracle JDK

I switched to Oracle JDK 1.8.0_191 to test my code; it still produced the same results.

Another observation

At the end of main, I let the thread sleep for 10 s. The program works fine during the sleep, and only when main exits does the JVM fatal error occur! So I wonder whether I am not closing the dataset correctly, and whether that is what causes the JVM fatal error.

rouault commented 5 years ago

Maybe try to add dataset.delete() at the end of your main. This is normally not needed, but there might be a bad interaction with libhdfs if the closing is done at JVM termination.
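
A minimal sketch of that suggestion applied to the RasterDemo main above (the try/finally structure is illustrative, not part of the original program):

Dataset dataset = gdal.Open(args[0], gdalconstConstants.GA_ReadOnly);
if (dataset == null)
    System.exit(1);
try {
    // ... existing RasterDemo logic ...
} finally {
    // Release the native dataset handle explicitly rather than leaving
    // it to be closed during JVM termination.
    dataset.delete();
}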

qfjiang1993 commented 5 years ago

Maybe try to add dataset.delete() at the end of your main. This is normally not needed, but there might be a bad interaction with libhdfs if the closing is done at JVM termination.

I tried, but it did not work; same result.

rouault commented 5 years ago

Perhaps @jamesmcclain has some clues

jamesmcclain commented 5 years ago

Perhaps @jamesmcclain has some clues

I will try to reproduce this weekend.

qfjiang1993 commented 5 years ago

Perhaps @jamesmcclain has some clues

I will try to reproduce this weekend.

By the way, I built GDAL 2.4.0 as follows:

./configure --with-geos=yes --with-proj=yes --with-python=yes --with-java=/usr/lib/jvm/java-1.8.0-openjdk/ --with-hdfs=/opt/hadoop-2.7.7/
make
make install

and here are my environment variables:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib:$HADOOP_HOME/lib/native:$JAVA_HOME/jre/lib/amd64/server
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/geos-3.7.1/lib:/opt/proj-5.2.0/lib:/opt/gdal-2.4.0/lib:/opt/gdal-2.4.0/lib64

Finally, I tested vsihdfs with the gdalinfo utility; results as follows:

## file stored in HDFS
[jiang@lake-geo demo]$ gdalinfo /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsGetPathInfo(hdfs://lake-geo:8020/raster/sampleRGB.tif): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
ERROR 4: `/vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.
gdalinfo failed - unable to open '/vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif'.

## local file accessed through HDFS
[jiang@lake-geo demo]$ gdalinfo /vsihdfs/file:/home/jiang/sampleRGB.tif
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsGetPathInfo(file:/home/jiang/sampleRGB.tif): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
ERROR 4: `/vsihdfs/file:/home/jiang/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.
gdalinfo failed - unable to open '/vsihdfs/file:/home/jiang/sampleRGB.tif'.

Or is something wrong with Hadoop?

qfjiang1993 commented 5 years ago

Perhaps @jamesmcclain has some clues

I will try to reproduce this weekend.

After seeing the vsihdfs support in https://github.com/OSGeo/gdal/pull/714, I changed my $CLASSPATH by adding the --glob parameter:

## previous
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath):$CLASSPATH

## now
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob):$CLASSPATH

Then I succeeded in reading a file stored in HDFS with the gdalinfo utility (the --glob flag expands the classpath wildcards into concrete jar paths, which libhdfs apparently cannot do itself), but it does not work for a local file accessed through HDFS:

[jiang@lake-geo demo]$ gdalinfo /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Driver: GTiff/GeoTIFF
Files: /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Size is 4550, 3886
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (114.151033273395370,30.102782374104656)
Pixel Size = (0.000037769784173,-0.000037769784173)
Metadata:
  AREA_OR_POINT=Area
  TIFFTAG_XRESOLUTION=1
  TIFFTAG_YRESOLUTION=1
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  ( 114.1510333,  30.1027824) (114d 9' 3.72"E, 30d 6'10.02"N)
Lower Left  ( 114.1510333,  29.9560090) (114d 9' 3.72"E, 29d57'21.63"N)
Upper Right ( 114.3228858,  30.1027824) (114d19'22.39"E, 30d 6'10.02"N)
Lower Right ( 114.3228858,  29.9560090) (114d19'22.39"E, 29d57'21.63"N)
Center      ( 114.2369595,  30.0293957) (114d14'13.05"E, 30d 1'45.82"N)
Band 1 Block=4550x1 Type=Byte, ColorInterp=Red
Band 2 Block=4550x1 Type=Byte, ColorInterp=Green
Band 3 Block=4550x1 Type=Byte, ColorInterp=Blue

## not work for local file accessed through HDFS
[jiang@lake-geo demo]$ gdalinfo /vsihdfs/file:/home/jiang/sampleRGB.tif
hdfsGetPathInfo(file:/home/jiang/sampleRGB.tif): getFileInfo error:
java.lang.IllegalArgumentException: Wrong FS: file:/home/jiang/sampleRGB.tif, expected: hdfs://lake-geo:8020
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:647)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:194)
        at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425)
ERROR 4: `/vsihdfs/file:/home/jiang/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.
gdalinfo failed - unable to open '/vsihdfs/file:/home/jiang/sampleRGB.tif'.

Finally, when testing my Java code, it still results in a JVM fatal error:

## Java API access file stored in HDFS
[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Driver: GTiff/GeoTIFF
/vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Size: 4550, 3886
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433],AUTHORITY["EPSG","4326"]]
Origin = (114.151033, 30.102782)
PixelSize = (0.000038, -0.000038)
Metadata:
        AREA_OR_POINT=Area
        TIFFTAG_XRESOLUTION=1
        TIFFTAG_YRESOLUTION=1
IMAGE_STRUCTURE Metadata:
        INTERLEAVE=PIXEL
DERIVED_SUBDATASETS Metadata:
        DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:/vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
        DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from /vsihdfs/hdfs://lake-geo:8020/raster/sampleRGB.tif
Band 1 Block=4550x1 Type=1, ColorInterpretation=3
        Description =
        NoData Value=null
Band 2 Block=4550x1 Type=1, ColorInterpretation=4
        Description =
        NoData Value=null
Band 3 Block=4550x1 Type=1, ColorInterpretation=5
        Description =
        NoData Value=null
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007f73d38daf26, pid=73230, tid=0x00007f73d4e71700
#
# JRE version: OpenJDK Runtime Environment (8.0_191-b12) (build 1.8.0_191-b12)
# Java VM: OpenJDK 64-Bit Server VM (25.191-b12 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# V  [libjvm.so+0x88af26]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/jiang/demo/hs_err_pid73230.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
#
Aborted (core dumped)

## Java API access local file through HDFS
[jiang@lake-geo demo]$ java -cp .:gdal.jar:$CLASSPATH RasterDemo /vsihdfs/file:/home/jiang/sampleRGB.tif
hdfsGetPathInfo(file:/home/jiang/sampleRGB.tif): getFileInfo error:
java.lang.IllegalArgumentException: Wrong FS: file:/home/jiang/sampleRGB.tif, expected: hdfs://lake-geo:8020
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:647)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:194)
        at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425)
        at org.gdal.gdal.gdalJNI.Open__SWIG_0(Native Method)
        at org.gdal.gdal.gdal.Open(gdal.java:662)
        at RasterDemo.main(RasterDemo.java:17)
ERROR 4: `/vsihdfs/file:/home/jiang/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.
GDAL open failed - 4: `/vsihdfs/file:/home/jiang/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.

jamesmcclain commented 5 years ago

@qfjiang1993 I am investigating now; I hope to have something to say shortly.

jamesmcclain commented 5 years ago

I was able to run the demo successfully.

Script started on 2019-03-02 09:51:19-05:00
ubuntu@vsihdfs:~/Desktop$ java -cp .:/tmp/gdal.jar:$CLASSPATH RasterDemo '/vsihdfs/file:/home/ubuntu/Downloads/c41078a1.tif'
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/ubuntu/local/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Driver: GTiff/GeoTIFF
/vsihdfs/file:/home/ubuntu/Downloads/c41078a1.tif
Size: 7202, 5593
Coordinate System is:
PROJCS["WGS 84 / UTM zone 17N",GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",0],PARAMETER["central_meridian",-81],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","32617"]]
Origin = (576496.823428, 4660453.588271)
PixelSize = (25.400001, -25.400001)
Metadata:
    AREA_OR_POINT=Area
    TIFFTAG_RESOLUTIONUNIT=2 (pixels/inch)
    TIFFTAG_XRESOLUTION=72
    TIFFTAG_YRESOLUTION=72
IMAGE_STRUCTURE Metadata:
    COMPRESSION=PACKBITS
    INTERLEAVE=BAND
DERIVED_SUBDATASETS Metadata:
    DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:/vsihdfs/file:/home/ubuntu/Downloads/c41078a1.tif
    DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from /vsihdfs/file:/home/ubuntu/Downloads/c41078a1.tif
Band 1 Block=7202x1 Type=1, ColorInterpretation=2
    Description = 
    NoData Value=null
ubuntu@vsihdfs:~/Desktop$ exit

Script done on 2019-03-02 09:51:46-05:00

Regarding the problem with gdalinfo that you reported above, it looks like that version does not have vsihdfs capability compiled into it, so there may be an older (or another) version of GDAL on your path.
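
A quick way to check is to confirm which binary is on the path and whether it links libhdfs (a sketch; the ldd test assumes GDAL was linked dynamically against libhdfs):

## confirm which gdalinfo is being picked up, and its version
which gdalinfo
gdalinfo --version

## if GDAL was built with HDFS support, libhdfs should appear here
ldd $(which gdalinfo) | grep hdfs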

My setup is the following: a clean Ubuntu 18.10 VM with OpenJDK 11, gcc 8.2.0, and Hadoop 2.9.2 (the first two from the package manager, the third from here). (I am compiling GDAL 2.4.0 from source.)

The test file that I used is this one from this directory. My next steps are to try a few more GeoTiffs from that same source (unless you can provide me with the exact one that you are using), then try OpenJDK 8 and/or Hadoop 2.7.7.

Update: I can report similar results with OpenJDK 11, Hadoop 2.7.7, and a variety of sample images from the source linked-to above. I will try OpenJDK 8 at some point in the near future.

qfjiang1993 commented 5 years ago

@jamesmcclain

I reproduced the problem with the following setup: a clean CentOS 7.6.1810 VM with OpenJDK 1.8.0 (via yum), GCC 4.8.5 (CentOS built-in), and Hadoop 2.7.7 (from a Hadoop mirror). I compiled GDAL 2.4.0 (http://download.osgeo.org/gdal/2.4.0/gdal-2.4.0.tar.gz) from source.

But it still did not work. Then I switched to OpenJDK 11 (via yum) and recompiled my Java code, and now it works! No more JVM fatal error. So maybe something is wrong with OpenJDK 1.8.0.

But there is another weird problem: it only works with files stored in HDFS, not with a local file accessed through HDFS (this may not matter, because when I use vsihdfs I won't be reading local files). This happens with both gdalinfo and the Java API, as follows:

## not work for local file accessed through HDFS
[jiang@lake-geo demo]$ gdalinfo /vsihdfs/file:/home/jiang/sampleRGB.tif
hdfsGetPathInfo(file:/home/jiang/sampleRGB.tif): getFileInfo error:
java.lang.IllegalArgumentException: Wrong FS: file:/home/jiang/sampleRGB.tif, expected: hdfs://lake-geo:8020
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:647)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:194)
        at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425)
ERROR 4: `/vsihdfs/file:/home/jiang/sampleRGB.tif' does not exist in the file system, and is not recognized as a supported dataset name.
gdalinfo failed - unable to open '/vsihdfs/file:/home/jiang/sampleRGB.tif'.

GDAL 2.4.0 worked fine with OpenJDK 11, but may crash the OpenJDK 1.8.0 JVM (in my case).

I tested the file (https://download.osgeo.org/geotiff/samples/usgs/c41078a1.tif) that you used; it works fine.

Finally, thanks for your generous reply!

jamesmcclain commented 5 years ago

Hello,

Yes, I was just going to follow up to say that I reproduced the problem with OpenJDK 8 using both Hadoop 2.7.7 and Hadoop 2.9.2.

I am not able to reproduce the issue with gdalinfo that you reported. When I run it on a sample file, I get this:

Script started on 2019-03-03 16:14:00-05:00
ubuntu@vsihdfs:~/Downloads$ gdalinfo '/vsihdfs/file:/home/ubuntu/Downloads/acea.tif'
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/ubuntu/local/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Driver: GTiff/GeoTIFF
Files: /vsihdfs/file:/home/ubuntu/Downloads/acea.tif
Size is 515, 515
Coordinate System is:
PROJCS["ACEA        E000",
    GEOGCS["NAD27",
        DATUM["North_American_Datum_1927",
            SPHEROID["Clarke 1866",6378206.4,294.9786982138982,
                AUTHORITY["EPSG","7008"]],
            AUTHORITY["EPSG","6267"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4267"]],
    PROJECTION["Albers_Conic_Equal_Area"],
    PARAMETER["standard_parallel_1",33.90363402777778],
    PARAMETER["standard_parallel_2",33.62529002777778],
    PARAMETER["latitude_of_center",33.76446202777777],
    PARAMETER["longitude_of_center",-117.4745428888889],
    PARAMETER["false_easting",0],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (-15411.615311534629654,15448.759177431464195)
Pixel Size = (60.000000000000000,-60.000000000000000)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (  -15411.615,   15448.759) (117d38'28.22"W, 33d54'13.08"N)
Lower Left  (  -15411.615,  -15451.241) (117d38'26.28"W, 33d37'30.15"N)
Upper Right (   15488.385,   15448.759) (117d18'25.50"W, 33d54'13.07"N)
Lower Right (   15488.385,  -15451.241) (117d18'27.45"W, 33d37'30.15"N)
Center      (  38.3846885,  -1.2408226) (117d28'26.86"W, 33d45'52.02"N)
Band 1 Block=515x15 Type=Byte, ColorInterp=Gray
ubuntu@vsihdfs:~/Downloads$ exit

Script done on 2019-03-03 16:14:21-05:00

(This is with OpenJDK 8 and Hadoop 2.9.2.)

Finally, thanks for your generous reply!

Thank you for your interest! Be well.

qfjiang1993 commented 5 years ago

@jamesmcclain

It was a problem with Hadoop's configuration mode. I had configured Hadoop as pseudo-distributed, so I had to access files with the hdfs protocol; the file protocol does not work in that mode. Correspondingly, after I switched Hadoop to standalone mode, the file protocol worked but hdfs did not.
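
For reference, the mode is governed by fs.defaultFS in Hadoop's core-site.xml; a sketch with illustrative values:

<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- pseudo-distributed mode: hdfs:// paths resolve, file:// does not -->
    <value>hdfs://lake-geo:8020</value>
    <!-- standalone mode defaults to the local filesystem instead,
         e.g. file:///, which reverses which protocol works -->
  </property>
</configuration>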

Thank you!

qfjiang1993 commented 5 years ago

I found another issue: the Erdas Imagine format does not work with the hdfs protocol, though it works for local files.

/vsihdfs/hdfs:/ does not work (pseudo-distributed mode, accessing a file in HDFS); maybe the driver gets something wrong: https://github.com/OSGeo/gdal/blob/master/gdal/frmts/hfa/hfaentry.cpp.

## my Java program
[jiang@lake-geo demo]$ java -cp .:$CLASSPATH:gdal.jar RasterDemo /vsihdfs/hdfs:/raster/srtm_59_06.img
ERROR 3: VSIFReadL() failed in HFAEntry::LoadData().
GDAL open failed - 3: VSIFReadL() failed in HFAEntry::LoadData().

## GDAL Java samples gdalinfo.java
[jiang@lake-geo apps]$ java -cp .:$CLASSPATH:~/gdal.jar:../classes gdalinfo /vsihdfs/hdfs:/raster/srtm_59_06.img
ERROR 3: VSIFReadL() failed in HFAEntry::LoadData().
GDALOpen failed - 3
VSIFReadL() failed in HFAEntry::LoadData().

/vsihdfs/file:/ works fine with both my Java program and gdalinfo (standalone mode, accessing a local file).

## my Java program
[jiang@lake-geo demo]$ java -cp .:$CLASSPATH:gdal.jar RasterDemo /vsihdfs/file:/home/jiang/srtm_59_06.img
Driver: HFA/Erdas Imagine Images (.img)
/vsihdfs/file:/home/jiang/srtm_59_06.img
Size: 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],TOWGS84[0,0,0,-0,-0,-0,0],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]
Origin = (110.000000, 35.000000)
PixelSize = (0.000833, -0.000833)
Metadata:
        AREA_OR_POINT=Area
DERIVED_SUBDATASETS Metadata:
        DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:/vsihdfs/file:/home/jiang/srtm_59_06.img
        DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from /vsihdfs/file:/home/jiang/srtm_59_06.img
Band 1 Block=64x64 Type=3, ColorInterpretation=0
        Description = Layer_1
        NoData Value=-32768.0
        IMAGE_STRUCTURE Metadata:
                COMPRESSION=RLE
        Metadata:
                LAYER_TYPE=athematic

## GDAL Java samples gdalinfo.java
[jiang@lake-geo apps]$ java -cp .:$CLASSPATH:~/gdal.jar:../classes gdalinfo /vsihdfs/file:/home/jiang/srtm_59_06.img
Driver: HFA/Erdas Imagine Images (.img)
Files: /vsihdfs/file:/home/jiang/srtm_59_06.img
Size is 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,-0,-0,-0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9122"]],
    AUTHORITY["EPSG","4326"]]
Origin = (110.0,35.0)
Pixel Size = (8.333333333333334E-4,-8.333333333333334E-4)
Metadata:
  AREA_OR_POINT=Area
Corner Coordinates:
Upper Left  (110.0,35.0) (110d 0' 0.00"E, 35d 0' 0.00"N)
Lower Left  (110.0,30.0) (110d 0' 0.00"E, 30d 0' 0.00"N)
Upper Right (115.0,35.0) (115d 0' 0.00"E, 35d 0' 0.00"N)
Lower Right (115.0,30.0) (115d 0' 0.00"E, 30d 0' 0.00"N)
Center      (112.5,32.5) (112d30' 0.00"E, 32d30' 0.00"N)
Band 1 Block=64x64 Type=Int16, ColorInterp=Undefined
  Description = Layer_1
  NoData Value=-32768.0
  Metadata:
    LAYER_TYPE=athematic
<GDALRasterAttributeTable
</GDALRasterAttributeTable>

jamesmcclain commented 5 years ago

I found another issue: the Erdas Imagine format does not work with the hdfs protocol, though it works for local files.

I don't have access to any such files. If you can provide me with one that exhibits the problem, I am willing to take a look at it.

qfjiang1993 commented 5 years ago

I found another issue: the Erdas Imagine format does not work with the hdfs protocol, though it works for local files.

I don't have access to any such files. If you can provide me with one that exhibits the problem, I am willing to take a look at it.

Here are my test files: https://drive.google.com/open?id=1AyTl6zGb8NwAxhDtnyqDLYcWATMnqQ8s

jamesmcclain commented 5 years ago

I found another issue: the Erdas Imagine format does not work with the hdfs protocol, though it works for local files.

I don't have access to any such files. If you can provide me with one that exhibits the problem, I am willing to take a look at it.

Here are my test files: https://drive.google.com/open?id=1AyTl6zGb8NwAxhDtnyqDLYcWATMnqQ8s

Okay, thank you for the data.

It may be a little while before I can look at this because I am busy with other things at work. I can almost promise that I will have looked at it by the end of the upcoming weekend though :smile:

jamesmcclain commented 5 years ago

I am unable to reproduce the problem:

Script started on 2019-03-05 09:44:33-05:00
ubuntu@vsihdfs:~/Desktop$ gdalinfo '/vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img'
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/ubuntu/local/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Driver: HFA/Erdas Imagine Images (.img)
Files: /vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img
Size is 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,-0,-0,-0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9122"]],
    AUTHORITY["EPSG","4326"]]
Origin = (110.000000000000000,35.000000000000000)
Pixel Size = (0.000833333333333,-0.000833333333333)
Metadata:
  AREA_OR_POINT=Area
Corner Coordinates:
Upper Left  ( 110.0000000,  35.0000000) (110d 0' 0.00"E, 35d 0' 0.00"N)
Lower Left  ( 110.0000000,  30.0000000) (110d 0' 0.00"E, 30d 0' 0.00"N)
Upper Right ( 115.0000000,  35.0000000) (115d 0' 0.00"E, 35d 0' 0.00"N)
Lower Right ( 115.0000000,  30.0000000) (115d 0' 0.00"E, 30d 0' 0.00"N)
Center      ( 112.5000000,  32.5000000) (112d30' 0.00"E, 32d30' 0.00"N)
Band 1 Block=64x64 Type=Int16, ColorInterp=Undefined
  Description = Layer_1
  NoData Value=-32768
  Metadata:
    LAYER_TYPE=athematic
  Image Structure Metadata:
    COMPRESSION=RLE

ubuntu@vsihdfs:~/Desktop$ java -cp .:gdal.jar:$CLASSPATH RasterDemo '/vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img'
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/ubuntu/local/hadoop-2.9.2/share/hadoop/common/lib/hadoop-auth-2.9.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Driver: HFA/Erdas Imagine Images (.img)
/vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img
Size: 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],TOWGS84[0,0,0,-0,-0,-0,0],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]
Origin = (110.000000, 35.000000)
PixelSize = (0.000833, -0.000833)
Metadata:
    AREA_OR_POINT=Area
DERIVED_SUBDATASETS Metadata:
    DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:/vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img
    DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from /vsihdfs/file:/home/ubuntu/Desktop/srtm_59_06.img
Band 1 Block=64x64 Type=3, ColorInterpretation=0
    Description = Layer_1
    NoData Value=-32768.0
    IMAGE_STRUCTURE Metadata:
        COMPRESSION=RLE
    Metadata:
        LAYER_TYPE=athematic
ubuntu@vsihdfs:~/Desktop$ exit

Script done on 2019-03-05 09:45:09-05:00

Setup: Ubuntu 18.10 VM, OpenJDK 11, Hadoop 2.9.2

jamesmcclain commented 5 years ago

I'm sorry, I misread your question above. You said that a local file was readable through HDFS but one on an HDFS filesystem is not? If that is the case, then I suspect that the HDFS setup may be the issue.

qfjiang1993 commented 5 years ago

I'm sorry, I misread your question above. You said that a local file was readable through HDFS but one on an HDFS filesystem is not? If that is the case, then I suspect that the HDFS setup may be the issue.

I don't think it's a problem with the HDFS setup, because I can read the following raster formats: GeoTIFF, ASCII Grid, Erdas Imagine, USGS ASCII DEM, et al. Even when I read an Erdas Imagine (.img) file with gdalinfo through /vsihdfs/hdfs:/, it still throws ERROR 3: VSIFReadL() failed in HFAEntry::LoadData(); the strange thing is that after the ERROR I can still read the attributes, as follows:

[jiang@lake-geo demo]$ gdalinfo /vsihdfs/hdfs:/raster/srtm_59_06.img
ERROR 3: VSIFReadL() failed in HFAEntry::LoadData().
Driver: HFA/Erdas Imagine Images (.img)
Files: /vsihdfs/hdfs:/raster/srtm_59_06.img
Size is 6000, 6000
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,-0,-0,-0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9122"]],
    AUTHORITY["EPSG","4326"]]
Origin = (110.000000000000000,35.000000000000000)
Pixel Size = (0.000833333333333,-0.000833333333333)
Metadata:
  AREA_OR_POINT=Area
Corner Coordinates:
Upper Left  ( 110.0000000,  35.0000000) (110d 0' 0.00"E, 35d 0' 0.00"N)
Lower Left  ( 110.0000000,  30.0000000) (110d 0' 0.00"E, 30d 0' 0.00"N)
Upper Right ( 115.0000000,  35.0000000) (115d 0' 0.00"E, 35d 0' 0.00"N)
Lower Right ( 115.0000000,  30.0000000) (115d 0' 0.00"E, 30d 0' 0.00"N)
Center      ( 112.5000000,  32.5000000) (112d30' 0.00"E, 32d30' 0.00"N)
Band 1 Block=64x64 Type=Int16, ColorInterp=Undefined
  Description = Layer_1
  NoData Value=-32768
  Metadata:
    LAYER_TYPE=athematic

But when using Java code, after ERROR 3 the dataset is null, and the program exits:

[jiang@lake-geo demo]$ java -cp .:$CLASSPATH:gdal.jar RasterDemo /vsihdfs/hdfs:/raster/srtm_59_06.img
ERROR 3: VSIFReadL() failed in HFAEntry::LoadData().
GDAL open failed - 3: VSIFReadL() failed in HFAEntry::LoadData().

I can also use both the vsicurl and vsiwebhdfs protocols to read the .img file:

## both worked
java -cp .:$CLASSPATH:gdal.jar RasterDemo /vsicurl/http://192.168.114.1:8080/data/srtm_59_06.img
java -cp .:$CLASSPATH:gdal.jar RasterDemo /vsiwebhdfs/http://lake-geo:50070/webhdfs/v1/raster/srtm_59_06.img

So I can replace /vsihdfs/hdfs:/ with /vsiwebhdfs/http:/ in my app and it works; I am just wondering about the ERROR 3: VSIFReadL() failed in HFAEntry::LoadData() issue. Thanks!

jamesmcclain commented 5 years ago

Okay, I will give this another look over the weekend.

jamesmcclain commented 5 years ago

Okay, problem confirmed.

@qfjiang1993: I put up this pull request to address it. This PR allows your test program to succeed with the data that you supplied. Thank you for reporting this issue.

rouault commented 5 years ago

PR #1349 merged and backported to 2.4 branch as well

lichenang commented 2 years ago

import org.gdal.gdal.Dataset;
import org.gdal.gdal.Driver;
import org.gdal.gdal.gdal;
import org.gdal.gdalconst.gdalconstConstants;

public class Tifovr2Jpg {
    public static void main(String[] args) {
        gdal.AllRegister();
        gdal.SetConfigOption("GDAL_FILENAME_IS_UTF8", "NO");
        System.out.println("Argument at index 0: " + args[0]);
        Dataset ds = gdal.Open(args[0], gdalconstConstants.GA_ReadOnly);
        System.out.println(ds);
        // check whether the dataset is non-null
        if (ds == null) {
            System.out.println("Dataset is null...");
            System.exit(1);
        }

        Driver driver = ds.GetDriver();
        System.out.println("driver: " + driver);
    }
}

root@Kylin:/home/gdalTest# java Tifovr2Jpg /vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif
Argument at index 0: /vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsGetPathInfo(hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
ERROR 4: `/vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif' does not exist in the file system, and is not recognized as a supported dataset name.
null
Dataset is null...
root@Kylin:/home/gdalTest# gdalinfo  /vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif
Driver: GTiff/GeoTIFF
Files: /vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif
Size is 11251, 7501
Coordinate System is `'
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (    0.0,    0.0)
Lower Left  (    0.0, 7501.0)
Upper Right (11251.0,    0.0)
Lower Right (11251.0, 7501.0)
Center      ( 5625.5, 3750.5)
Band 1 Block=11251x16 Type=Byte, ColorInterp=Red
Band 2 Block=11251x16 Type=Byte, ColorInterp=Green
Band 3 Block=11251x16 Type=Byte, ColorInterp=Blue

I generated files such as gdal.jar via make in GDAL's swig/java directory and placed them on the JDK's ext path. When running my Java project, the above error occurred; when accessing the file through gdalinfo, no problem appeared.

jamesmcclain commented 2 years ago

@lichenang Please see https://github.com/OSGeo/gdal/pull/714#issuecomment-1121266228
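
The NoClassDefFoundError from libhdfs above looks like the same classpath problem solved earlier in this thread; a sketch of that fix, using the paths from the comment above:

## expand Hadoop's classpath wildcards into concrete jar paths before launching the JVM
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob):$CLASSPATH
java -cp .:gdal.jar:$CLASSPATH Tifovr2Jpg /vsihdfs/hdfs://192.168.56.46:8020/home/chdh/rawdata/chdl/image/06490082/20220415000000/06490082.tif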