We ran into an interesting issue: reading a DBF file from disk works fine, but reading exactly the same file from Google Cloud Storage throws an "Unexpected end of file" exception. After some digging it turned out to be caused by DBFReader's dataInputStream calls like this:
bytesReaded = this.dataInputStream.read(b_array);
if (bytesReaded < field.getLength()) {
    throw new EOFException("Unexpected end of file");
}
It turns out that dataInputStream.read is not guaranteed to fill the entire buffer even when the underlying stream has more data coming; it only blocks until at least one byte is available. So with a higher-latency source like Google Cloud Storage, a single call can fill just part of the buffer. A better option here is the readFully method, which loops until the requested number of bytes has been read.
From DataInputStream source code:
public final void readFully(byte b[], int off, int len) throws IOException {
    if (len < 0)
        throw new IndexOutOfBoundsException();
    int n = 0;
    while (n < len) {
        int count = in.read(b, off + n, len - n);
        if (count < 0)
            throw new EOFException();
        n += count;
    }
}
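Applied to the reader, the fix is essentially swapping read for readFully, so EOFException is thrown only on a genuine end of stream. A minimal sketch of the safer pattern; readField and its parameters are illustrative stand-ins for the javadbf internals, not the actual library code:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ReadFieldFix {
    // Illustrative stand-in for the DBFReader code path: reads exactly
    // 'length' bytes for one field, regardless of how the underlying
    // stream chunks the data.
    static byte[] readField(DataInputStream dataInputStream, int length) throws IOException {
        byte[] b_array = new byte[length];
        // Unlike read(b_array), readFully loops internally until the buffer
        // is full, and throws EOFException only if the stream really ends
        // before 'length' bytes have arrived.
        dataInputStream.readFully(b_array);
        return b_array;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        DataInputStream dis = new DataInputStream(new ByteArrayInputStream(data));
        byte[] field = readField(dis, 10);
        System.out.println(field.length); // prints 10
    }
}
```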
See https://github.com/albfernandez/javadbf/blob/f2ac74527aa4276ae41127a3be6cdf6185b0acaf/src/main/java/com/linuxense/javadbf/DBFReader.java#L418 for an example of the affected call.