fsspec / gcsfs

Pythonic file-system interface for Google Cloud Storage
http://gcsfs.readthedocs.io/en/latest/
BSD 3-Clause "New" or "Revised" License

cat_file with start and end of gzipped file does not work. #512

Open racinmat opened 1 year ago

Reading gzipped file using transcoding works when you use the fs.open, but not when using fs.cat_file. Here is and example uploading 2 files, 1 plaintext, 1 gzipped, and both files are read using open, and then using cat_file:

This part works:

```python
import gcsfs

fs = gcsfs.GCSFileSystem(project='a')
a_file = 'same_path/a_test'
a_file_gz = 'same_path/a_test.gz'
with fs.open(a_file, 'wb') as f:
    f.write(b'abcd')
with fs.open(a_file_gz, 'wb', compression='gzip', fixed_key_metadata={'content_encoding': 'gzip'}) as f:
    f.write(b'abcd')
with fs.open(a_file, 'rb') as f:
    assert f.read() == b'abcd'
with fs.open(a_file_gz, 'rb') as f:
    assert f.read() == b'abcd'
assert bytes(fs.cat_file(a_file, 1, 3)) == b'bc'
```

This errors out:

```python
assert bytes(fs.cat_file(a_file_gz, 1, 3, fixed_key_metadata={'content_encoding': 'gzip'})) == b'bc'
assert bytes(fs.cat_file(a_file_gz, 1, 3)) == b'bc'
```

throwing this error:

```
self = <StreamReader e=ClientPayloadError("400, message='Can not decode content-encoding: gzip'")>
n = -1

    async def read(self, n: int = -1) -> bytes:
        if self._exception is not None:
>           raise self._exception
E           aiohttp.client_exceptions.ClientPayloadError: 400, message='Can not decode content-encoding: gzip'

C:\tools\miniconda3\envs\filesystem-py39\lib\site-packages\aiohttp\streams.py:349: ClientPayloadError
```

My guess is that it's because the header is not passed here: https://github.com/fsspec/gcsfs/blob/main/gcsfs/core.py#L859-L863

martindurant commented 1 year ago
```python
with fs.open(a_file_gz, 'wb', compression='gzip', fixed_key_metadata={'content_encoding': 'gzip'}) as f:
    f.write(b'abcd')
```

This is not correct. The content encoding is not the same as the MIME type, which would be "application/gzip". If you wanted to use content encoding like this, then the appropriate compression is actually none.
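A minimal sketch of what "compression is actually none" could look like, reusing the names from the snippet above (compressing the payload manually with the standard gzip module is an assumption here, not something gcsfs requires):

```python
# Sketch: store already-gzipped bytes with no fsspec-level compression,
# and only set the content_encoding metadata so GCS can transcode on download.
import gzip

import gcsfs

fs = gcsfs.GCSFileSystem(project='a')          # project name from the snippet above
payload = gzip.compress(b'abcd')               # compress the payload ourselves
with fs.open('same_path/a_test.gz', 'wb',
             fixed_key_metadata={'content_encoding': 'gzip'}) as f:
    f.write(payload)                           # raw gzip bytes; no compression= argument
```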

I don't exactly follow what your code snippet is trying to achieve: what behaviour are you after?

racinmat commented 1 year ago

I am trying to use gzip transcoding (https://cloud.google.com/storage/docs/transcoding). The documentation there literally says Content-Encoding: gzip.

The code I used properly encodes the data into gzip format; on Google Cloud it is stored gzip-compressed, and when I download it, it is automatically decompressed, as the documentation states. So I'm not sure why it's not correct when it's doing what it should. When I look at the object in the bucket browser, it shows the correct encoding.
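For the record, this can also be checked without the bucket browser; a sketch, assuming fs.info surfaces the GCS object resource's contentEncoding field:

```python
# Sketch: inspect the stored object's metadata programmatically.
# Assumes gcsfs exposes the JSON API object resource fields via info().
info = fs.info(a_file_gz)
print(info.get('contentEncoding'))  # expected 'gzip' for this object
```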

martindurant commented 1 year ago

Yes, but in that case, fsspec must not attempt to decompress it, because the transport library (aiohttp) should have done it already. Also note that the size reported for the file might be wrong.
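To make the size caveat concrete, a small sketch reusing the names above (it assumes fs.info reports the stored, i.e. compressed, size):

```python
# Sketch: metadata size is the stored (compressed) byte count, while a
# transcoded read yields the decompressed payload, so the two can differ.
info = fs.info(a_file_gz)
with fs.open(a_file_gz, 'rb') as f:
    data = f.read()                # transcoded: the transport decompresses
print(info['size'], len(data))     # stored (compressed) size vs. len(b'abcd')
```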

racinmat commented 1 year ago

I know, but AFAIK fsspec does not attempt to decompress it: the compression='gzip' is only on the 'wb' open, because GCP needs to receive the data already compressed and does not compress it on its own. During the 'rb' open there is no compression; I am just adding the header, because without it the code throws an error. And apparently it works, because fsspec correctly obtains the decompressed data.

racinmat commented 1 year ago

I found out that if I read the whole file, it works. The size of the gzipped file is 24 bytes.

```python
assert fs.read_range(a_file_gz, 0, 23) == b'abcd'
```

But when I read only a part, it does not work, and it looks like the server tried to decode it.

racinmat commented 1 year ago

I found the problem: it's in the headers. I can replicate the error with curl.

```bash
curl --location --request GET 'https://storage.googleapis.com/download/storage/v1/b/our-temp/o/tmp_bong%2Fa_test.gz?alt=media' \
--header '... \
--header 'Range: bytes=1-5' \
--header 'Accept-Encoding: gzip, deflate, br'
```

errors out, but when using only

```bash
curl --location --request GET 'https://storage.googleapis.com/download/storage/v1/b/our-temp/o/tmp_bong%2Fa_test.gz?alt=media' \
--header '... \
--header 'Range: bytes=1-5' \
--header 'Accept-Encoding: deflate, br'
```

without the gzip, it works and returns the whole contents, as described in the docs. And there is no way to pass a custom Accept-Encoding header to the underlying GET call.
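Given that, the only fallback I can see on the gcsfs side (a sketch, assuming a full cat_file download behaves like the full open/read above) is to fetch the whole transcoded content and slice client-side:

```python
# Sketch: ranged reads fail on transcoded objects, so download the whole
# decompressed content and take the byte range in Python instead.
data = fs.cat_file(a_file_gz)       # no start/end: full transcoded download
assert bytes(data)[1:3] == b'bc'
```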

martindurant commented 1 year ago

This is not unexpected. You can only get at specific offsets within the bytestream after decompression; this is a limitation of gzip. I expect the server is really returning the byte range you request out of the original compressed data, but that is no longer a valid gzip stream, and so it causes the error. If you save your data as gzip, you cannot expect random access to the uncompressed data.
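A quick local illustration of that limitation, independent of GCS (standard library only; the slice offsets are arbitrary):

```python
# Sketch: an interior byte slice of a gzip stream is not itself valid gzip,
# which is why a server-side Range over the compressed bytes cannot be decoded.
import gzip

blob = gzip.compress(b'abcd' * 1000)
try:
    gzip.decompress(blob[10:100])    # arbitrary slice from the middle
except gzip.BadGzipFile as exc:
    print('not a valid gzip stream:', exc)
```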

racinmat commented 1 year ago

The server decompresses the data and returns the whole file. The GCP documentation (the link I shared) states that the whole file contents are returned, decoded.

martindurant commented 1 year ago

Well, in the first place, it says that you shouldn't ever do this; in the second, that the header key will be ignored; and thirdly, we have found that the documentation is incorrect. I don't think there's anything gcsfs can do about this.