Azure / azure-storage-blob-go

Microsoft Azure Blob Storage Library for Go
MIT License

Panic when accessing blob with gzip encoding #189

Open vl-dev opened 4 years ago

vl-dev commented 4 years ago

Which version of the SDK was used?

0.10.0

Which platform are you using? (ex: Windows, Linux, Debian)

centos-release-7-8.2003.0.el7.centos.x86_64

What problem was encountered?

We have blobs with ContentEncoding set to gzip stored in an Azure Blob Storage container. When we try to download one with DownloadBlobToBuffer, we get the following panic:

panic: runtime error: slice bounds out of range [:-1]

goroutine 34 [running]:
github.com/Azure/azure-storage-blob-go/azblob.downloadBlobToBuffer.func1(0x0, 0xffffffffffffffff, 0x79ed40, 0xc00058b580, 0x1, 0x101)
        /home/centos/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.8.0/azblob/highlevel.go:225 +0x49f
github.com/Azure/azure-storage-blob-go/azblob.DoBatchTransfer.func2(0xc0003de540, 0xc000035fa8)
        /home/centos/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.8.0/azblob/highlevel.go:335 +0x50
github.com/Azure/azure-storage-blob-go/azblob.DoBatchTransfer.func1(0xc0003de540, 0xc0005b0c60)
        /home/centos/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.8.0/azblob/highlevel.go:319 +0x3f
created by github.com/Azure/azure-storage-blob-go/azblob.DoBatchTransfer
        /home/centos/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.8.0/azblob/highlevel.go:317 +0x21c

How can we reproduce the problem in the simplest way?

  1. Create a Storage Account on the Azure Portal
  2. Create a blob container in this account
  3. Upload a file with the ContentEncoding set to gzip (or change the encoding directly in the portal)
  4. Run the following code snippet:
    
    package main

    import (
        "context"
        "fmt"
        "log"
        "net/url"

        "github.com/Azure/azure-storage-blob-go/azblob"
    )

    func main() {
        containerName := ""
        accountKey := ""
        accountName := ""
        blobName := ""

        credential, err := azblob.NewSharedKeyCredential(accountName, accountKey)
        if err != nil {
            log.Fatal(err)
        }
        pipeline := azblob.NewPipeline(credential, azblob.PipelineOptions{})
        u, _ := url.Parse(fmt.Sprintf("https://%s.blob.core.windows.net", accountName))
        serviceURL := azblob.NewServiceURL(*u, pipeline)
        containerURL := serviceURL.NewContainerURL(containerName)
        blobURL := containerURL.NewBlobURL(blobName)
        opts := azblob.DownloadFromBlobOptions{}
        var b []byte
        if err := azblob.DownloadBlobToBuffer(context.Background(), blobURL, 0, 0, b, opts); err != nil {
            log.Fatal(err)
        }
    }


Have you found a mitigation/solution?
The only mitigation we have found is to leave `ContentEncoding` unset; the blobs then download without any issues.
serprex commented 3 years ago

Declaring `var b []byte` as `b := make([]byte, sizeOfFile)` fixes the issue. I find it odd that this buffer function doesn't take a byte buffer that could grow to fit the blob.

Note that the output has changed: since the io.WriterAt change, the call returns an "out of range" error instead of panicking.

If you make the buffer length larger than 0 but still smaller than the file, you'll get "Not enough space for all bytes".