Open mgarrod opened 1 year ago
I went back to previous versions. It works on ubuntu-full-3.4.0-amd64. It breaks somewhere between 3.4.0 and 3.5.0
I get this error only if not passing the AWS credentials. For example, the following works:
docker run --rm -it -v $HOME:$HOME -e AWS_SECRET_ACCESS_KEY -e AWS_ACCESS_KEY_ID ghcr.io/osgeo/gdal:ubuntu-small-3.6.4 gdal2tiles.py $PWD/autotest/gcore/data/byte.tif /vsis3/spatialys/outgdal2tiles
So are you sure you're providing credentials correctly in your tries with the 3.6.4 image? If so, how do you provide them?
Thanks for the reply. I was setting the AWS credentials using aws configure, as well as setting the env variable GDAL_CONFIG_FILE, which points to a file containing this:
[credentials]

[.private_bucket]
path=/vsis3/mybucketname
AWS_SECRET_ACCESS_KEY=
AWS_ACCESS_KEY_ID=
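For reference, a stdlib-only sketch of materializing such a credentials file and pointing GDAL_CONFIG_FILE at it (the bucket name is a placeholder, and the key values are left blank as in the comment above):

```python
import os
import tempfile

# Sketch only: write a GDAL credentials file of the shape quoted above.
# "mybucketname" is a placeholder; real key values would follow the '='.
content = """\
[credentials]

[.private_bucket]
path=/vsis3/mybucketname
AWS_SECRET_ACCESS_KEY=
AWS_ACCESS_KEY_ID=
"""

path = os.path.join(tempfile.mkdtemp(), "gdal_credentials.txt")
with open(path, "w") as f:
    f.write(content)

# GDAL reads this file on startup when GDAL_CONFIG_FILE is set.
os.environ["GDAL_CONFIG_FILE"] = path
```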
I made sure the AWS key id and secret have access to write to the S3 bucket by issuing the aws s3 mv command. When I try the example you posted, I get this:
Warning 1: HTTP response code on https://s3.amazonaws.com/mybucketname/test: 0
Warning 1: HTTP response code on https://s3.amazonaws.com/mybucketname/: 0
Traceback (most recent call last):
  File "/usr/bin/gdal2tiles.py", line 15, in <module>
I am supplying the AWS key id and secret with -e AWS_SECRET_ACCESS_KEY=mysecret -e AWS_ACCESS_KEY_ID=mykey.

> I was setting the AWS credentials using aws configure as well as setting the env variable GDAL_CONFIG_FILE
That works for me:
$ docker run --rm -it -v $HOME:$HOME -e GDAL_CONFIG_FILE=$PWD/test.txt ghcr.io/osgeo/gdal:ubuntu-small-3.6.4 gdal2tiles.py $PWD/autotest/gcore/data/byte.tif /vsis3/spatialys/outgdal2tiles
Generating Base Tiles:
0...10...20...30...40...50...60...70...80...90...100
I am still getting the same error. I tried using an admin account with full access to AWS, docker on an AWS EC2 instance, and docker on my Mac. I am able to generate tiles with the .tif file I am using if I write to a local directory. I'll keep plugging away to see if I can figure out what is going on.
Same issue here: it fails if the output "folder" does not exist on S3. Works with 3.4.0.
Expected behavior and actual behavior.
I am trying to use gdal2tiles.py to write tiles directly to an AWS S3 bucket from the latest linux/amd64 docker image, but I get the following error:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 4565, in <module>
    sys.exit(main(sys.argv))
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 4527, in main
    return submain(argv)
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 4544, in submain
    single_threaded_tiling(input_file, output_folder, options)
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 4385, in single_threaded_tiling
    conf, tile_details = worker_tile_details(input_file, output_folder, options)
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 4290, in worker_tile_details
    gdal2tiles.generate_metadata()
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 2551, in generate_metadata
    makedirs(self.output_folder)
  File "/usr/lib/python3/dist-packages/osgeo_utils/gdal2tiles.py", line 95, in makedirs
    if gdal.MkdirRecursive(path, 0o755) != 0:
  File "/usr/lib/python3/dist-packages/osgeo/gdal.py", line 2040, in MkdirRecursive
    return _gdal.MkdirRecursive(*args)
RuntimeError: unknown error occurred
Writing the tiles locally and then moving them to S3 works:
gdal2tiles.py --zoom=1 output.tif /tmp/test
aws s3 mv /tmp/test s3://mybucketname/test --recursive
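The two commands above can be wrapped in a short script. A stdlib-only sketch (the source file, zoom level, and bucket URI are placeholders; the tools are only invoked if they are on PATH):

```python
import shutil
import subprocess
import tempfile

def tile_then_upload(src, bucket_uri, zoom="1"):
    """Work around the /vsis3 failure: tile into a local temp
    directory, then move the result to S3 with the aws CLI."""
    tmp = tempfile.mkdtemp(prefix="tiles_")
    subprocess.run(["gdal2tiles.py", f"--zoom={zoom}", src, tmp], check=True)
    subprocess.run(["aws", "s3", "mv", tmp, bucket_uri, "--recursive"], check=True)

# Guard so the sketch is safe to run without the tools installed:
if shutil.which("gdal2tiles.py") and shutil.which("aws"):
    tile_then_upload("output.tif", "s3://mybucketname/test")
else:
    print("gdal2tiles.py and/or aws CLI not on PATH; skipping")
```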
Steps to reproduce the problem.
gdal2tiles.py --zoom=1 output.tif /vsis3/mybucketname/test
Operating system
linux/amd64 docker image:
docker pull ghcr.io/osgeo/gdal:ubuntu-small-3.6.4@sha256:2283c0b7b2b55a2fa4865293ba11e6cfe6fbee9a1bfa0592994882144767a75e
Ubuntu 22.04.2 LTS
GDAL version and provenance
gdalinfo --version
GDAL 3.6.4
python --version
Python 3.10.6