Running the sample code a while ago, I ended up with 2600 images on disk at 12-13 MB each, roughly 32 GB total. The hypothesis is that optionally downloading smaller versions of the images (requesting the thumb_2048_url field instead of thumb_original_url) would significantly reduce the compute/network resources brought to bear on the task while still being accurate enough for it.
I did try just swapping out the params as mentioned above for a quick test and got the following error, but if I had to bet I'd say it's because I'm doing something wrong. Hoping an expert can weigh in.
400 Client Error: Bad Request for url: https://graph.mapillary.com/images?access_token=--REDACTED--&fields=id%2Cthumb_2048_url%2Cgeometry&is_pano=true&bbox=-85.65528860431792%2C41.95015961908728%2C-85.65510860413792%2C41.950339619267275
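For reference, here is a minimal sketch of the request that fails above, built with only the standard library. The endpoint, field names, and query parameters are taken directly from the failing URL; the helper names and bounding-box layout (min_lon, min_lat, max_lon, max_lat) are my own assumptions. One thing worth noting when debugging a 400 from this endpoint: the response body usually carries a JSON payload explaining exactly what was rejected, so printing it (rather than just the status line) may reveal the actual cause.

```python
import json
import urllib.error
import urllib.parse
import urllib.request

API_URL = "https://graph.mapillary.com/images"


def build_image_query(access_token, bbox,
                      fields=("id", "thumb_2048_url", "geometry"),
                      is_pano=True):
    """Build the query string for the /images endpoint.

    Parameter names mirror the failing request URL above; bbox is
    assumed to be (min_lon, min_lat, max_lon, max_lat).
    """
    params = {
        "access_token": access_token,
        "fields": ",".join(fields),
        "is_pano": "true" if is_pano else "false",
        "bbox": ",".join(repr(c) for c in bbox),
    }
    return urllib.parse.urlencode(params)


def fetch_images(access_token, bbox):
    """Issue the request; on an HTTP error, print the response body,
    which typically contains the server's explanation of the 400."""
    url = API_URL + "?" + build_image_query(access_token, bbox)
    try:
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as err:
        print(err.code, err.read().decode())
        raise
```

Swapping `thumb_2048_url` back to `thumb_original_url` in the `fields` tuple reproduces the original (working) request, which makes it easy to confirm whether the field name itself is what triggers the 400.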