landsat-pds / landsat_ingestor

Scripts and other artifacts for landsat data ingestion into Amazon public hosting.
Apache License 2.0

Throttled downloads from usgs servers? #15

Closed: kapadia closed this issue 8 years ago

kapadia commented 8 years ago

A new error is being propagated from USGS servers. It appears that changes have been made on their end, and downloads are now being throttled.

If this is the case, we'll need to re-work certain areas of the landsat-ingestor to request no more than 10 download urls at a time. I'll investigate a little more to find out the extent of this new constraint.

usgs.USGSError: User currently has more than 10 downloads that have not been attempted in the past 10 minutes.
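If batching turns out to be the fix, something along these lines could work (rough sketch, not actual ingestor code; it leans on the usgs package's api.download(dataset, node, scene_ids, product) call, and dataset / node / product / scene_ids are placeholders):

from usgs import api

# Rough sketch: request download urls in batches of at most 10 so we never
# hold more than 10 un-attempted urls at once.
BATCH_SIZE = 10

def batched_download_urls(dataset, node, scene_ids, product):
    for i in range(0, len(scene_ids), BATCH_SIZE):
        batch = scene_ids[i:i + BATCH_SIZE]
        # one api.download call per batch of <= 10 scene ids
        yield api.download(dataset, node, batch, product)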

/cc @warmerdam @jedsundwall @camillacaros

kapadia commented 8 years ago

Well .. it looks like our account is severely throttled. This script reproduces the problem:

https://gist.github.com/kapadia/a414dca221c6c976d4c1

yielding the same error from our download stack:

Traceback (most recent call last):
  File "/usr/local/bin/usgs", line 9, in <module>
    load_entry_point('usgs==0.1.5', 'console_scripts', 'usgs')()
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/usgs/scripts/cli.py", line 198, in download_url
    data = api.download(dataset, node, scene_ids, product)
  File "/usr/local/lib/python2.7/dist-packages/usgs/api.py", line 139, in download
    _check_for_usgs_error(root)
  File "/usr/local/lib/python2.7/dist-packages/usgs/api.py", line 44, in _check_for_usgs_error
    raise USGSError(fault_string)
usgs.USGSError: User currently has more than 10 downloads that have not been attempted in the past 10 minutes.

There are still a few scenes that squeeze through this constraint.

Since the time limit threshold is stated as 10 minutes, I've temporarily paused the stack in an effort to clear this constraint. We'll likely still have to re-factor the landsat-ingestor to request fewer download urls, or act on them more rapidly.
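For the "act on them more rapidly" route, a rough sketch would be to request one url at a time and fetch it immediately, backing off on 503s so urls don't sit around un-attempted. The shape of api.download's return value is assumed here, and requests is just a stand-in for whatever the puller actually uses:

import time
import requests
from usgs import api

def fetch_scene(dataset, node, scene_id, product, retries=5):
    # Request a single download url and attempt it right away so it is
    # marked as accessed rather than counting toward the 10-url limit.
    response = api.download(dataset, node, [scene_id], product)
    url = response[0]  # assumed return shape; adjust to the actual usgs 0.1.5 response

    for attempt in range(retries):
        r = requests.get(url, stream=True)
        if r.status_code == 503:
            time.sleep(2 ** attempt)  # back off and retry the same url
            continue
        r.raise_for_status()
        return r

    raise RuntimeError("giving up after repeated 503s for %s" % scene_id)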

camillacaros commented 8 years ago

heyo @kapadia, thanks for the info

There are still a few scenes that squeeze through this constraint.

to clarify, does this mean that if > 10 downloads are attempted within the 10 minute window, the majority fail? ie only a few sneak through

I've temporarily paused the stack as an effort to clear this constraint.

sounds like this might exacerbate the issue if there will then be a greater backlog of imagery to download, no? trying to get a grasp on how the queue is working here (poking at the puller utils).

kapadia commented 8 years ago

Hiya @camillacaros -

I'm still trying to understand the exact meaning of the error posted above. My interpretation, based purely on the message, is that if more than 10 download urls have been requested AND not attempted, then USGS will return an error.

In its current state, the landsat-ingestor uses parallel to request 10 scenes at a time. Because we still experience a high number of 503s from USGS servers, we likely surpass the limit of untried downloads by a large margin.

Temporarily pausing the stack is an effort to clear out the 10 untried download constraint that USGS is now tracking.

warmerdam commented 8 years ago

@kapadia You might find that turning parallel's job count down to 5 or 6 mostly clears the "more than 10 outstanding download urls" error.

kapadia commented 8 years ago

@warmerdam Yes, that's one change to be made; however, (I think) the 503s that we continue to encounter also contribute to the limit. This would result in accumulating many download urls that have not been marked as accessed.

kapadia commented 8 years ago

To better understand the rate limiting, I hit USGS servers in various ways. The gist posted above helps confirm that a user cannot have more than 10 download urls that have not been accessed (e.g. you can't just hoard urls, you gotta use them). We crossed that line pretty quickly by running 10 concurrent downloads, while also continuing to get 503s.

After testing a few levels of concurrency, it seems best to stick with 2-3 concurrent downloads. This will have an impact on the job time, though the extent is not yet clear since the current job is working through the backlog.
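For reference, capping the concurrency could look something like this (just a sketch; fetch_scene and scene_ids stand in for the ingestor's own download function and work queue):

from multiprocessing.dummy import Pool  # thread-backed Pool, available on Python 2.7

# Sketch: cap concurrency at 3 so we stay well under the 10 un-attempted-url
# limit even when some requests 503 and need retries.
CONCURRENCY = 3

def run_downloads(fetch_scene, scene_ids):
    pool = Pool(CONCURRENCY)
    try:
        return pool.map(fetch_scene, scene_ids)
    finally:
        pool.close()
        pool.join()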

The stack is running again and does not show any signs of the rate limiting error or 503s.

camillacaros commented 8 years ago

(e.g. you can't just hoard urls, you gotta use them). We crossed that line pretty quickly by running 10 concurrent downloads, while also continuing to get 503s

ah interesting. thanks for the dig & fix @kapadia !

kapadia commented 8 years ago

np C-dawg.