multiply-org / sar-pre-processing

This is the MULTIPLY repository containing the functionality for SAR pre-processing.

HTTP 401 in Step 2 in Jupyter Notebook #11

Closed arthur-e closed 3 years ago

arthur-e commented 3 years ago

I am running the examples in the Jupyter Notebook at docs/notebooks/running_test_application.ipynb and encountering some problems.

One issue is with Step 2, where I am supposed to be able to download Sentinel-1 SLC data through the SentinelAPI. I have filled in my log-in credentials as follows:

user = 'my_user_name'
password = 'my_password'
# initialize the API client for the Copernicus Open Access Hub
api = SentinelAPI(user, password, 'https://scihub.copernicus.eu/apihub/')

I verified that the user and password above are correct by logging into the Copernicus Open Access Hub web application. When I execute the second box in Step 2, however, I get the following error message (and traceback):

---------------------------------------------------------------------------
SentinelAPIError                          Traceback (most recent call last)
<ipython-input-4-00be6bc9c533> in <module>
      1 # search by polygon (MNI test site coordinates), time, and SciHub query keywords
      2 footprint = geojson_to_wkt(read_geojson('coordinates_mni.geojson'))
----> 3 products = api.query(footprint,
      4                      date=('20210101', '20210120'),
      5                      platformname='Sentinel-1',

/usr/local/dev/sar-pre-processing/env/lib/python3.9/site-packages/sentinelsat/sentinel.py in query(self, area, date, raw, area_relation, order_by, limit, offset, **keywords)
    145                           order_by, limit, offset, query)
    146         formatted_order_by = _format_order_by(order_by)
--> 147         response, count = self._load_query(query, formatted_order_by, limit, offset)
    148         self.logger.info("Found %s products", count)
    149         return _parse_opensearch_response(response)

/usr/local/dev/sar-pre-processing/env/lib/python3.9/site-packages/sentinelsat/sentinel.py in _load_query(self, query, order_by, limit, offset)
    270 
    271     def _load_query(self, query, order_by=None, limit=None, offset=0):
--> 272         products, count = self._load_subquery(query, order_by, limit, offset)
    273 
    274         # repeat query until all results have been loaded

/usr/local/dev/sar-pre-processing/env/lib/python3.9/site-packages/sentinelsat/sentinel.py in _load_subquery(self, query, order_by, limit, offset)
    302                                      headers={'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'},
    303                                      timeout=self.timeout)
--> 304         _check_scihub_response(response)
    305 
    306         # store last status code (for testing)

/usr/local/dev/sar-pre-processing/env/lib/python3.9/site-packages/sentinelsat/sentinel.py in _check_scihub_response(response, test_json)
   1025         # See PEP 409
   1026         api_error.__cause__ = None
-> 1027         raise api_error
   1028 
   1029 

SentinelAPIError: HTTP status 401 Unauthorized: 
# HTTP Status 401 – Unauthorized

* * *

 **Type** Status Report

 **Message** Unauthorized

 **Description** The request has not been applied because it lacks valid
authentication credentials for the target resource.

* * *

### Apache Tomcat/8.0.53

Is there a missing step, e.g. the creation of an API key? Or is there an API version mismatch?
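For context, the footprint in the failing call is built by `geojson_to_wkt(read_geojson('coordinates_mni.geojson'))`, which just serializes the GeoJSON geometry to a WKT string. A minimal stdlib-only sketch of that conversion, with made-up coordinates (the real ones live in the GeoJSON file), looks like this:

```python
# Hypothetical illustration of what geojson_to_wkt does for a simple Polygon.
# The coordinates below are invented; they stand in for coordinates_mni.geojson.

def polygon_to_wkt(coords):
    """Convert a GeoJSON Polygon exterior ring [(lon, lat), ...] to a WKT string."""
    ring = ", ".join(f"{lon:.4f} {lat:.4f}" for lon, lat in coords)
    return f"POLYGON(({ring}))"

# A closed ring: the first and last vertex must coincide.
footprint = polygon_to_wkt([
    (11.70, 48.25),
    (11.75, 48.25),
    (11.75, 48.30),
    (11.70, 48.30),
    (11.70, 48.25),
])
print(footprint)
```

This is only for illustration: a malformed footprint would surface as a 400-series query error, not a 401, so the credentials and endpoint are the right place to look here.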

McWhity commented 3 years ago

It seems that the Copernicus Data Hub has changed the download link within the last few weeks. I will update the Jupyter notebook as well as the installed version of "sentinelsat" within the next hour. This should solve the problem.

McWhity commented 3 years ago

I updated the "sentinelsat" package and the Jupyter notebook. After reinstalling "sensarp", downloading Sentinel-1 data from the Copernicus Data Hub should work again.

arthur-e commented 3 years ago

Yes, I believe that has fixed the issue. Thanks! I am now prompted to download the files; I'll have to check later that the download completes (it is very large).