techmatters / terraso-product

Non-engineering tasks or tasks that have significance across repos in Terraso.

Legacy data portal: export of public data fails silently #1004

Closed. DerekCaelin closed this issue 2 months ago.

DerekCaelin commented 2 months ago

When trying to export data, users of the legacy data portal see a permanent loading wheel.

Steps to reproduce

  1. Go to http://portal.landpotential.org/#/landpksmap
  2. Under data export, select "LandInfo"
  3. In the popover window, type a valid date range (e.g., 07/01/2024 - 08/12/2024)
  4. Leave the email field blank
  5. Select "Export"
  6. See the loading spinner
  7. See issue; a scripted reproduction sketch follows this list
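
The following is a scripted sketch of the same export request. The endpoint URL and parameter names are assumptions based on the UI, not confirmed against the legacy portal's API; adjust them to match the request the export form actually sends.

# Hypothetical reproduction of the export request. EXPORT_URL and the
# parameter names below are assumptions, not the confirmed legacy API.
import requests

EXPORT_URL = "https://api.landpotential.org/export"  # assumed endpoint

params = {
    "type": "LandInfo",     # assumed name for the data-set selector
    "start": "2024-07-01",  # assumed date format
    "end": "2024-08-12",
    "email": "",            # left blank, as in the report
}

try:
    response = requests.get(EXPORT_URL, params=params, timeout=120)
    print(response.status_code, len(response.content), "bytes")
except requests.exceptions.Timeout:
    print("Request timed out -- matches the endless-spinner behavior")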

Expected behavior

After a few moments, the browser should initiate a download.

Actual behavior

The loading spinner spins indefinitely and no download starts.

Additional context

The above behavior is on Firefox. On Brave (Chromium-based), the loading wheel stops after about a minute and the user lands back on the map with no download.

If the user enters an email address that has no sites created in that range, the site quickly returns a valid CSV (a file with headers and no data rows).

DerekCaelin commented 2 months ago

This is high priority because the user is failing to achieve their goal AND the user who shared this with me is strategically important: the head of engineering for isda-africa reached out to ask for this data.

paulschreiber commented 2 months ago

This is not a mobile-client bug. It belongs in https://github.com/LandPotential/Portal2.0.

paulschreiber commented 2 months ago

I tried this in both Chrome and Firefox on macOS. I'm able to download a CSV file (but it only has a header row).

knipec commented 2 months ago

Triage notes:

paulschreiber commented 2 months ago

I tried downloading data from 2020, and got a 500 error corresponding to a timeout:

500 Internal Server Error
The server encountered an unexpected condition which prevented it from fulfilling the request.

Traceback (most recent call last):
  File "/base/data/home/apps/s~landpks-prediction-model/20220923t102808.446663324286033986/lib/cherrypy/_cprequest.py", line 670, in respond
    response.body = self.handler()
  File "/base/data/home/apps/s~landpks-prediction-model/20220923t102808.446663324286033986/lib/cherrypy/lib/encoding.py", line 217, in __call__
    self.body = self.oldhandler(*args, **kwargs)
  File "/base/data/home/apps/s~landpks-prediction-model/20220923t102808.446663324286033986/lib/cherrypy/_cpdispatch.py", line 61, in __call__
    return self.callable(*self.args, **self.kwargs)
  File "/base/data/home/apps/s~landpks-prediction-model/20220923t102808.446663324286033986/LandPKS_API_Main_Running_Cherry_Cherrypd.py", line 182, in export
    response = urllib2.urlopen(req, timeout=15000)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 429, in open
    response = self._open(req, data)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 447, in _open
    '_open', req)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 1228, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/urllib2.py", line 1201, in do_open
    r = h.getresponse(buffering=True)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_dist/lib/python2.7/gae_override/httplib.py", line 522, in getresponse
    **extra_kwargs)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_lib/versions/1/google/appengine/api/urlfetch.py", line 291, in fetch
    return rpc.get_result()
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 615, in get_result
    return self.__get_result_hook(self)
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_lib/versions/1/google/appengine/api/urlfetch.py", line 411, in _get_fetch_result
    rpc.check_success()
  File "/base/alloc/tmpfs/dynamic_runtimes/python27g/63bce1ea76665d81/python27/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 596, in check_success
    self.__rpc)
DeadlineExceededError: The overall deadline for responding to the HTTP request was exceeded.
Powered by CherryPy 3.7.0
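
The failing frame is the export handler's own outbound request (urllib2.urlopen in LandPKS_API_Main_Running_Cherry_Cherrypd.py, line 182), which App Engine routes through URLFetch; the error message suggests the App Engine request deadline expired before that fetch completed. Below is a minimal sketch of raising the URLFetch deadline in that handler, assuming the Python 2.7 App Engine runtime shown in the traceback; note that deadlines are capped (roughly 60 seconds for frontend requests), so this alone may not be enough for multi-year exports.

# Sketch only, for the Python 2.7 App Engine runtime shown above: raise the
# default URLFetch deadline used for the handler's outbound request. App
# Engine caps this (about 60 s for frontend requests), so very large exports
# may still need to be chunked or moved to a background task.
from google.appengine.api import urlfetch

urlfetch.set_default_fetch_deadline(60)  # seconds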

paulschreiber commented 2 months ago

I was able to download data from 2020 (for two weeks, 771 rows).

DerekCaelin commented 2 months ago

Paul:

Large export requests (e.g., 4 years) still fail. We can manually generate CSVs, or "we can bump up the time allotted for requests and/or do manual exports."
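
A minimal sketch of the manual-export approach, under the same assumptions as the earlier snippet about the endpoint and parameter names: split a multi-year range into short windows that each finish within the server's deadline, then stitch the resulting CSVs together.

# Sketch of a chunked manual export. EXPORT_URL and the parameter names are
# assumptions, not the confirmed legacy API; window_days should be small
# enough that each request completes within the server's deadline.
from datetime import date, timedelta

import requests

EXPORT_URL = "https://api.landpotential.org/export"  # assumed endpoint

def export_range(start, end, window_days=14, out_path="export.csv"):
    header_written = False
    with open(out_path, "w", newline="") as out:
        cursor = start
        while cursor <= end:
            window_end = min(cursor + timedelta(days=window_days - 1), end)
            resp = requests.get(
                EXPORT_URL,
                params={
                    "type": "LandInfo",
                    "start": cursor.isoformat(),
                    "end": window_end.isoformat(),
                },
                timeout=120,
            )
            resp.raise_for_status()
            lines = resp.text.splitlines(keepends=True)
            if lines:
                if not header_written:
                    out.writelines(lines)      # keep the header row once
                    header_written = True
                else:
                    out.writelines(lines[1:])  # drop the repeated header
            cursor = window_end + timedelta(days=1)

# Example: pull four years in two-week windows.
export_range(date(2020, 1, 1), date(2023, 12, 31))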