Riverscapes / riverscapes-tools

Open-source Python 3 tools for the Riverscapes organization
https://tools.riverscapes.net/
GNU General Public License v3.0

3DEP API Research #145

Closed philipbaileynar closed 1 year ago

philipbaileynar commented 3 years ago

The USGS has an API and Python package for interacting with 3DEP. We need to spend a day playing with it and researching how we can leverage this... I just watched a demo of retrieving 1m LiDAR data on the fly using a simple bounding box. Wow!

A quick distinction first. I see potential at two scales. First, there's the riverscapes modeling scale which we think of as the HUC8 (or maybe HUC10 or HUC12). This might only be practical at 10m DEMs through this API, but it's still worth doing. Second, there's the river restoration scale of 1-2 miles of river. I could see calling an API with a bounding box around 1 mile restoration project area and requesting a 1m LiDAR DEM.

What else is possible and interesting in this toolbox?

https://github.com/cheginit/py3dep

See other issue on pyNHD (#147). And here's the link for the umbrella Hydrodata initiative.

KellyMWhitehead commented 3 years ago

Most of the time I am able to download elevation rasters (as well as slope, hillshade, etc) based on a shapely polygon and then save it to geotiff. Response is quick relative to the size of the extent and resolution.
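A minimal sketch of that workflow, assuming py3dep's `get_map()` entry point (check the py3dep documentation for the exact signature; the `geo_crs`/`crs` keywords and the GeoTIFF export via rioxarray are assumptions here, not tested against the library):

```python
def point_bbox(lon, lat, buffer_deg=0.01):
    """(west, south, east, north) box around a point, in decimal degrees."""
    return (lon - buffer_deg, lat - buffer_deg, lon + buffer_deg, lat + buffer_deg)

def fetch_dem(bbox, resolution=10, out_path="dem.tif"):
    """Network-bound: fetch a DEM for the bbox via py3dep and save a GeoTIFF."""
    import py3dep  # imported lazily; only needed when actually fetching

    dem = py3dep.get_map("DEM", bbox, resolution=resolution,
                         geo_crs="epsg:4326", crs="epsg:4326")
    dem.rio.to_raster(out_path)  # rioxarray accessor; py3dep pulls it in
    return dem
```

Usage would look like `fetch_dem(point_bbox(-120.5, 40.5), resolution=10)`; the same call with `resolution=1` is the 1m LiDAR case, where coverage is the open question.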

Questions:

KellyMWhitehead commented 3 years ago

Issues explained here: https://github.com/Riverscapes/riverscapes-tools/blob/3dep/packages/rscontext/research/3dep_testing.ipynb

joewheaton commented 3 years ago

@philipbaileynar, Brian Murphy and @nick4rivers both have funding on projects for this 3DEP feature. I get this thread is just testing feasibility. Last we spoke with @nick4rivers we talked about this as a May priority. Is that still the case? We know roughly what budget is for 30 mile and Herb. I will get some constraints on that from Brian Murphy as well. Both will be billed to AS.

philipbaileynar commented 3 years ago

I just posted two issues on the py3DEP repo:

philipbaileynar commented 3 years ago

Here is a handy webmap showing the availability of 1m LiDAR DEMs. It's disappointingly sparse!

(@MattReimer look at the address of this map)

(Screenshot: 1m LiDAR DEM availability webmap, 2021-03-01)

MattReimer commented 3 years ago

USGS uses dynamic JS scripts in all their s3 "folders" to show automatic indexes and maps. Totally valid thing to do but I think you're commenting on the fact that they never bothered using a custom domain.

Why? Could it be that urls and custom domains matter less with every passing day? 😜

philipbaileynar commented 3 years ago

It was actually the use of AWS to host web maps that I was interested in.

philipbaileynar commented 3 years ago

@KellyMWhitehead can you request data from somewhere that appears NOT to have LiDAR coverage in the above image? Somewhere in the middle of Nevada, for example. Try to get 1m, 3m, 5m, and 10m for an area where we think there is no LiDAR. What do we get back?

The above image suggests that we should abandon 3DEP and revisit in a year or so once it has better coverage. I just can't tell if there is 3m or 5m coverage that's better than the image above.

philipbaileynar commented 3 years ago

@MattReimer it just occurs to me that it's worth taking a quick look inside the py3dep code to learn what source that library actually uses to retrieve the data.

If we are hesitant about relying on a nascent, unproven library for retrieving important topographic data within our tools, we may be able to sidestep this library and simply get the data from the original source (assuming the original is an official, public, well-documented system).

@MattReimer can you please revisit the py3dep code and just poke around to see where they are getting the data from.

philipbaileynar commented 3 years ago

I just came across this dedicated API for "The National Map" (TNM), of which both the NED and NHD are a part. It seems to be another way of accessing the same information in ScienceBase... but perhaps it's more reliable or direct if we query these data directly?

https://apps.nationalmap.gov/tnmaccess/index.html
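A hedged sketch of querying that API for staged DEM products. The endpoint and parameter names (`bbox`, `datasets`, `outputFormat`) and the dataset string are assumptions to verify against the TNM Access docs linked above:

```python
from urllib.parse import urlencode

TNM_PRODUCTS = "https://tnmaccess.nationalmap.gov/api/v1/products"

def tnm_query_url(bbox, dataset="National Elevation Dataset (NED) 1/3 arc-second"):
    """Build a TNM Access products query for a (west, south, east, north) bbox."""
    params = {
        "bbox": ",".join(str(v) for v in bbox),
        "datasets": dataset,
        "outputFormat": "JSON",
    }
    return f"{TNM_PRODUCTS}?{urlencode(params)}"

def list_download_urls(bbox, timeout=30):
    """Network-bound: return the downloadURL of each product found in the bbox."""
    import json
    from urllib.request import urlopen

    with urlopen(tnm_query_url(bbox), timeout=timeout) as resp:
        payload = json.load(resp)
    return [item.get("downloadURL") for item in payload.get("items", [])]
```

If this returns the same GeoTIFFs as the ScienceBase route, it would confirm the two front ends sit on the same staged products.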

MattReimer commented 3 years ago

Here are a couple more coverage maps I found:

Seems like 1/3 arc-second (approximately 10 m) is the only thing that has national coverage for now.

(Screenshots: 3DEP resolution coverage maps, 2021-05-12)

MattReimer commented 3 years ago

Seems like we could just translate our needs into S3 addresses directly

https://prd-tnm.s3.amazonaws.com/index.html?prefix=StagedProducts/Elevation/13/TIFF/n41w121/

Without too much trouble I was able to get a list of every .gpkg and .tif file in the bucket:

all3dep.txt

2019-12-10 20:09:19  498983070 StagedProducts/Elevation/13/TIFF/n40w082/USGS_13_n40w082.tif
2021-01-11 18:39:59     163840 StagedProducts/Elevation/13/TIFF/n40w082/USGS_13_n40w082_20210111.gpkg
2021-01-11 18:39:59  499055172 StagedProducts/Elevation/13/TIFF/n40w082/USGS_13_n40w082_20210111.tif
2019-12-05 18:31:22     159744 StagedProducts/Elevation/13/TIFF/n40w082/n40w082.gpkg
2020-03-21 18:38:47  484898314 StagedProducts/Elevation/13/TIFF/n40w083/USGS_13_n40w083.tif
2021-01-11 18:40:19     131072 StagedProducts/Elevation/13/TIFF/n40w083/USGS_13_n40w083_20210111.gpkg
2021-01-11 18:40:20  484922910 StagedProducts/Elevation/13/TIFF/n40w083/USGS_13_n40w083_20210111.tif
2020-03-21 18:38:47     135168 StagedProducts/Elevation/13/TIFF/n40w083/n40w083.gpkg
2020-03-21 18:38:49  435226185 StagedProducts/Elevation/13/TIFF/n40w084/USGS_13_n40w084.tif
2020-03-21 18:38:51     143360 StagedProducts/Elevation/13/TIFF/n40w084/n40w084.gpkg
2019-12-10 20:10:06  452623067 StagedProducts/Elevation/13/TIFF/n40w085/USGS_13_n40w085.tif
2019-12-05 18:31:22    2113536 StagedProducts/Elevation/13/TIFF/n40w085/n40w085.gpkg

...

The geopackage gives us some shapes and basic information about where the data came from.

(Screenshot: geopackage footprint shapes and source attributes, 2021-05-12)

As hacky as it sounds, it's probably pretty easy to create a CSV lookup that accounts for any weird spellings and then hard-code this into our tools.

There are only so many one-degree tiles needed to cover the US, so this is one-time work.
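That lookup could be built straight from the bucket listing. A minimal sketch, parsing `aws s3 ls`-style lines (as in the all3dep.txt excerpt above) into a tile-to-GeoTIFF table that keeps the newest key when a tile has a dated revision:

```python
import re

# Excerpt in the same shape as the listing above: date, time, size, key.
LISTING = """\
2019-12-10 20:09:19  498983070 StagedProducts/Elevation/13/TIFF/n40w082/USGS_13_n40w082.tif
2021-01-11 18:39:59  499055172 StagedProducts/Elevation/13/TIFF/n40w082/USGS_13_n40w082_20210111.tif
2020-03-21 18:38:47  484898314 StagedProducts/Elevation/13/TIFF/n40w083/USGS_13_n40w083.tif
"""

def build_tile_lookup(listing: str) -> dict:
    """Map tile names (e.g. 'n40w082') to the newest .tif key seen for them."""
    lookup = {}
    for line in listing.splitlines():
        parts = line.split()
        if len(parts) != 4 or not parts[3].endswith(".tif"):
            continue  # skip blanks, .gpkg files, anything malformed
        date, key = parts[0], parts[3]
        match = re.search(r"/(n\d{2}[ew]\d{3})/", key)
        if not match:
            continue
        tile = match.group(1)
        # ISO dates compare correctly as strings, so newest wins.
        if tile not in lookup or date > lookup[tile][0]:
            lookup[tile] = (date, key)
    return {tile: key for tile, (date, key) in lookup.items()}
```

Run once over the full listing, write the result out as the CSV, and the "weird spellings" (dated suffixes, `USGS_13_` prefixes present or absent) are absorbed at build time.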

MattReimer commented 3 years ago

Just for completeness, I tried blindly guessing S3 paths based on lat and lng without doing a ScienceBase query. Green is where we found an appropriate DEM.

(Screenshot: map of tiles where a DEM was found by guessed S3 path, 2021-05-13)

It's a bit naughty to bypass the API and just guess at network paths, but there don't seem to be any holes, and the API seems to go down from time to time, so this may be an acceptable compromise.
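The path-guessing idea can be sketched as below. The tile naming convention (north/west corner of the 1-degree tile, zero-padded, `n…w…` only, so CONUS-style northern/western coordinates) is inferred from the bucket listing earlier in this thread, not from USGS documentation:

```python
import math
from urllib.request import Request, urlopen

def tile_name(lat: float, lng: float) -> str:
    """1-degree tile containing (lat, lng); e.g. (40.5, -120.3) -> 'n41w121'."""
    return f"n{math.ceil(lat):02d}w{math.ceil(abs(lng)):03d}"

def staged_dem_url(lat: float, lng: float) -> str:
    """Guess the staged 1/3 arc-second GeoTIFF URL for the point's tile."""
    tile = tile_name(lat, lng)
    return (f"https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/"
            f"TIFF/{tile}/USGS_13_{tile}.tif")

def dem_exists(lat: float, lng: float, timeout: float = 10) -> bool:
    """Network-bound: HEAD the guessed URL; True only if the object exists."""
    req = Request(staged_dem_url(lat, lng), method="HEAD")
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False
```

A HEAD request per tile is how the green/empty map above could be reproduced, without ever touching the query API.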

philipbaileynar commented 3 years ago

I think this is a totally acceptable thing to do!

And I believe that the same assumptions can be made for NHD!!!!!!!!!

MattReimer commented 1 year ago

This task is complete I think. Closing the issue.