WSWCWaterDataExchange / MappingStatesDataToWaDE2.0

Manage all code to map and import state's data into WaDE 2.0
BSD 3-Clause "New" or "Revised" License

Import UT Gage/reservoir data #128

Open amabdallah opened 1 year ago

amabdallah commented 1 year ago

To reach out to UDWRi and see if they have a better way to access this data:
https://www.waterrights.utah.gov/distinfo/realtime_info.asp
https://maps.waterrights.utah.gov/EsriMap/map.asp?layersToAdd=distribution

Old? https://www.waterrights.utah.gov/distinfo/default.asp

https://water.utah.gov/reservoirlevels/ (RESERVOIR_STORAGE)
https://www.waterrights.utah.gov/cgi-bin/pubdump.exe

rwjam commented 1 year ago

We may be able to get some data from here in the future. But it is not downloadable / in a machine friendly setup. https://www.waterrights.utah.gov/distinfo/distribution_systems.asp

amabdallah commented 1 year ago

On 10/06/2022, David J. Jones djjones@utah.gov @ UTDWRi shared a shapefile of ~1,600 stations that includes Station ID, name, and system name. UTDWRi seems to have an API that allows a dynamic query based on the station ID and a year range, but the output is a CSV file. Daily data is great, but we also need the monthly data if possible.

https://www.waterrights.utah.gov/dvrtdb/DailyCommaData.asp?BYEAR=1960&EYEAR=2022&StationId=2614&Units=Mean+daily+discharge+in+CFS

So hopefully, we could loop through the Station IDs in the shapefile, pass each one to this API, and unpack the CSV file in memory. This might be a good resource: https://stackoverflow.com/questions/16283799/how-to-read-a-csv-file-from-a-url-with-python
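A minimal sketch of that loop, using only the Python standard library and the URL pattern from the example call above. The function names and the single-station list are placeholders; the real station IDs would come from the shapefile, and the CSV layout still needs to be confirmed against an actual response.

```python
import csv
import io
import urllib.request

BASE_URL = "https://www.waterrights.utah.gov/dvrtdb/DailyCommaData.asp"

def build_daily_url(station_id, byear=1960, eyear=2022):
    # Reproduce the query string from the example call above.
    return (f"{BASE_URL}?BYEAR={byear}&EYEAR={eyear}"
            f"&StationId={station_id}&Units=Mean+daily+discharge+in+CFS")

def fetch_station_rows(station_id, byear=1960, eyear=2022):
    # Download the response and parse the CSV entirely in memory
    # (no temporary file on disk).
    with urllib.request.urlopen(build_daily_url(station_id, byear, eyear),
                                timeout=60) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return list(csv.reader(io.StringIO(text)))

# Example (requires network access):
#   for sid in (2614,):          # hypothetical subset of shapefile station IDs
#       rows = fetch_station_rows(sid)
```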

amabdallah commented 1 year ago

I tried changing Daily to monthly in the API call, but no luck. Not sure if there is a dynamic way to get the monthly values. We're okay with daily.

https://www.waterrights.utah.gov/dvrtdb/**Daily**CommaData.asp?BYEAR=1960&EYEAR=2022&StationId=2614&Units=Mean+**daily**+discharge+in+CFS

rwjam commented 1 year ago

So it's like an intermediate step is preventing us from using the proper API service.

amabdallah commented 1 year ago

Ryan shared this evaluation on Slack.

Ran into a snag with the UTssro API data we've been trying to tackle. Did some digging and looking at the results, and unfortunately I noticed the returned data is not the same; it will change each time you access that first API. It gets a little complicated, but I'll do my best to describe the problem we are facing.

  • So we are facing a problem: the listed API we have access to doesn't allow us to save the information in a tabular format for mass machine-readable data. As of right now we would have to access that API manually and download each link / file for about 1,600 sites.
  • What we can save to a table for mass machine-readable info from that API is kind of an HTML return of that app page of theirs (see image).
  • We had the idea of trimming down that HTML return to what we thought was a second-layer API, and that does kind of work, as that second API result allows us to save it to a table.
  • HOWEVER, I just realized the portion we are trimming down from that first API is like a timestamp ID of when that first API was called, not actual API attributes we can manipulate for the second one.
  • So each time we use that first API -> trim it down to what we thought was a working second API -> it returns different results, since we can't specify the site we want for that second API.
  • I've already tried manually entering site info into that second API with no luck; it just returns an error.

[image: screenshot of the HTML return]

amabdallah commented 1 year ago

I followed up with our POC at UT DWRi. They will see if they can easily tweak the current CSV download process to make it more useful to our scripts.

Relevant: They are working on a tool that will enable staff to make the relationship between measurement locations and irrigated fields.

amabdallah commented 1 year ago

Jim Reese shared this update on Jan 20, 2023

David was able to tweak the daily comma delimited values page so now the server response should be the data. Also, you will only need BYEAR, EYEAR, and StationId in the URL to get the data. Give this a try and let me know if that works for you. https://www.waterrights.utah.gov/dvrtdb/DailyCommaData.asp?BYEAR=2007&EYEAR=2007&StationId=9457

Adel's evaluation: This is great. Ideally we also want the variable in the API parameters so it appears in the exported table, e.g., Mean daily discharge in CFS.

rwjam commented 1 year ago

Update from Jim on 02/28/2023. They have tweaked that webapp to better accept parameters. We should be able to use it with the 'stationid' field from the Distribution_Stations.shp file Jim emailed us to populate a dataframe.
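A sketch of populating one dataframe from the tweaked endpoint, assuming (per Jim's earlier note) that only BYEAR, EYEAR, and StationId are needed in the URL. The shapefile field name 'stationid' and the geopandas read are assumptions to verify against Distribution_Stations.shp; pandas is assumed available since the repo's mapping scripts use it.

```python
import io
import urllib.request

import pandas as pd

def station_url(station_id, byear, eyear):
    # Per the tweaked endpoint, only these three parameters are required.
    return ("https://www.waterrights.utah.gov/dvrtdb/DailyCommaData.asp"
            f"?BYEAR={byear}&EYEAR={eyear}&StationId={station_id}")

def load_all_stations(station_ids, byear, eyear):
    # Fetch each station's CSV response and stack the results into
    # one dataframe, tagging each row with its source station.
    frames = []
    for sid in station_ids:
        with urllib.request.urlopen(station_url(sid, byear, eyear),
                                    timeout=60) as resp:
            df = pd.read_csv(io.StringIO(resp.read().decode("utf-8",
                                                            errors="replace")))
        df["stationid"] = sid
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

# Example (requires network and the geopandas package; 'stationid' is an
# assumed field name to check against the shapefile):
#   import geopandas as gpd
#   stations = gpd.read_file("Distribution_Stations.shp")
#   df = load_all_stations(stations["stationid"].tolist(), 2007, 2007)
```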

Example return: [image]

amabdallah commented 1 year ago

@rwjam, the ReadMe of this mapping needs an update; it mentions California. https://github.com/WSWCWaterDataExchange/MappingStatesDataToWaDE2.0/tree/master/Utah/SS_ReservoirsObservationSites