DOI-USGS / dataretrieval-python

Python package for retrieving water data from USGS or the multi-agency Water Quality Portal
https://doi-usgs.github.io/dataretrieval-python/

get_iv issue #115

Open csg-code opened 11 months ago

csg-code commented 11 months ago

Hello – I'm attempting to use the get_iv function, but it isn't retrieving any results.

Thank you!

from dataretrieval import nwis

state = ['WY']
pcodes = ['72019', '30210', '61055', '99019', '62610', '62611', '62612', '62613', '72150', '72230', '99227']

targetSites, _ = nwis.what_sites(stateCd=state, parameterCd=pcodes)
targetSites.head(10)

wy = targetSites["site_no"][:100]
# or, an example of WY sites
wy = ["410000104152501", "410000104180801", "410000104463501"]

startDate = '2010-09-01'
endDate = '2021-09-30'

# Get the data
iv, _ = nwis.get_iv(sites=wy, parameterCd=pcodes, start=startDate, end=endDate)

thodson-usgs commented 11 months ago

Hmm, I suspect this is a deeper problem, like these data are actually archived in a different service.

@ldecicco-USGS, do you know? I assume this won't work in R either, but I haven't tried.

I should add that @csg-code is interested in groundwater sites (if that wasn't obvious).

ldecicco-USGS commented 11 months ago

Assuming you have the same output as in R, you'll want to look at the column "data_type_cd". When I ran that, I saw a mix of "uv", "dv", "gw", and "qw". Each of those is a different web service. If you are pulling the "iv" data, you'll want the sites/parameters that match up with "uv" (iv and uv are synonymous in the USGS: instantaneous or unit values, depending on who you are talking to).
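
In Python, a quick way to do that check might look like the sketch below (assuming the state and pcodes variables from the first comment; seriesCatalogOutput="true" is needed so the catalog includes the data_type_cd column):

from dataretrieval import nwis

# request the full series catalog so each row carries its NWIS service code
targetSites, _ = nwis.what_sites(stateCd=state, parameterCd=pcodes,
                                 seriesCatalogOutput="true")

# tally the records by service:
# "uv" = instantaneous, "dv" = daily values, "gw" = groundwater levels, "qw" = water quality
print(targetSites["data_type_cd"].value_counts())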

Here's how I'd get all the data in R; I'm sure it's basically the same in Python:

library(dataRetrieval)

state <- 'WY'
pcodes <- c('72019', '30210', '61055', '99019', '62610', '62611', '62612', '62613', '72150', '72230', '99227')

targetSites <- whatNWISdata(stateCd = state,
                            parameterCd = pcodes)

iv_sites <- unique(targetSites$site_no[targetSites$data_type_cd == "uv"])
iv_pcodes <- unique(targetSites$parm_cd[targetSites$data_type_cd == "uv"])

iv_data <- readNWISuv(siteNumbers = iv_sites[1:3], # I only did 3 sites because it's a lot of data!
                      parameterCd = iv_pcodes,
                      startDate = "2010-09-01",
                      endDate = "2021-09-30")

dv_sites <- unique(targetSites$site_no[targetSites$data_type_cd == "dv"])
dv_pcodes <- unique(targetSites$parm_cd[targetSites$data_type_cd == "dv"])
dv_stat_cds <- unique(targetSites$stat_cd[targetSites$data_type_cd == "dv"])

dv_data <- readNWISdv(siteNumbers = dv_sites,
                      parameterCd = dv_pcodes,
                      statCd = dv_stat_cds,
                      startDate = "2010-09-01",
                      endDate = "2021-09-30")

gw_sites <- unique(targetSites$site_no[targetSites$data_type_cd == "gw"])
gw_pcodes <- unique(targetSites$parm_cd[targetSites$data_type_cd == "gw"])

gwl_data <- readNWISgwl(siteNumbers = gw_sites,
                        parameterCd = gw_pcodes,
                        startDate = "2010-09-01",
                        endDate = "2021-09-30")

Hope that helps!

thodson-usgs commented 11 months ago

@ldecicco-USGS nailed it. In Python you'd do something like:

targetSites, _ = nwis.what_sites(stateCd=state, parameterCd=pcodes, seriesCatalogOutput="true")
# note: use "uv" rather than "iv" here; that's the catalog's code for instantaneous data (iv and uv are synonymous)
# here we select only the sites with iv data, which is just a handful
iv_sites = targetSites[targetSites['data_type_cd'] == 'uv']

# then download the data from only the iv sites
iv, _ = nwis.get_iv(sites=iv_sites['site_no'], parameterCd=pcodes, start=startDate, end=endDate)
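
For closer parity with the R example, you could also dedupe the site list and restrict parameterCd to the codes the catalog actually reports under "uv" (a sketch, assuming the same targetSites DataFrame as above and that it carries the parm_cd column from the series catalog):

# mirror the unique() / iv_pcodes step from the R example
uv_rows = targetSites[targetSites['data_type_cd'] == 'uv']
iv_site_list = list(uv_rows['site_no'].unique())
iv_pcode_list = list(uv_rows['parm_cd'].unique())

iv, _ = nwis.get_iv(sites=iv_site_list, parameterCd=iv_pcode_list,
                    start=startDate, end=endDate)
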
csg-code commented 11 months ago

Awesome. Thanks so much for your help!

csg-code commented 11 months ago

Hi @thodson-usgs – when I run the code I get this:

DtypeWarning: Columns (16) have mixed types. Specify dtype option on import or set low_memory=False.
  df = pd.read_csv(StringIO(rdb), delimiter='\t', skiprows=count + 2,

What do you recommend as the best way of fixing this? Thank you!
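
One possible workaround, sketched below rather than an official recommendation: the dtype / low_memory options the warning mentions belong to the read_csv call inside dataretrieval, so as far as I know they can't be passed through get_iv. The warning itself is usually harmless, so one option is to silence it for the call and coerce the mixed-type column yourself afterwards:

import warnings
import pandas as pd
from dataretrieval import nwis

# the DtypeWarning comes from pandas while dataretrieval parses the RDB response;
# it is usually harmless, so one option is to suppress it for this call only
with warnings.catch_warnings():
    warnings.simplefilter("ignore", pd.errors.DtypeWarning)
    iv, _ = nwis.get_iv(sites=iv_site_list, parameterCd=iv_pcode_list,
                        start=startDate, end=endDate)

# then, if needed, coerce the affected column to a single dtype, e.g. strings
# (the column name below is a hypothetical placeholder; check iv.dtypes to find it)
# iv["some_qualifier_cd"] = iv["some_qualifier_cd"].astype(str)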