jtleider / censusdata

Download data from Census API
MIT License
139 stars · 29 forks

Possible to add 2019 data? #17

Closed · mghersher closed this issue 4 years ago

mghersher commented 4 years ago

Would it be possible to add the 2019 ACS data? It looks like the one-year data was published September 17th and the supplemental data on October 15th. https://www.census.gov/programs-surveys/acs/news/data-releases/2019/release.html

Thanks!

jtleider commented 4 years ago

This is on my list! Please note that you can already use the package to download the 2019 data. I just need to add support for the 2019 documentation.
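The data itself comes straight from the public Census API, which is why downloads already work even before the package ships the 2019 documentation files. As a hedged sketch of what such a request looks like (the endpoint pattern is the public Census API; variable `B01001_001E`, total population, is chosen here only as an illustration and is not from this thread):

```python
from urllib.parse import urlencode

# Public Census API endpoint pattern: https://api.census.gov/data/{year}/acs/{dataset}
base = 'https://api.census.gov/data/2019/acs/acs1'
params = {
    'get': 'NAME,B01001_001E',  # B01001_001E: total population (detail table)
    'for': 'state:*',           # request all states
}
url = base + '?' + urlencode(params)
print(url)
```

Fetching this URL returns the 2019 ACS 1-year estimates as JSON; the package wraps this kind of request and returns a pandas DataFrame.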

mghersher commented 4 years ago

Thanks @jtleider! Appreciate you working so hard to maintain this package. It's a really helpful one.

For some reason, though, I'm getting an error when I try to use the package to download the 2019 data. I'm using version 1.9 of the censusdata package and running the following code as a test:

sample = censusdata.search('acs1', 2019, 'concept', 'mortgage')

I get the following error:

FileNotFoundError                         Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 sample = censusdata.search('acs1', 2019, 'concept', 'mortgage')
      2 print(len(sample))
      3 # print(sample)

/usr/local/anaconda3/lib/python3.7/site-packages/censusdata/variable_info.py in search(src, year, field, criterion, tabletype)
    171         raise ValueError
    172     topdir, filename = os.path.split(__file__)
--> 173     with open(os.path.join(topdir, 'variables', '{0}_{1}_{2}_variables.json'.format(src, year, tabletype))) as infile:
    174         allvars = infile.read()
    175         allvars = json.loads(allvars)['variables']

FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/anaconda3/lib/python3.7/site-packages/censusdata/variables/acs1_2019_detail_variables.json'

Is there something that I'm doing wrong? Thanks!
jtleider commented 4 years ago

Hi, the search function relies on the 2019 documentation files bundled with the package (as opposed to the data itself, which comes from the Census API), and those had not been added yet. Version 1.10 now adds support for the 2019 documentation. I hope this is helpful!
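The failing open() call at line 173 of the traceback shows how search() finds those files: it builds a filename from the source, year, and table type and opens it from a variables directory inside the installed package. A minimal sketch of that lookup (the directory layout here is illustrative, taken from the traceback rather than the package source):

```python
import os

# search() names its bundled documentation file after the dataset,
# year, and table type, e.g. acs1_2019_detail_variables.json.
src, year, tabletype = 'acs1', 2019, 'detail'
filename = '{0}_{1}_{2}_variables.json'.format(src, year, tabletype)

# The file is looked up relative to the package's install directory;
# in version 1.9 no 2019 file was shipped, hence the FileNotFoundError.
path = os.path.join('censusdata', 'variables', filename)
print(path)
```

Upgrading to version 1.10, which bundles the 2019 files, makes the lookup succeed without any change to the calling code.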