bblockwood / lab

Repository for collecting and documenting work done by research assistants.

Create maps from Opportunity Atlas data using R. #1

Closed · bblockwood closed this issue 5 years ago

bblockwood commented 5 years ago

The goal of this project is to use R to create maps which display heatmaps of variables in the Opportunity Atlas data by census tract.

Raw data should be placed in the dropbox folder (at ...RA_work/data/opportunity_atlas/input/) and all code should be placed in github (at .../lab/projects/opportunity_atlas/code/), with the resulting output saved to .../lab/projects/opportunity_atlas/output/.

danlee22 commented 5 years ago

Attached below is my current progress on mapping teen birth rates. Obviously, there are still a lot of issues with the map. Looking at the data provided by the Opportunity Atlas (attached), I can't make sense of how the columns are organized, and no column has data for every county. I think that may be why the map isn't "filled in". Another possibility is the repetition of county names across states, which my current code probably can't disambiguate.

[Attached screenshots: Screen Shot 2019-06-05 at 12 50 26 PM; Screen Shot 2019-06-05 at 12 51 46 PM]

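One way around the duplicate-county-name problem is to join the Atlas data to the map shapes by FIPS code, which is unique, rather than by county name. A toy sketch (the column names and values here are made up for illustration, not the actual Atlas layout):

```r
# Toy illustration: several states have a "Washington" county, so county
# names are ambiguous as a join key, while FIPS codes are unique.
atlas <- data.frame(
  fips       = c("01129", "55131", "13313"),   # hypothetical county FIPS codes
  county     = c("Washington", "Washington", "Whitfield"),
  teen_birth = c(0.21, 0.09, 0.18),
  stringsAsFactors = FALSE
)
shapes <- data.frame(
  fips  = c("01129", "55131", "13313"),
  state = c("AL", "WI", "GA"),
  stringsAsFactors = FALSE
)

# Joining on the unique FIPS code gives exactly one match per geography;
# joining on `county` would cross-match the two Washingtons.
merged <- merge(shapes, atlas, by = "fips")
```

The same idea carries over to `dplyr::left_join(shapes, atlas, by = "fips")` if the script uses dplyr.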
bblockwood commented 5 years ago

Thanks @danlee22, great to have this progress already! Before refining the look further, let's get our "production process" in place, with the following steps:

  1. Please put the raw Excel data file in the dropbox folder described above.
  2. Place your R code in the Github folder as described above.
  3. Then adjust the code so that it imports the Excel data from its location in Dropbox and exports the resulting map image to the output folder described above.
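The import/export steps above might be wired together roughly like this. The `~/Dropbox` and `~/lab` prefixes are placeholders for the elided parts of the paths, and the Excel file name is a hypothetical stand-in (the actual file name isn't specified here):

```r
library(readxl)   # for read_excel(); install.packages("readxl") if needed
library(ggplot2)  # for ggsave()

# Placeholder roots -- substitute the real Dropbox/repo locations.
input_dir  <- "~/Dropbox/RA_work/data/opportunity_atlas/input"
output_dir <- "~/lab/projects/opportunity_atlas/output"

# Import the raw Excel data from Dropbox (hypothetical file name).
atlas <- read_excel(file.path(input_dir, "atlas_data.xlsx"))

# ... build the map into a ggplot object `p` here ...

# Export the resulting map image to the output folder.
ggsave(file.path(output_dir, "teen_birth_map.png"), plot = p,
       width = 10, height = 6)
```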

Once you have that working, go ahead and commit your changes using the GitHub Desktop app; that will push all your edits to GitHub, so that I can see the code, the output, etc. Thanks!

danlee22 commented 5 years ago
[Attached screenshot: Screen Shot 2019-06-07 at 1 28 38 PM]

danlee22 commented 5 years ago
[Attached screenshot: Screen Shot 2019-06-11 at 12 18 23 PM]

bblockwood commented 5 years ago

@danlee22: beautiful! Let's restrict the map to the lower 48 states, and change the "outcome" label so that it reflects (ideally automatically) the outcome being plotted. Then let me know whether this is automated to the point that I should be able to run it on my own computer. Thanks!
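One way to restrict to the lower 48 is to filter on the first two digits of the FIPS code (state FIPS `02` is Alaska, `15` is Hawaii). A sketch, with `fips` as an assumed column name:

```r
# Drop Alaska ("02") and Hawaii ("15") using the state portion (first two
# digits) of the tract/county FIPS code. `fips_col` is an assumed name.
drop_ak_hi <- function(df, fips_col = "fips") {
  state_fips <- substr(df[[fips_col]], 1, 2)
  df[!state_fips %in% c("02", "15"), , drop = FALSE]
}

# Tiny usage example with made-up FIPS codes: only the California row survives.
lower48 <- drop_ak_hi(
  data.frame(fips = c("02013", "15001", "06037"), stringsAsFactors = FALSE)
)
```

For the automatic label: if the name of the plotted column is held in a string variable, something like `ggtitle(outcome)` (or `labs(fill = outcome)` for the legend) keeps the text in sync with whatever outcome is plotted.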

bblockwood commented 5 years ago

Hi @danlee22, I came across some example R code that might be useful for automatically setting path to the dropbox data:

```r
# Set custom working directory based on the user running the script.
# Note: Sys.getenv("USERNAME") is only set on Windows; Sys.info()[["user"]]
# returns the login name on macOS and Linux as well.
username <- Sys.info()[["user"]]
if (username == "OID User") {
  maindir <- "/Users/OID User/Dropbox/Work/LeadHousePrices/data"
}

setwd(maindir)
```

bblockwood commented 5 years ago

@danlee22: nice progress on this. I just updated the filepath, and it ran successfully on my system. I have two follow-up requests:

  1. The script threw a warning after this line ("Column `FIPS` joining factor and character vector, coercing into character vector"). It still ran successfully, but for cleanliness, can you sort that out?
  2. Can you export the resulting map (excluding Alaska and Hawaii) as a .pdf file, saved to the output folder?
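On point 1: that warning comes from the join seeing a factor on one side and a character vector on the other; converting the factor column with `as.character()` before joining makes the types match, so nothing needs to be coerced. A minimal base-R sketch of the fix (column values are made up):

```r
# One table stores FIPS as a factor, the other as character -- the mismatch
# behind the coercion warning. Convert explicitly before joining.
a <- data.frame(FIPS = factor(c("01001", "01003")), rate = c(0.1, 0.2))
b <- data.frame(FIPS = c("01001", "01003"),
                name = c("Autauga", "Baldwin"),
                stringsAsFactors = FALSE)

a$FIPS <- as.character(a$FIPS)   # explicit conversion: both sides now character
m <- merge(b, a, by = "FIPS")
```

For point 2, assuming the map is a ggplot object `p`, `ggsave(file.path(output_dir, "map_lower48.pdf"), p)` writes a vector PDF straight to the output folder.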

Thanks again for your work on this!

bblockwood commented 5 years ago

@danlee22 thanks for your work on this. I just checked, and I like the exported map, but it doesn't look like it is created and saved within the nice script you have made. Remember, our goal is to have everything be completely replicable, so a request like "export map as pdf" really means "add code to the script file(s) so that when someone runs the script, a map is exported as a pdf."

Additionally, can you please rename the file IncTest.R as MakeMap.R, to reflect its generality?

Let me know if you have any questions. And if not, please post a final update on this issue (can be as simple as "I have done this!") and close the issue — since at that point, we will have completely achieved the "deliverable" that this issue was created for.

Thanks again for your work on this!

danlee22 commented 5 years ago

Finished!

bblockwood commented 5 years ago

Thanks @danlee22!