epiforecasts / covidregionaldata

An interface to subnational and national level COVID-19 data. For all supported countries this includes a daily time series of cases. Wherever available we also provide data on deaths, hospitalisations, and tests. National-level data is supported from a range of data sources, along with linelist data and links to intervention data sets.
https://epiforecasts.io/covidregionaldata/

made map for readme #378

Closed joseph-palmer closed 3 years ago

joseph-palmer commented 3 years ago

This PR adds a map to the README showing which countries we have direct data for (that is, countries for which we have a dedicated country class fetching 'official' data).

I'm not very experienced at making maps in R, so if anyone has suggestions for a better way to do this, or for making the map look nicer, I'm all ears!

closes #377

github-actions[bot] commented 3 years ago

👋 Thanks for opening this pull request! Can you please run through the following checklist before requesting review (ticking as complete or if not relevant).

Thank you again for the contribution. If making large-scale changes, consider using our pre-commit hooks (see the contributing guide) to more easily comply with our guidelines.

seabbs commented 3 years ago

Looks pretty nice to me.

A couple of thoughts for discussion.

@Bisaloo @kathsherratt any thoughts/suggestions?

joseph-palmer commented 3 years ago

Happy for it to go wherever. The data status vignette might be nice and more accessible, as users might not look at the README. map_data() is from ggplot2.
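For anyone following along, a minimal sketch of what map_data() returns (assuming ggplot2 and its helper package maps are installed; illustrative only, not part of the PR):

```r
# ggplot2::map_data() converts the 'maps' package's world outlines into a
# plain data frame with one row per polygon vertex.
library(ggplot2)

world <- map_data("world")

# Key columns: long/lat are the vertex coordinates, group identifies each
# polygon (so ggplot doesn't connect separate shapes), and region holds
# the country name that gets matched against our supported countries.
head(world[, c("long", "lat", "group", "region")])
```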

Bisaloo commented 3 years ago

Thanks for taking care of this!

I think it is better to use coord_sf() instead of coord_cartesian() in maps. More about maps and ggplot2: https://ggplot2-book.org/maps.html.

I would completely remove the country borders or make them "cleaner" by using geom_polygon(color = "black", lwd = 0.1).

Finally, a slightly simpler and more legible way of specifying the scale (might be subjective):

supported_countries <- mutate(
  world,
  fill = ifelse(
    region %in% regional_countries$class,
    ifelse(
      region %in% regional_countries_l2$class,
      "Level 2",
      "Level 1"
    ),
    "None"
  )
)

ggplot(supported_countries, aes(long, lat, fill = fill, group = group)) +
  geom_polygon(color = "black", lwd = 0.1) +
  ggtitle("Supported countries") +
  scale_fill_manual(
    name = "",
    values = c("#d91c45", "#328ba8", "gray80")
  ) +
  theme_void() +
  theme(legend.position = "top") +
  coord_sf(ylim = c(-55, 80))

or even:

supported_countries <- mutate(
  world,
  fill = case_when(
    region %in% regional_countries_l2$class ~ "Level 2",
    region %in% regional_countries$class ~ "Level 1",
    TRUE ~ "None"
  )
)

ggplot(supported_countries, aes(long, lat, fill = fill, group = group)) +
  geom_polygon(color = "black", lwd = 0.1) +
  ggtitle("Supported countries") +
  scale_fill_manual(
    name = "",
    values = c("#d91c45", "#328ba8", "gray80")
  ) +
  theme_void() +
  theme(legend.position = "top") +
  coord_sf(ylim = c(-55, 80))
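As a quick sanity check on the two variants above, here is a base-R-only toy (with hypothetical country names) confirming that the nested ifelse() and the ordered case_when() logic assign the same labels; the key point is that the Level 2 test must take precedence over Level 1:

```r
# Hypothetical data: two supported countries, one of which also has
# level-2 data.
region <- c("Afoo", "Bbar", "Cbaz")
level1 <- c("Afoo", "Bbar")  # has a country class (level 1)
level2 <- c("Bbar")          # also provides level-2 data

# Variant 1: nested ifelse(), as in the first snippet.
nested <- ifelse(
  region %in% level1,
  ifelse(region %in% level2, "Level 2", "Level 1"),
  "None"
)

# Variant 2: case_when() evaluates conditions in order, first match wins;
# emulated here in base R with the same ordering (level 2 checked first).
ordered <- ifelse(region %in% level2, "Level 2",
           ifelse(region %in% level1, "Level 1", "None"))

identical(nested, ordered)  # TRUE: both give c("Level 1", "Level 2", "None")
```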

[image: rendered map of supported countries]

joseph-palmer commented 3 years ago

Moved it into the data status vignette, although I'm happy to move it elsewhere.

kathsherratt commented 3 years ago

Looks great to me! 👍 I can imagine this getting complicated very quickly, as Sam mentioned, so this looks like a nice, clean, simple solution to have in place. It looks like we just need to add sf to Suggests in DESCRIPTION to get this passing the build checks.
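In DESCRIPTION terms that is roughly (a fragment only; the real file has many more entries, and ggplot2 is already listed per the discussion above):

```
Suggests:
    ggplot2,
    sf
```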

P.S. I also fully support removing borders from maps - just replace the geom_polygon arg with geom_polygon(lwd = 0) +...

joseph-palmer commented 3 years ago

Are we happy with how this looks and if data status is the right vignette for this?

I can't work out why the check is failing; I tried putting sf in the DESCRIPTION, but it doesn't seem to remove the error...

joseph-palmer commented 3 years ago

I updated the vignette names in the pkgdown config, but I don't know if the new page actually exists: clicking the link gives a 404, yet the page does exist, so I'm not sure why that's playing up. Does this need to be updated somewhere else as well?

Still getting the error and I'm not sure why, but I think something weird is going on. pkgdown builds fine locally, but when running devtools::check() there is an error in the re-building of vignettes, and it's different from the one here. Locally it says there is no package called 'class'. The error goes away when I include library(sf). The code doesn't call any sf functions directly, but I think ggplot2 does... yet ggplot2 is included in the suggested packages, so it shouldn't need sf?

The fact that locally it gave a weird error message that doesn't mention sf directly makes me think something odd is going on, but I have no idea what it is / could be...

Bisaloo commented 3 years ago

This happens because the sf installation fails due to missing system dependencies: https://github.com/epiforecasts/covidregionaldata/runs/2802734105#step:8:80:

package ‘sf’ successfully unpacked and MD5 sums checked
using staged installation
configure: CC: clang -mmacosx-version-min=10.13
configure: CXX: clang++ -mmacosx-version-min=10.13 -std=gnu++11
checking for gdal-config... no
configure: error: gdal-config not found or not executable.
ERROR: configuration failed for package ‘sf’
  • removing ‘/Users/runner/work/_temp/Library/sf’

We need to install sf explicitly because ggplot2 only suggests it, and soft dependencies (Suggests) are not installed recursively.

joseph-palmer commented 3 years ago

I see. I tried doing this with if (require(sf)) install.packages("sf"), but that also failed due to not having a CRAN mirror set. What's the best way to install this for the checks?
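One possible fix for the mirror error (a sketch, not necessarily how the CI should handle it): install.packages() needs an explicit repos when no mirror is configured in a non-interactive session, and the guard should be negated so the install only runs when sf is actually missing:

```r
# Install sf only if it is not already available, with an explicit CRAN
# mirror so non-interactive sessions (e.g. CI) don't prompt and fail.
if (!requireNamespace("sf", quietly = TRUE)) {
  install.packages("sf", repos = "https://cloud.r-project.org")
}
```

Note this only helps once the system libraries (gdal etc.) are present; it won't fix the configure error in the log above.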

seabbs commented 3 years ago

I think what Hugo is getting at is that we need to include the mac/linux system dependencies in the CI build. There is a step above the R deps where external libraries go, and gdal needs to go in there. Managing external dependencies is one of the things R is sadly not great at.
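For reference, and as a sketch only (exact package names depend on the runner image), the system libraries that sf compiles against would go into that CI step roughly like this:

```shell
# macOS runner (Homebrew): provides gdal-config, which the sf configure
# script was failing to find in the log above.
brew install gdal

# Ubuntu runner (apt): sf's documented system requirements.
sudo apt-get update
sudo apt-get install -y libgdal-dev libgeos-dev libproj-dev libudunits2-dev
```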

Bisaloo commented 3 years ago

It should be fine now, because we only needed those extra system dependencies to compile the packages (sf and textshaping) from source. Now that binaries are available on CRAN, the tests are passing.