martinedoesgis / novel-coronavirus

Mapping of the 2020 Novel Coronavirus
1 star · 0 forks

Excellent work #1

Open xbno opened 4 years ago

xbno commented 4 years ago

Hey, I didn't know how else to get in touch with you about this map, but I'd been trying to find something with city-level data that also displayed propagation over time. It's definitely the highest-quality map/data I've seen! I hope you keep it updated, and if you need any help with maintaining the data I'd be happy to help. Thanks so much for sharing!

martinedoesgis commented 4 years ago

Hi Geoff,

Thanks for the feedback, I'm glad you liked it :)

For now I've scripted the whole Chinese part: I'm using a Node.js script to get the data from https://ncov.dxy.cn/ncovh5/view/pneumonia by province and city, archive and format it, and then run Google Translate and Geocoding in the Google Sheet. It works quite well, though they're starting to encode a lot as "Unknown town" or "Unknown area", and I always have to troubleshoot a few names and coordinates manually, but nothing complicated.
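A minimal sketch of that scraping step (shown in Python rather than Node.js for consistency with the rest of this thread): at the time, the dxy page shipped its province/city counts as a JSON array embedded in a script tag, so a regex plus `json.loads` is enough. The `window.getAreaStat` variable name and the field names below are assumptions about the page's structure and may have changed since.

```python
import json
import re

def extract_area_stat(html: str) -> list:
    """Pull the embedded JSON array out of a script tag like
    window.getAreaStat = [...] (variable name is an assumption)."""
    match = re.search(r"window\.getAreaStat\s*=\s*(\[.*?\])\s*}catch", html, re.S)
    if not match:
        raise ValueError("getAreaStat payload not found")
    return json.loads(match.group(1))

# Illustrative sample mimicking the page structure (not real figures).
sample_html = (
    '<script>try{window.getAreaStat = '
    '[{"provinceName": "湖北省", "confirmedCount": 100, '
    '"cities": [{"cityName": "武汉", "confirmedCount": 80}]}]'
    '}catch(e){}</script>'
)

provinces = extract_area_stat(sample_html)
for p in provinces:
    for city in p["cities"]:
        print(p["provinceName"], city["cityName"], city["confirmedCount"])
```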

The rest of the world is what takes more time to update. I get the data from https://bnonews.com/index.php/2020/01/the-latest-coronavirus-cases/ (they do an amazing job) but then encode everything manually, because of the way they format the KML file in the map it's almost impossible to scrape; everything is put in one field, "description" ... and each time I have to find the corresponding region, etc.
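To illustrate why that "everything in one description field" format is painful: the only option is to parse each Placemark's free-text blob with regexes. A sketch, where the sample KML and the "Confirmed cases: N. Deaths: N." wording are assumptions for illustration (BNO's real wording differed and changed over time):

```python
import re
import xml.etree.ElementTree as ET

# Minimal KML in the shape described above: all figures crammed into
# the Placemark's <description> field (illustrative, not real data).
sample_kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Bavaria, Germany</name>
      <description>Confirmed cases: 4. Deaths: 0. Source: local ministry.</description>
      <Point><coordinates>11.5,48.8,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

NS = {"kml": "http://www.opengis.net/kml/2.2"}
root = ET.fromstring(sample_kml)

rows = []
for pm in root.iterfind(".//kml:Placemark", NS):
    name = pm.findtext("kml:name", namespaces=NS)
    desc = pm.findtext("kml:description", default="", namespaces=NS)
    # Fish the numbers out of the free-text blob with regexes.
    cases = re.search(r"Confirmed cases:\s*(\d+)", desc)
    deaths = re.search(r"Deaths:\s*(\d+)", desc)
    rows.append((name,
                 int(cases.group(1)) if cases else None,
                 int(deaths.group(1)) if deaths else None))

print(rows)  # [('Bavaria, Germany', 4, 0)]
```

Any change to the free-text wording silently breaks the regexes, which is exactly why this source is fragile to scrape.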

It takes me between 15 and 30 minutes per day for now to update everything, but if cases start to shoot up everywhere outside China it will become a challenge. I intend to keep it updated as long as possible, but at some point it might be too much work for one person (and I might also move to an area with really low internet connectivity for work), so help would be really appreciated; I'll keep your offer in mind ;)

xbno commented 4 years ago

Awesome! I actually created a Python script to scrape the BNO map before they updated the symbols and categories and broke it, but it worked relatively well. Parsing the current case count from the title was way easier than parsing the description info. I'll fiddle with it to get it working again and send you a link.
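The title-parsing approach mentioned above can be sketched like this; the "Place (N cases)" title format is an assumption based on the description, not BNO's exact wording:

```python
import re

def count_from_title(title: str) -> tuple:
    """Split a marker title like 'Hubei, China (4,586 cases)' into
    (place, count). The title format is an illustrative assumption."""
    m = re.match(r"^(.*?)\s*\(([\d,]+)\s*cases?\)\s*$", title)
    if not m:
        raise ValueError(f"unrecognized title: {title!r}")
    place, raw = m.groups()
    return place, int(raw.replace(",", ""))

print(count_from_title("Hubei, China (4,586 cases)"))  # ('Hubei, China', 4586)
print(count_from_title("Bavaria, Germany (1 case)"))   # ('Bavaria, Germany', 1)
```

One regex against a short, structured title is far more robust than pattern-matching a multi-sentence description blob.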

martinedoesgis commented 4 years ago

Hi,

OK, sounds good, I'm thinking about doing something similar. The data from https://3g.dxy.cn/newh5/view/pneumonia gets messier every day; they keep changing the names and the list of cities in the provinces ...

I started extracting the KML automatically a few days ago and will try to get the cases/deaths from the title for each date I have, then manually build the history from the description (that part won't be fun, but once it's done the daily update will only take a few seconds). My only concern with the BNO data is that they cut off the reporting at midnight US time (at least in the timeline below the map), so Chinese cases for some provinces get reported on the day before or after. I haven't even checked whether that's the case in the KML too, but it might create some strange patterns; we'll see.
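The midnight-cutoff concern above is just a timezone offset: anything reported in the US evening already belongs to the next calendar day in China. A small sketch of the day shift (the specific timestamp is illustrative):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A case reported at 20:00 US Eastern on Feb 3, 2020 is already Feb 4
# in China, so a timeline cut at US midnight shifts Chinese provinces
# by a day relative to local reporting dates.
report = datetime(2020, 2, 3, 20, 0, tzinfo=ZoneInfo("America/New_York"))

us_date = report.date()
cn_date = report.astimezone(ZoneInfo("Asia/Shanghai")).date()

print(us_date.isoformat())  # 2020-02-03
print(cn_date.isoformat())  # 2020-02-04
```

Normalizing every timestamp to the reporting region's local date before building the history would avoid the strange one-day patterns.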

martinedoesgis commented 4 years ago

Hi Geoff,

If you're still interested in helping with the updates, please tell me. I don't think I'll be able to keep the data updated after this weekend; I'm heading to a place with a very low internet connection. In any case I'll set up a server to keep fetching the data every day, but updates of the map and the Google Sheet will be more irregular.

xbno commented 4 years ago

Hey, I'm certainly game. I see the BNO map has been putting multiple dates in and changing their format, so it doesn't look like the best resource anymore. I found a different source that is much more structured and consistent for Chinese cities. We'll have to scrape daily to stay updated, but please take a look here: https://github.com/xbno/ncov2019. I can modify the output to concat to a/your Google Sheet; I didn't look closely at the format of your sheet. The other thing is I couldn't figure out how to translate the page automatically with Google Translate... so it scrapes Chinese characters. I remember yours having both, so maybe you already have a solution for that.
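One workaround for the translation gap mentioned above, since province names are a small fixed set: a hand-built lookup table instead of automatic translation. A sketch with an illustrative subset (the full list is around 34 entries):

```python
# Manual Chinese-to-English lookup for province names (illustrative
# subset only; extend to cover all provinces/regions actually scraped).
PROVINCE_EN = {
    "湖北省": "Hubei",
    "广东省": "Guangdong",
    "浙江省": "Zhejiang",
    "河南省": "Henan",
}

def to_english(name: str) -> str:
    # Fall back to the raw Chinese string so unknown names are easy to spot.
    return PROVINCE_EN.get(name, name)

print(to_english("湖北省"))      # Hubei
print(to_english("西藏自治区"))  # not in the subset, passes through unchanged
```

Because the key space is tiny and stable, a dict is more reliable than calling a translation API on every scrape.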