osmbe / road-completion-old

This repository contains all the code needed to compare open-data road datasets to OSM data.

Set up a server where this runs every day. #13

Open xivk opened 7 years ago

xivk commented 7 years ago

Set up a server where the comparison script runs daily for the available sources.

xivk commented 7 years ago

@jbelien Do you have a server available? If so, can you email me the details?

jbelien commented 7 years ago

You can still use the server; I already sent you the details. I'm also using it for something related to OpenLabs Albania, but it should work :) The server is paid for until the end of the month!

xivk commented 7 years ago

Thanks, managed to log on... doing my thing now.

xivk commented 7 years ago

@jbelien This line doesn't work on the server:

https://github.com/osmbe/road-completion/blob/master/sources/osm/get-data.sh#L8

In the folder ./sources/osm I get this error:

ubuntu@road-completion:~/work/road-completion/sources/osm$ sudo ogr2ogr -f GeoJSON -select name,highway -where "highway is not null" -nln osm_highways -progress belgium.geojson belgium-latest.osm.pbf lines
Warning 1: Input datasource uses random layer reading, but output datasource does not support random layer writing
0.ERROR 1: Cannot copy /vsimem/osm_importer/osm_temp_nodes_0xae8110 to ./osm_tmp_nodes_14748_1
..10...20...30...40...50...60...70...80...90...100 - done.
xivk commented 7 years ago

I know why:

ERROR 1: Cannot write in temporary node file ./osm_tmp_nodes_14776_1 : No space left on device
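Besides freeing disk space, GDAL's OSM driver can be pointed elsewhere: it caches node locations in memory up to the `OSM_MAX_TMPFILE_SIZE` config option (in MB) and spills the rest to temp files, whose location respects `CPL_TMPDIR`. A sketch of how that could be applied here; `/mnt/bigdisk/tmp` is a placeholder path, not something that exists on this server:

```shell
# Sketch, not tested on this server: send GDAL temp files to a volume with
# free space, and raise the in-memory node cache (MB) so the OSM driver
# spills to disk less often. Both are documented GDAL config options.
export CPL_TMPDIR=/mnt/bigdisk/tmp        # placeholder: any dir with room
export OSM_MAX_TMPFILE_SIZE=1000          # node cache size in MB

# Same command as above; guarded so it only runs when the tool and the
# input file are actually present (use `sudo -E` to keep the env vars
# if root is needed).
if command -v ogr2ogr >/dev/null && [ -f belgium-latest.osm.pbf ]; then
  ogr2ogr -f GeoJSON -select name,highway -where "highway is not null" \
    -nln osm_highways -progress belgium.geojson belgium-latest.osm.pbf lines
fi
```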
xivk commented 7 years ago

Can you see what can be cleaned up? Not sure what I can delete... ;-)

xivk commented 7 years ago

I cleaned up the old data folder, we're fine for now but this will happen again I fear.
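To see it coming next time, standard coreutils are enough to find what is eating the disk; the `~/work/road-completion` path is taken from the session output above:

```shell
# Overall usage of the filesystem that filled up
df -h /

# Largest directories under the work tree, biggest first
# (2>/dev/null hides permission-denied noise)
du -h --max-depth=2 ~/work/road-completion 2>/dev/null | sort -hr | head -n 15
```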

jbelien commented 7 years ago

As long as you don't touch ~/openlabs-geoportal/, you can probably delete anything you want :) But yes, it's a small server (10 GB) so it will probably happen again.

gplv2 commented 6 years ago

In case I can help, let me know. I can add Terraform code to this project so you can launch a server, get the stuff installed, build, and tear it down once you retrieve the data off it. It's economical: you can buy a bigger box as long as you don't forget to tear it down. For example, it cost me 40 euros in the past month to build GRB stuff on a machine with 28 GB of RAM and 3 disks. I'm planning to automate GRB, but I could also add building the Agiv/CRAB stuff (in fact I will) every month; it only takes a few hours. I then ship the data to a permanent server. We could synchronize this effort every month. I'm willing to share this resource if I can delegate access decently. Just let me know when you're pressed for resources.
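The disposable-server workflow described above could be sketched as a small wrapper around the Terraform CLI. Everything here is hypothetical: `main.tf`, the `ip` output, the `process.sh` build script, and the rsync target are assumed names, none of which exist in this repository.

```shell
#!/bin/sh
# Hypothetical lifecycle for a throwaway build box: bring it up, run the
# comparison, pull the results off, and tear it down so it stops costing
# money. Assumes a Terraform config (main.tf) with an "ip" output.
set -e

if ! command -v terraform >/dev/null || [ ! -f main.tf ]; then
  echo "sketch only: terraform and a main.tf are assumed, not provided here"
else
  terraform init
  terraform apply -auto-approve

  # Destroy the box no matter how the build below goes.
  trap 'terraform destroy -auto-approve' EXIT

  IP=$(terraform output -raw ip)
  ssh "ubuntu@$IP" 'cd road-completion && ./process.sh'   # placeholder script
  rsync -avz "ubuntu@$IP:road-completion/output/" /srv/road-completion/
fi
```

The `trap ... EXIT` matters: with `set -e`, a failed build would otherwise skip the `destroy` step and leave the server running (and billing).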