InnoZ / MAS

analysis and (potentially) development of a multi-agent simulation for carsharing
Apache License 2.0

enable local test development without Mobility DataHub #32

Closed 00Bock closed 7 years ago

00Bock commented 8 years ago

To test developments remotely from the Mobility DataHub/playground, we need to set up a local test mirror of the relevant databases.

wese-da commented 8 years ago

Something like:

```
remote: pg_dump [db] > file
local:  psql -f file
```

for relevant databases (geodata, surveyed_mobility, ...)?
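Expanded into a full round trip, that workflow might look like the following sketch (plain-SQL format to match the `psql -f` restore above; file names are placeholders, and the commands assume access to the remote databases and a local PostgreSQL server):

```shell
# On a machine with access to the Mobility DataHub (remote):
pg_dump geodata > geodata.sql
pg_dump surveyed_mobility > surveyed_mobility.sql

# On the local test machine, after transferring the files:
createdb geodata
psql -d geodata -f geodata.sql
createdb surveyed_mobility
psql -d surveyed_mobility -f surveyed_mobility.sql
```

`pg_dump`'s custom format (`-Fc`, restored with `pg_restore`) would compress the output and allow selective restores, which may matter given the sizes discussed below.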

00Bock commented 7 years ago

@dhosse So in an email you said that you have made such a dataset. Can we share it?

wese-da commented 7 years ago

I'm just making a dump of the databases "surveyed_mobility" and "geodata". These should be sufficient for some tryouts.

00Bock commented 7 years ago

won't that be pretty big?

wese-da commented 7 years ago

Yes, it's a pretty big one.

Just for clarification: Is this issue related to or a duplicate of #50? Because we seem to discuss the same thing here and there, only #50 is explicitly about the development of a tool.

The way I understand it now: We want a local dump of the modelling-relevant databases (i.e. surveyed_mobility and geodata) to 1) test local changes and 2) have a showcase for customers to run automatic model runs locally.

I have set up a complete db dump on my machine at the moment. Was your intention to have an excerpt of the databases? Because then, I would understand the need for a "tool".

00Bock commented 7 years ago

yes. It's a duplicate. But I then changed it according to your email. I just want to do 1) and 2) as you defined above...

wese-da commented 7 years ago

Okay. But the thing is, I made a pg_dump of "geodata" and a pg_dump of "surveyed_mobility" and set them up locally. So, I have the whole databases on my laptop. That is what I originally tried to say in my email.

But you want a subset of both databases because it's too much data to send around the globe, right?

Because in this case, we should clarify what data we want to extract for test purposes, e.g. we only take geodata from Bayern and consequently only survey data of that region or region types.
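Extracting such a regional subset could be sketched like this (a hypothetical example only: the `districts` table is mentioned later in this thread, but the `region` column and the staging-table approach are assumptions, not confirmed schema):

```shell
# Copy a regional subset into a staging table...
psql -d geodata -c "CREATE TABLE districts_bayern AS SELECT * FROM districts WHERE region = 'Bayern';"

# ...then dump only that table
pg_dump -t districts_bayern geodata > geodata_bayern.sql
```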

00Bock commented 7 years ago

how big is it? owncloud? If too big, then let's stick to Osnabrück and Lower Saxony... sorry for the confusion

wese-da commented 7 years ago

Just checked my owncloud storage... Should be okay: "surveyed_mobility" is about 980 MB and "geodata" around 38 GB. It may take some time to upload, but there is enough free space. I will try to upload the dump files then; hopefully the upload size limit is not too small.

00Bock commented 7 years ago

It's 23:11... so I'll just check tomorrow!

wese-da commented 7 years ago

I managed to upload the surveys db to my owncloud. I had to compress it because the upload size is limited to ~540 MB. Which brings me to the next problem: the geodata dump is too big. Maybe I can cut the districts table down to Germany to reduce the size...
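If even the compressed file exceeds the ~540 MB limit, splitting the archive into chunks is another option (a sketch using the dump file names from above; these commands assume the dump file exists locally):

```shell
# Compress the dump, keeping the original (-k)
gzip -k geodata.sql

# Split the archive into chunks below the upload limit
split -b 500m geodata.sql.gz geodata.sql.gz.part-

# Reassemble and decompress on the receiving side
cat geodata.sql.gz.part-* > geodata.sql.gz
gunzip geodata.sql.gz
```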

00Bock commented 7 years ago

that would be great! but, hey, no priority...

wese-da commented 7 years ago

The files needed for local scenario generation are now in the pg_dumps folder of my owncloud. The OSM tables need to be inserted into the database via the 'copy' command because they had to be dumped as CSV files, and they are limited to Lower Saxony because it's just too much data otherwise.
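Importing one of those CSV dumps might look like this (a sketch; `osm_table` and the file name are placeholders, and `\copy` is psql's client-side variant of COPY, which reads the file with the client's permissions rather than the server's):

```shell
# Load a CSV-dumped OSM table into the local geodata database
psql -d geodata -c "\copy osm_table FROM 'osm_table.csv' WITH (FORMAT csv)"
```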

00Bock commented 7 years ago

works