hackforwesternmass / frcog-private-wells


Change script that generates SQL imports #2

Closed markhildreth closed 11 years ago

markhildreth commented 11 years ago

Currently, the SQL import script is generated using the PostgreSQL database on Heroku. The schema has changed slightly (different table names, column names, etc.). Change the script so it can insert into the new Django-generated schema.

Try to run the resulting SQL against the local Django database and make sure it works.
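(For context, the kind of script this describes reads the exported CSV and emits INSERT statements targeting the new schema. A minimal sketch, using hypothetical file, table, and column names rather than the project's actual ones:)

```python
import csv

# Hypothetical file, table, and column names -- adjust to match the
# actual Django-generated schema (e.g. the real "wells_*" tables).
TABLE = "wells_well"
COLUMNS = ["town", "well_type", "depth_ft"]

def quote(value):
    """Render a CSV cell as a SQL literal, escaping single quotes."""
    if value is None or value == "":
        return "NULL"
    return "'" + value.replace("'", "''") + "'"

with open("wells.csv", newline="") as src, open("wells.sql", "w") as out:
    for row in csv.DictReader(src):
        values = ", ".join(quote(row.get(col)) for col in COLUMNS)
        out.write("INSERT INTO %s (%s) VALUES (%s);\n"
                  % (TABLE, ", ".join(COLUMNS), values))
```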

hjw commented 11 years ago

There is a way to point Django's models at legacy db table names. I am doing that now, so we can probably leave the old db schema as is.
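(For reference, this is done with the model's inner Meta class; a minimal sketch, assuming hypothetical legacy table and column names:)

```python
from django.db import models

class Well(models.Model):
    # Map Django field names onto the legacy column names
    town = models.CharField(max_length=100, db_column="TOWN")
    depth_ft = models.IntegerField(null=True, db_column="DEPTH")

    class Meta:
        # Point this model at the existing legacy table instead of the
        # default Django-derived name (which would be "appname_well")
        db_table = "LEGACY_WELLS"
```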

Closing this issue until I run into an insurmountable problem or hear from Mark that pointing at legacy table naming schemes will cause headaches down the line.

russpitre commented 11 years ago

Should I stop working on this issue?

hjw commented 11 years ago

Hi Russ. The django model code has been modified to handle the legacy table naming scheme, and a CSV import module has been added to the admin module. The CSV module works with the munis and welltypes files, and will probably work with the wells csv once a correction regarding required fields is made to the model definition.
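(For anyone following along, the CSV import boils down to reading each row and creating model instances; a minimal sketch with hypothetical model and column names, not the actual code in the admin module:)

```python
import csv
from wells.models import Well, WellType  # hypothetical model names

def import_wells_csv(path):
    """Create one Well per CSV row, resolving the well type by name."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            well_type, _ = WellType.objects.get_or_create(name=row["WELL_TYPE"])
            Well.objects.create(
                town=row["TOWN"],
                well_type=well_type,
                depth_ft=row["DEPTH"] or None,  # empty string -> NULL
            )
```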

So, yes, I think you can stop working on this issue. The current top priority issue is to get the well data out of the Heroku db and importable into the Django db (which will be moved to Heroku once import is working). Dan added geocoding data to the Heroku db and we don't want to lose it by importing the original xl-converted-to-csv data.

If you'd like to dig into Django a bit, you could take a look at the issue #12 thread and add null=True in the appropriate places in wells/models.py (see the sketch below). I won't be putting any more time in on this today. Mark said he might have time to work on it tonight, but if it's already done by the time he takes a look, he'll move on to the next thing.
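(A minimal sketch of what that change looks like, with hypothetical field names; null=True allows NULL at the database level, and blank=True additionally relaxes form/admin validation if that's wanted:)

```python
from django.db import models

class Well(models.Model):
    town = models.CharField(max_length=100)  # still required

    # Optional columns: allow NULL so CSV rows with missing values
    # can still be imported
    depth_ft = models.IntegerField(null=True, blank=True)
    date_drilled = models.DateField(null=True, blank=True)
```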

dgarant commented 11 years ago

If we can import wells into the Django DB, we can probably just kill/overwrite the Heroku database. The geocoding data that I added isn't very valuable, as I basically just have to start a script and it should be re-populated in a few minutes.

hjw commented 11 years ago

I've already dumped a CSV file of the well data from the heroku db and saved it to the django branch.

Russ could keep modifying his scripts, and write more scripts to take the CSV, turn it into SQL statements, and then import them, but being able to directly import CSV files seems like a better solution for Glenn. He may want to add data for other towns, and he'll get that data as CSV or a spreadsheet. Clicking a button in the admin interface gives him an easy way to pull in the new data. Glenn already has the hack of the fusion tables to provide an immediate view of his data; I'm hoping we can provide him a more long-term solution. But maybe I'm overthinking things, or misunderstanding something?


dgarant commented 11 years ago

Yep, my point was just that if everything is working with the Django schema (imports etc.), there may not be a need to get Django 'in sync' with Heroku. We can let Django create/update the Heroku database as it sees fit, as the data we have in there can always be re-imported.