CityofToronto / bdit_volumes

Traffic Volumes Modelling Project

Create process for updating BDIT database with live FLOW data #22

Open aharpalaniTO opened 7 years ago

aharpalaniTO commented 7 years ago

Process likely to involve:

  1. Create links to FLOW (Oracle DB) within Postgres
  2. Identify new artery codes (see the SQL sketch after this list)
  3. Run new artery codes through some sort of QC process to ensure they are assigned to artery_tcl correctly
  4. Update all respective tables in Postgres (arterydata, cnt_det, det, etc.)
  5. Re-run count data checks developed in #23
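
A minimal sketch of what steps 1–2 could look like, assuming the oracle_fdw extension is available; the server name, credentials, remote schema/table, and column list below are illustrative assumptions, not the actual FLOW layout:

```sql
-- Step 1 (sketch): link to the FLOW Oracle DB from Postgres via oracle_fdw.
-- Connection details and remote object names are assumptions for illustration.
CREATE EXTENSION IF NOT EXISTS oracle_fdw;
CREATE SCHEMA IF NOT EXISTS flow;

CREATE SERVER flow_server FOREIGN DATA WRAPPER oracle_fdw
    OPTIONS (dbserver '//flow-host:1521/FLOW');

CREATE USER MAPPING FOR CURRENT_USER SERVER flow_server
    OPTIONS (user 'flow_reader', password 'changeme');

CREATE FOREIGN TABLE flow.arterydata_remote (
    arterycode bigint,
    location   text
) SERVER flow_server OPTIONS (schema 'TRAFFIC', table 'ARTERYDATA');

-- Step 2 (sketch): artery codes present in FLOW but not yet in our copy.
SELECT r.arterycode
FROM flow.arterydata_remote r
LEFT JOIN traffic.arterydata a USING (arterycode)
WHERE a.arterycode IS NULL;
```

The anti-join output would then feed the QC step (3) before any Postgres tables are updated.
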
aharpalaniTO commented 7 years ago

I will produce the initial delta tables + summaries, and will leave it to @sunnyqywang to determine the best approach. Most likely we'll look to refresh our version of the FLOW counts every month, if the process isn't too time-consuming.
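
As a rough illustration of the delta tables + summaries, building on the foreign table sketched above (prj_volume.arterydata_delta is a hypothetical name, not the actual table):

```sql
-- Snapshot the artery codes new to FLOW since the last refresh.
CREATE TABLE prj_volume.arterydata_delta AS
SELECT r.*
FROM flow.arterydata_remote r
LEFT JOIN traffic.arterydata a USING (arterycode)
WHERE a.arterycode IS NULL;

-- One-line summary of how much new data this refresh brings in.
SELECT count(*) AS new_artery_codes FROM prj_volume.arterydata_delta;
```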

aharpalaniTO commented 7 years ago

@aharpalaniTO has started to develop this process; it is still in progress. @sunnyqywang has updated the artery locations to reflect a more recent version of FLOW.

aharpalaniTO commented 7 years ago

@aharpalaniTO wrote a process for creating 5 tables.

aharpalaniTO commented 7 years ago

@aharpalaniTO and @sunnyqywang to discuss streamlining the process for future use.

sunnyqywang commented 7 years ago

@aharpalaniTO is loading new data from FLOW into the traffic schema, and @sunnyqywang's script processes only the difference between the updated tables and their clean versions in the prj_volume schema.
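
A minimal sketch of that difference computation, assuming identical column layouts in the two schemas; the EXCEPT approach is an assumed mechanism, not taken from the actual script:

```sql
-- Rows in the freshly loaded traffic copy that are absent from the clean
-- prj_volume copy; only these rows need to be (re)processed.
SELECT * FROM traffic.arterydata
EXCEPT
SELECT * FROM prj_volume.arterydata;
```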