kcigeospatial / MDOT-SHA-NPDES-Next-Gen

Code and issues related to the MDOT SHA NPDES Project. Project codes: Config = 31, Management = 32.

What happens after ETL 3 with more edits? #219

Closed: talllguy closed this issue 5 years ago

talllguy commented 6 years ago

Something that came up in the training with MES:

What happens after ETL 3 when there are more edits?

So, what happens if they keep working? Can they run ETL 3 again? Will ETL 1 overwrite their edits without confirmation? What happens to inspections and photos?

johnshiu commented 6 years ago

For BMP Inventory:

Teams can continue to run ETL 3 after the initial push. For example, they could do the following actions:

  1. Update field data.
  2. Run ETL 2, then 3. This will commit changes made in step 1 to the QC version.
  3. Update field data again.
  4. Run ETL 2, then 3. This will commit changes made in step 3 to the QC version; it will not re-push changes from step 1, as they have already been pushed.
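For anyone unfamiliar with what "commit changes to the QC version" typically involves in a versioned Esri geodatabase, here is a minimal arcpy sketch of a reconcile/post. The connection file and version names are placeholders, and the actual ETL scripts may implement this differently:

```python
import arcpy

# Placeholder connection file and version names; the real ETL scripts may
# do this differently.
sde = r"C:\connections\npdes.sde"

# Reconcile the firm's edit version against the QC version and post the
# changes, keeping the edit version so field work can continue.
arcpy.management.ReconcileVersions(
    input_database=sde,
    reconcile_mode="ALL_VERSIONS",
    target_version="SDE.QC",       # hypothetical QC version name
    edit_versions="KCI.KCI_1",     # hypothetical firm edit version name
    acquire_locks="LOCK_ACQUIRED",
    abort_if_conflicts="NO_ABORT",
    conflict_definition="BY_OBJECT",
    conflict_resolution="FAVOR_EDIT_VERSION",
    with_post="POST",
    with_delete="KEEP_VERSION",
    out_log=r"C:\logs\reconcile_post.log",
)
```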

One important note: if someone at SHA accesses the QC version and does not approve some of the changes during rec/post after step 2, there may be mismatches between the field and source versions. For example, in step 1 the team may erroneously create a new SWMFAC and push it into QC in step 2. In such a case, the SHA QCer may decide to delete that SWMFAC from the QC version.

This results in the SWMFAC no longer existing in the QC version but still existing in the field version. If they make an edit to it in step 3 and run the ETL in step 4, a commit error will be logged because an edit was attempted against a feature that no longer exists, and that errored feature will be stored for manual fixes later, if needed. To clarify, while this error is logged, the commit for all other feature edits still goes through, and the 'Processing' in the Web UI still completes without any issues.

This is quite a fringe case, and only really happens when there is QCing going on (and changes made to the QC version) while the users are still making changes in the field. It should be easily mitigated by communication between the field team and the SHA QCer who's doing the rec/post.
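To make the "errored feature" behavior concrete, here is a rough sketch (not the actual ETL code) of how an edit can be logged and skipped when its target feature no longer exists in QC. The table path, field names, and change-set structure are all hypothetical:

```python
import arcpy

# Hypothetical paths, fields, and change set; the real ETL uses its own format.
qc_swmfac = r"C:\connections\npdes_qc.sde\SWMFAC"
field_edits = [
    {"GlobalID": "{00000000-0000-0000-0000-000000000001}", "STATUS": "ACTIVE"},
]

errored = []  # edits whose QC feature is gone; kept for manual fixes later
for edit in field_edits:
    where = "GlobalID = '{}'".format(edit["GlobalID"])
    matched = False
    with arcpy.da.UpdateCursor(qc_swmfac, ["STATUS"], where) as cursor:
        for row in cursor:
            row[0] = edit["STATUS"]
            cursor.updateRow(row)
            matched = True
    if not matched:
        # The SWMFAC was deleted from QC during rec/post, so log the edit
        # and keep going; the rest of the commit still completes.
        errored.append(edit)
```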

johnshiu commented 6 years ago

For BMP Inspections:

When the field user initiates the submission of BMP inspections, the following occurs:

  1. The BMP inspections tables and file attach tables are updated in the QC version with the data from the Survey123 field tables.
  2. The inspection photos are kept in the database as BLOBs, and the filenames/Web UI request IDs are stored in a separate table. This allows users to download the photos through the Web UI after the commit (see the sketch after this list).
  3. The BMP inspections in the field are then "Deleted".
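As an aside on step 2, pulling geodatabase attachments back out as files follows a standard pattern: the attachment table's DATA column holds the photo BLOB and ATT_NAME holds the original filename. A small sketch, with the attachment table and output folder as placeholders (the Web UI does its own bookkeeping with the request IDs):

```python
import os
import arcpy

# Placeholder attachment table and output folder.
attach_table = r"C:\connections\npdes_qc.sde\BMP_INSPECTIONS__ATTACH"
out_dir = r"C:\exports\inspection_photos"

# DATA holds the photo BLOB; ATT_NAME holds the original filename.
with arcpy.da.SearchCursor(attach_table, ["DATA", "ATT_NAME"]) as cursor:
    for blob, name in cursor:
        with open(os.path.join(out_dir, name), "wb") as out_file:
            out_file.write(blob.tobytes())
```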

For step 3, this is not a hard delete but a soft one: the inspections are simply marked as deleted using the GDB_TO_DATE field. This hides the inspection from the Web UI as well as from the feature service, so it is no longer accessible. Therefore, the field team should make sure the inspection data is as close to 100% complete and correct as possible before submitting.

Remember, once it gets pushed to source, there is no way for them to edit it again. Even if it goes through and gets committed, the BMP inspections table is not replicated into the field, so the only way to edit an existing inspection is locally at SHA (unless we do something like expose the inspections table via a feature service, which I do not recommend at this late stage).
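A rough sketch of what that soft delete amounts to (the actual script may differ; the table path, IDs, and timestamp handling here are placeholders):

```python
import datetime
import arcpy

# Placeholder table and inspection IDs.
field_insp = r"C:\connections\npdes_field.sde\BMP_INSPECTIONS"
submitted = ["{00000000-0000-0000-0000-000000000001}"]

where = "GlobalID IN ({})".format(", ".join("'{}'".format(g) for g in submitted))

# Stamp GDB_TO_DATE so the rows are hidden from the Web UI and the feature
# service; the data itself stays in the table (no hard delete).
with arcpy.da.UpdateCursor(field_insp, ["GDB_TO_DATE"], where) as cursor:
    for row in cursor:
        row[0] = datetime.datetime.utcnow()
        cursor.updateRow(row)
```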

Please note that, if needed, I can manually go in and restore an inspection and all its data, as nothing in the inspections tables is fully deleted. This might be useful if something goes wrong in the script, for example.
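For what it's worth, the restore is essentially the inverse of the soft delete above: clear the GDB_TO_DATE stamp for the inspection in question. Again, the path and ID are placeholders, and I would only do this carefully and manually:

```python
import arcpy

# Placeholder table and inspection ID.
field_insp = r"C:\connections\npdes_field.sde\BMP_INSPECTIONS"
where = "GlobalID = '{00000000-0000-0000-0000-000000000001}'"

# Clearing GDB_TO_DATE makes the soft-deleted inspection visible again.
with arcpy.da.UpdateCursor(field_insp, ["GDB_TO_DATE"], where) as cursor:
    for row in cursor:
        row[0] = None
        cursor.updateRow(row)
```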

talllguy commented 6 years ago

Thanks, John! That first BMP Inventory explanation, edge case included, seems logical to me. If the firm wants to, they could prepare a final submittal once per day, week, fortnight, month, etc., to give SHA something to review more frequently. Besides the deleted feature case, the firm could lose data if they pulled a new source via ETL 1 while other firm editors were still editing that data. Will it be up to them to ensure they don't have any unsubmitted data in the field schema before ETL 1 runs?

johnshiu commented 6 years ago

I don't think we've really established how ETL 1 will run. As of right now, only I am able to execute it manually from my dev box. Eventually, it should be set to run either automatically (nightly?) or manually by the SHA QCer.

I should also make the distinction that ETL 1 updates the base version of the field geodatabase; by itself, it does not clear anyone's field version. The actual clearing of the version is done by deleting the version (i.e. deleting KCI_1) and recreating it off the base version that ETL 1 keeps up to date. I have been doing this manually in ArcCatalog for now, as it's quite easy to do.
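If we ever do script it, the ArcCatalog step boils down to two geoprocessing calls. The connection file, fully qualified version names, parent version, and permission level below are placeholders:

```python
import arcpy

# Placeholder admin connection; version names follow the comment above.
sde_admin = r"C:\connections\npdes_field_admin.sde"

# Drop the firm's field version, then recreate it from the base version that
# ETL 1 keeps current ("sde.DEFAULT" stands in for whatever the base version
# actually is). Deleting a version is irreversible.
arcpy.management.DeleteVersion(sde_admin, "KCI.KCI_1")
arcpy.management.CreateVersion(sde_admin, "sde.DEFAULT", "KCI_1", "PROTECTED")
```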

Eventually, I suppose this version deletion/recreation should probably be put into another UI so someone doesn't need to manually do it, but I'm hesitant as it is VERY dangerous to automate. Once a version is gone, it's gone.

talllguy commented 5 years ago

@johnshiu Did we ever establish an update protocol for ETL 1?

talllguy commented 5 years ago

ETL 1 protocol established; closing this out. ETL 1 is run as needed after we do a major rec/post.