Fixes: Issue TDP-207
The plugin had a bug where the capture source dialog during bulk load did not show the capture source; instead it was left blank, and no error was raised during bulk load. Later, during comparisons, because the bulk load could not refer to the correct capture source, it fell back to the first capture source in the list (ID = 1000), so buildings outside the bulk loaded extent were shown as removed.
To solve this problem, we first identified the features outside the capture extent that were incorrectly labelled for deletion, created a table of these, and imported it as a temporary table into the buildings_reference schema.
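The identification step could be sketched with a query along these lines. This is a sketch only: the extent table name (buildings_reference.capture_source_area), the geometry column names, and the area filter are assumptions, not the actual objects used.

-- Sketch: collect building outlines flagged as removed whose geometry
-- falls outside the bulk loaded capture extent.
-- Table and column names below are assumptions for illustration.
CREATE TABLE buildings_reference.area6_features_not_in_extent AS
SELECT e.building_outline_id
FROM buildings_bulk_load.existing_subset_extracts e
JOIN buildings_bulk_load.removed r USING (building_outline_id)
WHERE NOT ST_Intersects(
    e.shape,
    (SELECT shape
     FROM buildings_reference.capture_source_area  -- assumed extent table
     WHERE area_title = 'Area 6')                  -- assumed filter
);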
We then deleted these buildings from the two affected tables with this SQL:
BEGIN;
DELETE FROM buildings_bulk_load.removed r
WHERE r.building_outline_id IN (SELECT building_outline_id FROM buildings_reference.area6_features_not_in_extent);
DELETE FROM buildings_bulk_load.existing_subset_extracts e
WHERE e.building_outline_id IN (SELECT building_outline_id FROM buildings_reference.area6_features_not_in_extent);
COMMIT;
We then identified the root cause, applied the fix to the plugin, finished the comparisons, and ran publish; everything appeared to be OK.
During bulk load, as well as during comparisons, the plugin reads the capture source area from the bulk load dialog. If the bulk load dialog isn't correctly populated, the capture source can't be read later.
The fix forces the plugin to match the capture source code at the beginning of the dialog value, using Qt's string matching.
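A minimal sketch of the parsing change, in plain Python rather than Qt's matching flags (the function name and the dialog value format assumed here are hypothetical, not the plugin's actual API): the capture source code is anchored at the start of the dialog string instead of being searched for anywhere within it.

```python
import re


def capture_source_code_from_dialog(value):
    """Extract the capture source code from the start of a dialog entry.

    Hypothetical helper: assumes entries look like '1- NZ Aerial Imagery',
    i.e. a numeric code, then a separator, then the description.
    Returns None when the value is blank or does not start with a code,
    so a mis-populated dialog fails loudly instead of silently matching
    the wrong capture source.
    """
    match = re.match(r"\s*(\d+)\s*-", value)
    return match.group(1) if match else None


# Anchoring at the start avoids picking up digits that happen to
# appear later in the description text.
print(capture_source_code_from_dialog("1- NZ Aerial Imagery"))  # prints 1
print(capture_source_code_from_dialog(""))                      # prints None
```

The same idea can be expressed in Qt itself, e.g. by looking up the combo box entry with a starts-with match rather than a contains match.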
Note: Metadata is updated to include publish date for next dataset.
Tested on a linx restore of the 30 September 2022 prod DB, on a local Postgres 11 instance. We were not able to use Andrew's Docker Postgres 9.3 container successfully.