merddyin / ADDeploy

Used to deploy components to support an ESAE forest and RBAC model via native control.
MIT License

Improve traceability #8

Open · PatrickOnGit opened this issue 1 year ago

PatrickOnGit commented 1 year ago

Data tables should contain timestamps to track changes. For instance, to automatically add a timestamp when information in AP_Rundata changes:

-- Change the TimeStamp default from 0 to CURRENT_TIMESTAMP by editing the stored
-- schema directly. Note this replaces every 'DEFAULT 0' in the AP_Rundata definition,
-- so it assumes TimeStamp is the only column declared that way; back up the database
-- first and reopen it afterwards so the modified schema is reloaded.
PRAGMA writable_schema = on;

UPDATE sqlite_master
SET sql = replace(sql, 'DEFAULT 0',
                       'DEFAULT CURRENT_TIMESTAMP')
WHERE type = 'table'
  AND name = 'AP_Rundata';

PRAGMA writable_schema = off;

-- Keep TimeStamp current whenever either value column changes.
CREATE TRIGGER UpdateLastTime
AFTER UPDATE OF OB_runvalue, OB_legacyvalue ON AP_Rundata
BEGIN
  UPDATE AP_Rundata SET TimeStamp = CURRENT_TIMESTAMP WHERE OB_item = NEW.OB_item;
END;
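
For a quick sanity check after creating the trigger (the item value below is just a placeholder; the column names follow the snippet above):

UPDATE AP_Rundata SET OB_runvalue = 'new value' WHERE OB_item = 'SomeItem';
SELECT OB_item, OB_runvalue, TimeStamp FROM AP_Rundata WHERE OB_item = 'SomeItem';
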
merddyin commented 1 year ago

It is something I have considered; however, it didn't make sense for the current iteration of the toolset due to the amount of work involved. Simply setting a single timestamp field wouldn't provide much value, as the file metadata already tells you the last time the database file was modified. To add real value, I'd need to track not just the date and time, but the old and updated values, the table, and who made the change.

This is further complicated by the fact that SQLite is a file-based database that ships with the module, so if you have multiple copies of the module, each would have its own distinct change data. The solution is a centralized DB that all copies of the module connect to, but that introduces other challenges in terms of securing the connection and the data. I have started the design work for this, which would include a secured API that the module could interface with. It's a nearly complete rewrite, however, so it wouldn't land as an update to this project.
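
As a rough sketch of the extra tracking described above (the AP_ChangeLog table and its columns are hypothetical, not part of the current schema), an audit table plus trigger in SQLite might look like the following. Note that SQLite has no notion of a calling user, so "who made the change" would have to be supplied by the module itself:

CREATE TABLE IF NOT EXISTS AP_ChangeLog (
  LogID     INTEGER PRIMARY KEY AUTOINCREMENT,
  TableName TEXT,
  ItemKey   TEXT,
  OldValue  TEXT,
  NewValue  TEXT,
  ChangedBy TEXT,                            -- must be filled in by the caller; SQLite cannot determine it
  ChangedAt TEXT DEFAULT CURRENT_TIMESTAMP
);

CREATE TRIGGER IF NOT EXISTS LogRundataChange
AFTER UPDATE OF OB_runvalue ON AP_Rundata
BEGIN
  INSERT INTO AP_ChangeLog (TableName, ItemKey, OldValue, NewValue, ChangedBy)
  VALUES ('AP_Rundata', NEW.OB_item, OLD.OB_runvalue, NEW.OB_runvalue, NULL);
END;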

PatrickOnGit commented 1 year ago

I do agree. Still, it would provide some hint that "somebody" or "something" changed values in the database, and one could then compare against a backup copy.

merddyin commented 1 year ago

Absolutely agree...I'm just not sure the dev effort required to make it useful makes sense at this time. My goal is to fix the major problem points with this version of the module, but then I want to move on to what I call 'vNext'. As indicated in other threads, I am currently in the design and prototype phase of that effort. It will have a centralized database, an API, and a web interface for configuration, but it will still leverage PowerShell as the implementation mechanism. A good portion of these functions will transition to the new version, but I'll likely need to refactor a lot of them. On the bright side, I'll be able to handle more of the business logic within the API, so the code should become simpler on the PowerShell side.

The current design for vNext tracks all changes to any table, retains old values, and associates every change with an individual, along with the date and time. This DB wasn't designed with all of that in mind, unfortunately, but I'm more than happy for you to contribute to this project if you have the cycles for it. I'll try to see about getting the updated code base committed in the next few weeks. I really need to buckle down and learn how to write Pester tests and factor my code accordingly...it would make updating the code base easier, but the concept of mocking with regard to AD objects is still a bit confusing to me.