simont77 / fakegato-history

Module to emulate Elgato Eve history
MIT License

Historical import of measurements? #48

Closed mylesagray closed 6 years ago

mylesagray commented 6 years ago

I have a bunch of measurements, along with their UNIX timestamps, from the API I query for stats in my air purifier plugin. Is loading historical records via addEntry supported?

I have about 3k records that would need to be imported; is there a specific bulk loader?

I was considering looping over all the measurements and calling addEntry on each iteration. Is there a way to push these directly into the history, or will each call have to wait for the 10-minute poll?

simont77 commented 6 years ago

There is no loader, and what you ask is not included functionality. You could try the “private” _addEntry function instead of the “public” addEntry; it bypasses the timers and averaging, but I cannot guarantee it will work without issues.
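To illustrate the approach being discussed, here is a minimal sketch of bulk-loading timestamped records via _addEntry. The loggingService below is a stub standing in for a real fakegato-history service (with the real module you would call _addEntry on the instance returned by its constructor), and the records are hypothetical [unixTimestamp, temperatureC] pairs:

```javascript
// Stub standing in for a fakegato-history service instance; the real
// _addEntry also handles Eve's time encoding and persistence.
const loggingService = {
  entries: [],
  _addEntry(entry) { this.entries.push(entry); }
};

// Hypothetical historical records: [unixTimestamp, temperatureC] pairs.
const records = [
  [1519948800, 21.5],
  [1519949400, 21.7],
  [1519950000, 21.6]
];

// Push each record straight into the history, bypassing the timer/averaging
// path that the public addEntry would normally go through.
for (const [time, temp] of records) {
  loggingService._addEntry({ time, temp });
}

console.log(loggingService.entries.length); // 3
```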

ebaauw commented 6 years ago

Did you have a look at the persist files that fakegato-storage creates? I imagine it would be relatively easy to write a small programme that converts your log file to the proper JSON, so fakegato storage can simply load it on startup.
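As a rough sketch of that conversion idea: the snippet below turns a CSV-like measurement log into an array of entry objects. The exact persist-file schema fakegato-storage uses is an internal detail, so the field names here ("time", "temp", "humidity") are placeholders that would need to be matched against a real persist file:

```javascript
// Convert a CSV-like log ("unixTime,temp,humidity" per line) into an array
// of entry objects that a startup routine could serialize to JSON.
function logToEntries(logText) {
  return logText
    .trim()
    .split('\n')
    .map(line => {
      const [time, temp, humidity] = line.split(',').map(Number);
      return { time, temp, humidity };
    });
}

// Hypothetical two-line log.
const sample = '1519948800,21.5,40\n1519949400,21.7,41';
const entries = logToEntries(sample);
console.log(JSON.stringify(entries[0]));
// {"time":1519948800,"temp":21.5,"humidity":40}
```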

mylesagray commented 6 years ago

I was hoping I could get the integration to do it automatically per user so there would be no file-system dependencies. I will try the private _addEntry function and see if I can bulk-load that way.

I'm hoping that, since they're all timestamped, the Eve app will sort by timestamp rather than by position in the history array.

mylesagray commented 6 years ago

Just to let you know, I've used the private _addEntry() method and it worked beautifully.

For reference, this.historicalmeasurements is a 2D array formatted as such:

[
  [timestamp],
  [particulate matter density],
  [temperature],
  [humidity],
  [CO2 in ppm],
  [VOC],
  [all pollution as a %]
]

Each value at a given index in the 2D array correlates with the values at the same index in the other rows; for example, this.historicalmeasurements[0][123] and this.historicalmeasurements[3][123] were measured at the same time. So when I iterate over the array using the loop index, I know all the characteristics line up:

// Load historical measurements from the API into the Eve history
for (let i = 0; i < this.historicalmeasurements[0].length; i++) {
  this.loggingService._addEntry({
    time: this.historicalmeasurements[0][i],
    temp: this.historicalmeasurements[2][i],
    humidity: this.historicalmeasurements[3][i],
    ppm: this.historicalmeasurements[4][i]
  });
}

I'm just going to add some safety checks so it only backfills the history once before I commit it to my plugin. The current working version can be seen here: https://github.com/mylesgray/homebridge-blueair/commit/811421eab4e0618f52c6ca22bdfc96058ccd8649

https://dl.dropboxusercontent.com/s/kvge7vu3hsf40ss/Photo%2007-03-2018%2C%2000%2045%2003.png?dl=0

mylesagray commented 6 years ago

Trying to refine this a bit, I have reverted to using the public addEntry() method and simply set disableTimer: true in the setup.
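For context, a setup along these lines is presumably what is meant; this is a sketch assuming the usual fakegato-history constructor shape, where "homebridge" and "this" (the accessory) come from the surrounding plugin code and are not defined here:

```javascript
// Sketch only: constructor arguments and option names should be checked
// against the fakegato-history README for the version in use.
const FakeGatoHistoryService = require('fakegato-history')(homebridge);

this.loggingService = new FakeGatoHistoryService('room', this, {
  storage: 'fs',
  disableTimer: true  // push entries yourself instead of the 10-minute poll
});
```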

It downloads all the values and pushes them in fine, but there seems to be strange behaviour when it saves to the persistence file, as if many processes were all editing the same file at once.

Does the public (or even private) addEntry() method not wait until the file has been successfully written to before returning?

It seems that on the initial ingestion of data the file is written fully, but then n save processes are queued, one per addEntry call, and these all need to execute before the history is pushed to the Eve app?

This behaviour would probably also occur even if a few entries were pushed to addEntry in quick succession.
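One general workaround for this kind of race, sketched below under the assumption that the persistence call is async: chain each save behind the previous one with a promise queue, so rapid back-to-back writes cannot interleave. The writeFile function here is a fake async write standing in for whatever file I/O the real persistence layer does:

```javascript
// Promise chain that serialises saves: each write waits for the last.
let pending = Promise.resolve();
const log = [];

function saveSerialised(data) {
  pending = pending.then(() => writeFile(data));
  return pending;
}

// Fake async write standing in for real file I/O.
function writeFile(data) {
  return new Promise(resolve =>
    setTimeout(() => { log.push(data); resolve(); }, 0));
}

// Fire three "saves" back to back; they still complete in order.
saveSerialised('a');
saveSerialised('b');
saveSerialised('c').then(() => console.log(log.join(''))); // abc
```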

simont77 commented 6 years ago

Are you pushing data with addEntry at a short interval? All the persistence code was designed assuming a rate of at least a few seconds between entries. I cannot rule out glitches if the rate is much faster. However, @NorthernMan54 is working on event-based persistence that I hope will fix all of these race-related issues.

mylesagray commented 6 years ago

@simont77 No delay at all, just dumping the data in as fast as the loop executes. Event-based persistence would be awesome; glad to hear something is in the works. Backfilling histories would be useful for a few plugins I'm using, but at the moment I can't work around these problems without editing fakegato-history itself.