jertel / vuegraf

Populate metrics from your Emporia Vue energy monitoring devices into an InfluxDB

[Feature Request] Pull historical data on first run #27

Closed: odannyc closed this issue 1 year ago

odannyc commented 3 years ago

As a user of vuegraf, I want the option to pull all historical data so that I can analyze data from before datetime.now().

MichaelMedford commented 2 years ago

The ability to ingest up to seven days of one-minute data, the maximum retained by Emporia, is available in this PR: https://github.com/jertel/vuegraf/pull/88

If this PR is accepted, I would be happy to create another PR for ingestion of historical one-hour data. Emporia currently retains one-hour data for the lifetime of the device.

jertel commented 2 years ago

This is now implemented thanks to PR #88. However, it's not automated. It requires a manually set configuration parameter, and care must be taken to ensure the parameter is disabled or removed after the import completes so that historic data is not re-imported and overlapped with real-time collected data. See the README for more information. I'll leave the issue open for now, for additional comments or suggestions. As the above post mentions, it currently only imports the most recent 7 days. More work will be needed to support longer-term imports.

xmorand commented 2 years ago

Looks like the reason this is broken for some users (the 400 Bad Request response) is that Emporia limits the window you can query. 24-hour chunks seem to exceed that limit, which appears to be around 11 hours or so. The code should be modified to request something like 12-hour chunks (14 of them if you want 7 days).
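As a rough illustration of that chunking, here is a minimal sketch. The `fetch_usage(start, end)` callable is a hypothetical stand-in for whatever single-window Emporia query Vuegraf actually performs, and the 12-hour chunk size simply follows the suggestion above:

```python
from datetime import datetime, timedelta, timezone

CHUNK = timedelta(hours=12)  # stay under Emporia's roughly 11-hour query window limit


def fetch_history(fetch_usage, days=7):
    """Fetch `days` of history in 12-hour chunks.

    `fetch_usage(start, end)` is a hypothetical callable returning the
    usage datapoints for a single window; swap in the real API call.
    """
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days)
    points = []
    while start < end:
        stop = min(start + CHUNK, end)
        points.extend(fetch_usage(start, stop))
        start = stop
    return points
```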

dakegg commented 1 year ago

Just setting this up for the first time. Running Influx v1 and getting the following error when trying to import historical data on first run ...

2023-01-25 20:03:14.722862 | INFO | Submitting datapoints to database; account="Primary Residence"; points=256869
2023-01-25 20:03:27.397515 | ERROR | Failed to record new usage data: (<class 'influxdb.exceptions.InfluxDBClientError'>, InfluxDBClientError('413: {"error":"Request Entity Too Large"}'), <traceback object at 0x7ff76fea3c80>)
Traceback (most recent call last):
  File "/opt/vuegraf/vuegraf.py", line 284, in
    influx.write_points(usageDataPoints)
  File "/usr/local/lib/python3.11/site-packages/influxdb/client.py", line 603, in write_points
    return self._write_points(points=points,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/influxdb/client.py", line 681, in _write_points
    self.write(
  File "/usr/local/lib/python3.11/site-packages/influxdb/client.py", line 413, in write
    self.request(
  File "/usr/local/lib/python3.11/site-packages/influxdb/client.py", line 378, in request
    raise InfluxDBClientError(err_msg, response.status_code)
influxdb.exceptions.InfluxDBClientError: 413: {"error":"Request Entity Too Large"}

jertel commented 1 year ago

InfluxDB is rejecting the input, saying it's too much data. If you're hosting your own InfluxDB you could look into adjusting the max request size. If you're using a cloud-hosted InfluxDB you will need to either reduce the amount of history you import, or modify Vuegraf to split the write into multiple smaller writes.
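For anyone modifying Vuegraf locally, a minimal sketch of the split-write approach (the `write_in_chunks` helper and the 5000-point chunk size are illustrative, not part of Vuegraf):

```python
def write_in_chunks(influx, points, chunk_size=5000):
    """Write points to InfluxDB in several smaller requests.

    Splitting the list keeps each HTTP body under the server's
    max-body-size limit and avoids the 413 response above.
    """
    for i in range(0, len(points), chunk_size):
        influx.write_points(points[i:i + chunk_size])
```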

dakegg commented 1 year ago

Thanks. I'll look into the max request size; it's a local InfluxDB.

gauthig commented 1 year ago

Just created a merge request that fixes this problem by passing batch_size=5000 to influx.write_points(usageDataPoints, ...). It seems to only be a problem with InfluxDB v1, not v2.

@jertel, if you're not ready to take all the items in the merge request, consider at least adding this option to the current code base.
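For context, the InfluxDB v1 Python client's write_points accepts a batch_size argument that splits the list into multiple HTTP requests internally. A sketch of the change described above (connection details and the empty points list are illustrative placeholders, not Vuegraf's actual configuration):

```python
from influxdb import InfluxDBClient

# Illustrative connection; real values come from the Vuegraf config file.
influx = InfluxDBClient(host='localhost', port=8086, database='vuegraf')

usageDataPoints = []  # the large list of points built during a historical import

# With batch_size set, the v1 client sends the points in multiple HTTP
# requests of at most 5000 points each, which avoids the 413 response.
influx.write_points(usageDataPoints, batch_size=5000)
```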