Closed by mcwieger 4 months ago
The history is kept in the `statistics` and `statistics_short_term` tables and not the `states` table. The script assumes that there is already data in the `statistics_short_term` table and only corrects it; it does not create entries in the `statistics_short_term` table.
> Can you share the SolarEdge created data (CSV files)?

elec_solar_high_resolution.csv

> Can you share the SolarEdge entity definition from the `statistics_meta` table?

I see now that it is actually in Wh, so I made a mistake there (I set the conversion to 1.0, not to 1000.0). Still, it's not a multiple of 1000 that's the issue...

> Was there already SolarEdge data in Home Assistant?

Yes there was. I made sure that the CSV input ended where the original data started.

> If not, how did you create the entity?

N/A

> If so, was the original data correct?

Yes it was/is.

> What are the time periods that you selected in the Energy dashboard?

I looked at it from different angles: YTD, yearly, monthly. I compared the data in the first days of the upload (January 2017) and those actually match the CSV. There are also moments that are displayed correctly (e.g. a defective inverter which caused 0 production in a particular month), but for many days the production just seems flattened somehow (production appears to be stable year round, which of course is incorrect).
> The history is kept in the `statistics` and `statistics_short_term` tables and not the `states` table. The script assumes that there is already data in the `statistics_short_term` table and only corrects it and does not create entries in the `statistics_short_term` table.

So removing it from `statistics` would be an option, and then recreating it with lifetime CSV input.
> So removing it from `statistics` would be an option, and then recreating it with lifetime CSV input.
Yes, just remove the newly inserted records from the `statistics` table. You have to select the records based on the `start_ts` column: the value in the `start_ts` column has to be less than the max timestamp of your CSV file.
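The cleanup described above can be sketched as a single statement. This is only a sketch, assuming (as in the export query later in this thread) that `metadata_id` 56 is the SolarEdge sensor; the timestamp is a placeholder you must replace with the actual max `start_ts` of your imported CSV rows.

```sql
/* Sketch only -- back up the database first.
   56        : the SolarEdge sensor's metadata_id (check statistics_meta).
   1672531200: placeholder for the max start_ts covered by the imported CSV. */
DELETE FROM statistics
WHERE metadata_id = 56
  AND start_ts < 1672531200;
```

Run a `SELECT` with the same `WHERE` clause first to verify exactly which rows will be removed.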
You can also just activate the backup database and start over and then just do the P1 data ;-)
Can you create an export of your SolarEdge data from the `statistics` table?

```sql
SELECT * FROM statistics WHERE metadata_id=56 ORDER BY start_ts
```

Right click on the result grid and press `Export grid rows`. Select `File`, `Excel CSV`, and clear the box for `NULL value`.
There you go. Looks like a strange cut between the imported data and the data already in the db.
Thank you for the data.
I ran your data against the script and could reproduce the issue. Because the sensor is in Wh and not kWh, the variables should also have been increased by a factor of 1000.
Below are the correct values to be used. Because the variables are global, they are applied to all the data, so you have to make sure that all the imported data in one run use the same factor (input Wh, sensor Wh). Use multiple runs in case you have different factors. In the next version of the script I will make it possible to specify it per sensor.
```sql
INSERT INTO VARS VALUES ('cutoff_new_meter', 25000);       /* Change this in case your new meter started with a higher start value (especially when the unit_of_measurement is not kWh!) */
INSERT INTO VARS VALUES ('cutoff_invalid_value', 1000000); /* Change this in case a higher/lower diff cutoff is needed to mark a value invalid */
```
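For comparison, a minimal sketch of the equivalent values for a sensor registered in kWh, derived from the factor-of-1000 reasoning above (the Wh values divided by 1000; verify against the script's own examples before use):

```sql
/* Sketch: equivalent cutoff values for a kWh sensor,
   i.e. the Wh values above divided by 1000. */
INSERT INTO VARS VALUES ('cutoff_new_meter', 25);
INSERT INTO VARS VALUES ('cutoff_invalid_value', 1000);
```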
I just released version 2.1 of the scripts. In this version it is possible to specify the correction and cutoff values per sensor. I also included examples of how the values should be set depending on the unit of measurement.
It's working now. And the explanation of the cutoff also helps. To make it absolutely "idiot proof" you could add one or two examples of a combination (so Wh to Wh is always 1.0/25000/1000000).
FYI: I ultimately removed all solar info during the evening, when there was no solar activity and ran the full history import at that time. Works perfectly, all year totals match those of the SolarEdge portal.
Thanks for your help!
Perfect! Enjoy looking at your historical data
Very much appreciate the work done on the MariaDB script, thanks for that.
Just finished the load with the new script and I have two findings:
The solar energy should be quite easily repairable though: do you think I can simply remove all entries from the STATES table where ENTITY_ID is the SolarEdge entity? If so, I will just upload the entire history up until today.