Recently, we've seen that the report JSON for some of the SURFRAD sites is over 200 MB, consisting almost entirely of processed data values. It turns out that when a report is recomputed (which has happened 223 times so far for the Bondville GHI report), the new processed values are added to the database, but the old values are not removed. So fetching the report returns all 1784 of these processed values instead of the expected 8.
Since the values and the raw report are posted separately, perhaps the best way to go about this is to require that a list of processed values to keep be included with the raw report on POST. Then, as the raw report is being stored, all other processed values are deleted.
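A minimal sketch of that flow, using an in-memory dict in place of the real database; the function and field names (`store_raw_report`, `processed_values`, `keep_ids`) are hypothetical, not the actual API:

```python
def store_raw_report(report, raw_report, keep_ids):
    """Store the raw report and delete any processed values whose
    ids are not in keep_ids (i.e. stale values from prior runs)."""
    report["raw_report"] = raw_report
    keep = set(keep_ids)
    report["processed_values"] = {
        pv_id: val
        for pv_id, val in report["processed_values"].items()
        if pv_id in keep
    }
    return report


# Simulate a report that has been recomputed many times, leaving
# stale processed values behind.
report = {
    "raw_report": None,
    "processed_values": {f"pv-{i}": f"value-{i}" for i in range(1784)},
}

# The POST now includes only the ids the new raw report references.
keep = [f"pv-{i}" for i in range(1776, 1784)]
store_raw_report(report, "new raw report", keep)
print(len(report["processed_values"]))  # 8
```

Doing the cleanup in the same transaction that stores the raw report would keep the report and its values consistent even if a recompute fails partway through.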