Currently `utils.forecast.cache_forecast_metadata()` uses only the passed forecast's predictions to build the cached information. However, this is incorrect: it should instead "merge" that forecast with all prior versions to get the effective contents of the originally-uploaded forecast, rather than using just the passed one, which a) does not contain prediction elements that duplicated earlier versions, and b) contains retractions of previous versions' predictions. This change will allow the forecast detail and project forecast summary pages to correctly show what was effectively uploaded. In this way, the cached metadata will be consistent with the merging behavior of forecast querying.
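As a rough illustration of the intended merge (not the project's actual API), the sketch below replays versions oldest-to-newest, letting later prediction elements override earlier ones per (unit, target) key and treating a `None` value as a retraction. The `merge_versions` function and the dict-of-dicts representation are hypothetical, chosen only to show the replay logic; `cache_forecast_metadata()` would then compute its cached counts from the merged result rather than from the passed forecast's own elements.

```python
from typing import Any, Dict, Optional, Sequence, Tuple

Key = Tuple[str, str]  # hypothetical (unit, target) key for a prediction element


def merge_versions(versions: Sequence[Dict[Key, Optional[Any]]]) -> Dict[Key, Any]:
    """
    Replay forecast versions (oldest first) into the effective contents of the
    originally-uploaded forecast: later versions override earlier predictions
    per key, and a value of None marks a retraction that removes the earlier
    prediction. Illustrative only; not the repository's real data model.
    """
    merged: Dict[Key, Any] = {}
    for version in versions:
        for key, pred_data in version.items():
            if pred_data is None:      # retraction: drop any earlier prediction
                merged.pop(key, None)
            else:                      # new or updated prediction wins
                merged[key] = pred_data
    return merged


# Tiny example: version 2 retracts one prediction and updates another.
v1 = {("loc1", "cases"): {"value": 10}, ("loc2", "cases"): {"value": 20}}
v2 = {("loc1", "cases"): None, ("loc2", "cases"): {"value": 25}}
assert merge_versions([v1, v2]) == {("loc2", "cases"): {"value": 25}}
```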