OEDI Public Dataset Documentation

SMARTDS opendssdirect load issue in documentation #36

Closed adam-morse closed 1 week ago

adam-morse commented 1 week ago

Hello,

I'm trying to make use of the opendssdirect code given in the documentation and may have found an error.

When building the load_profile_map dictionary, the code references the non-normalized total_site_electricity_kw and total_site_electricity_kvar columns in the load_data parquet files, and applies a 0.5 multiplier when the load is a split-phase load.
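Roughly, the pattern looks like this (my paraphrase, not the documentation's exact code; the parquet path, grouping column, and phase column are placeholders from my own reading of the data):

```python
import pandas as pd

# Paraphrase of the documented pattern: build load_profile_map from the
# non-normalized parquet columns. Path and column names are placeholders.
df = pd.read_parquet("load_data/loads.parquet")

load_profile_map = {}
for name, group in df.groupby("load_name"):  # assumed grouping column
    # Split-phase loads get a 0.5 multiplier; "phases" is an assumed column.
    mult = 0.5 if group["phases"].iloc[0] == 2 else 1.0
    load_profile_map[name] = (
        group["total_site_electricity_kw"].to_numpy() * mult,
        group["total_site_electricity_kvar"].to_numpy() * mult,
    )
```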

However, the Loads.dss and LoadShapes.dss files reference a normalized curve, and so should point at the normalized kW and kvar values provided in the csv files saved in the profiles folder.

Am I missing something? It seems like the csv load curve files are normalized versions of the parquet files' total-usage columns, and thus the parquet files are provided mainly for reference (i.e., there is no interaction with the OpenDSS powerflow).

FYI my purpose is to adapt this code to determine the minimum load point of the year on feeders.
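(Concretely, what I'm after is something like the following, with a placeholder path and an assumed timestamp column:)

```python
import pandas as pd

# Sum every load's kW at each timepoint, then find the feeder's
# minimum-load hour of the year. Path and column names are placeholders.
df = pd.read_parquet("load_data/loads.parquet")
feeder_kw = df.groupby("timestamp")["total_site_electricity_kw"].sum()

print("min load:", feeder_kw.min(), "kW at", feeder_kw.idxmin())
```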

Adam

tarekelgindy commented 1 week ago

Hi Adam,

Yes, the csv files are normalized versions of the timeseries loads included in the parquet files. When the timeseries values in the csv files are multiplied by the kW and kvar values in the Loads.dss files (which represent the maximum kW and kvar values for that load over the year), they should match the timeseries values in the parquet files.
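A quick way to see the relationship for a single load (a sketch only; the file names are placeholders, and I'm taking the Loads.dss kW to be the annual peak of the parquet column):

```python
import numpy as np
import pandas as pd

# For one load: normalized csv profile * Loads.dss kW should reproduce the
# parquet kW timeseries. Paths below are placeholders.
parquet_kw = pd.read_parquet("load_data/one_load.parquet")["total_site_electricity_kw"].to_numpy()
normalized = pd.read_csv("profiles/one_load.csv", header=None)[0].to_numpy()

loads_dss_kw = parquet_kw.max()  # the kW field in Loads.dss is the annual max
assert np.allclose(normalized * loads_dss_kw, parquet_kw, rtol=1e-3)
```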

For people running OpenDSS directly, the loadshapes reference the csv timeseries files, which are multiplied by the kW and kvar values for each load; the load values are updated at each timepoint when OpenDSS is run. However, this approach can be a bit slow, since OpenDSS holds all the timeseries data in memory at once.

The alternative is to use the code I included, which takes the timeseries data from the parquet files, updates the load values directly at each timepoint, and then runs OpenDSS's powerflow. I've found this approach to be much faster, and it gives the user more direct control over the data used in the simulations.
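For reference, the direct-update loop is essentially the following (a sketch against the opendssdirect API; the master-file path and the contents of load_profile_map are placeholders):

```python
import opendssdirect as dss

dss.Text.Command("Redirect Master.dss")  # placeholder path to the feeder model

for t in range(8760):  # hourly timepoints over one year
    # Push each load's kW/kvar for this timepoint, then solve a snapshot.
    idx = dss.Loads.First()
    while idx > 0:
        kw_series, kvar_series = load_profile_map[dss.Loads.Name()]  # placeholder map
        dss.Loads.kW(kw_series[t])
        dss.Loads.kvar(kvar_series[t])
        idx = dss.Loads.Next()
    dss.Solution.Solve()
```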

Hope this makes sense - let me know if you've got any further questions about this at all.

Tarek

adam-morse commented 1 week ago

> When the timeseries values in the csv files are multiplied by the kW and kvar values in the Loads.dss files (which represent the maximum kW and kvar values for that load over the year), they should match the timeseries values in the parquet files.

My question boiled down to whether that statement is true; my understanding was that it wasn't. For it to hold, the max kW/kvar values in the Loads.dss files would have to equal the normalization factor used to create the csv files. My experience with ResStock/ComStock has been that the values represent an arbitrary number of simulated building models, so their magnitude carries little meaning and it is the normalized curve that matters.
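One way to check this directly (a sketch: I'm pulling the kW= field out of Loads.dss with a regex, and the file paths and load name are placeholders):

```python
import re
import pandas as pd

# Compare the kW declared in Loads.dss against the annual peak of the same
# load's parquet timeseries. If they match, the Loads.dss kW is exactly the
# normalization factor behind the csv curve.
loads_text = open("Loads.dss").read()  # placeholder path
dss_kw = {m.group(1).lower(): float(m.group(2))
          for m in re.finditer(r"New Load\.(\S+).*?kW=([\d.]+)", loads_text, re.IGNORECASE)}

parquet_kw = pd.read_parquet("load_data/one_load.parquet")["total_site_electricity_kw"]
print(dss_kw.get("one_load"), parquet_kw.max())  # placeholder load name
```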