Closed: mariusves closed this issue 7 years ago
Issue still present: when importing through e.g. network.import_components_from_dataframe(components['Line'], 'Line') instead of network.import_components_from_dataframe(components[Line], 'Line') (like in ego.powerflow.py), memory usage increases significantly. Might be a PyPSA import_components_from_dataframe() issue...
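One way to narrow this down would be to measure allocations around the import call, e.g. with Python's stdlib tracemalloc. A minimal sketch; the dummy_import function below is a placeholder, substitute the actual import_components_from_dataframe call:

```python
import tracemalloc

def measure_peak(fn, *args, **kwargs):
    """Run fn and report the peak memory allocated during the call."""
    tracemalloc.start()
    result = fn(*args, **kwargs)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, peak

# Placeholder standing in for the real call, e.g.
# network.import_components_from_dataframe(components['Line'], 'Line')
def dummy_import():
    return [0.0] * 100_000  # allocates ~100k floats

_, peak_bytes = measure_peak(dummy_import)
print(f"peak allocation: {peak_bytes / 1e6:.1f} MB")
```

Running this once with the string-keyed call and once with the ORM-keyed call should show whether the extra memory really comes from inside the importer.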
For now, I don't get why network.import_components_from_dataframe(components[component], str(component)) (from mv_grid.py) should be more memory-consuming than network.import_components_from_dataframe(component_data[Line], 'Line') (from ego.powerflow.py). In both cases components and components_data are dicts. In the first case the dict keys are strings (i.e. 'Bus', 'Line', ...). In the ego.powerflow.py example the keys are SQLAlchemy ORM objects (compare the import of the variables Bus, Line, ... with the keys that are used to access data from the dict). But this is just a side note.
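The difference in keys can be sketched like this (placeholder classes standing in for the actual SQLAlchemy ORM classes, and placeholder values standing in for the component dataframes):

```python
# String keys, as in mv_grid.py:
components = {'Bus': ['bus data'], 'Line': ['line data']}

# Class-object keys, as in ego.powerflow.py (dummy classes here,
# not the real SQLAlchemy ORM models Bus, Line, ...):
class Bus: pass
class Line: pass

component_data = {Bus: ['bus data'], Line: ['line data']}

# Both lookups yield the same payload; only the key type differs,
# so the key style alone should not change memory usage.
assert components['Line'] == component_data[Line]
```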
I would like to compare the components dict of ego.powerflow.py to the one from an HV example file, similar to mv_grid.py, that uses pypsa_io.py. Could you please provide a script in the examples folder?
You are right; for me it makes no sense why there is any difference between those two. I uploaded a short script that allows switching between both versions of the import. I also removed the str() call on the component part from my previous commit, since it didn't solve the problem.
The memory issue appears when importing large networks with many components and setting the time range to a full year (8760 hours). Therefore this is not a bug; the importer works as intended.
When creating the power flow problem using the create_powerflow_problem() function, memory blows up to 10 GB (when importing the HV grid). When using the old ego.powerflow script, memory stays at ~250 MB.
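A rough back-of-envelope calculation shows why a full-year time range scales memory so quickly: each time-varying attribute stores one float64 per component per snapshot. The component count below is an assumed, illustrative figure, not the actual HV grid size:

```python
snapshots = 8760            # one full year at hourly resolution
n_components = 10_000       # assumed number of components (illustrative)
bytes_per_float = 8         # float64

per_attribute = snapshots * n_components * bytes_per_float
print(f"{per_attribute / 1e9:.2f} GB per time-varying attribute")
# Several such attributes per component type (e.g. p0, p1, q0, q1
# for lines) multiply this figure further.
```

So with a handful of time-varying attributes on a large network, reaching multiple gigabytes is expected behavior rather than a leak.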