Open NormanZielke opened 6 months ago
Thanks for the hint @NormanZielke. To make this work, you have to provide the files needed for mapping (e.g. heatpump_cop_timeseries.csv) in the store.
For this, copy the corresponding raw data into the raw directory store/esys_raw/. In the future, empty raw data (scalars and time series) will be created automatically. Then, assumptions on constant parameters such as plant costs, lifetime and efficiencies are mapped and set as values of the corresponding variables in the scalars.
I guess this needs to be adapted to:
For this you have to provide the corresponding input data in the store:
- raw/technology_data/data
- raw/renewables.ninja_feedin/data
- datasets/heatpump_cop/data
Then, assumptions on constant parameters such as plant costs, lifetime and efficiencies are mapped and set as values of the corresponding variables in the scalars.
Esys raw data should be created automatically by now.
Could you please open a pull request and replace the paragraph with the new one?
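As a side note, checking those store locations locally before running the pipeline could save a round trip. A minimal sketch (the three directory names come from the list above; everything else, including the helper's name, is illustrative):

```python
from pathlib import Path

# Store subdirectories named in this thread; treat any further file names as placeholders.
REQUIRED_DIRS = [
    "raw/technology_data/data",
    "raw/renewables.ninja_feedin/data",
    "datasets/heatpump_cop/data",
]

def missing_store_dirs(store_root: str) -> list[str]:
    """Return the required store subdirectories that do not exist yet."""
    root = Path(store_root)
    return [d for d in REQUIRED_DIRS if not (root / d).is_dir()]

if __name__ == "__main__":
    for d in missing_store_dirs("digipipe/store"):
        print(f"missing: {d}")
```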
System: Windows, running in Anaconda env
When I run the command:
snakemake -j1
I get the following error:
MissingInputException in rule datasets_demand_heat_region_heat_demand_shares in file xxxx\digipipe\digipipe\store\datasets\demand_heat_region/create.smk, line 81:
Missing input files for rule datasets_demand_heat_region_heat_demand_shares:
    output: xxxx/demand_heat_region/data/demand_heat_shares_hh.json
    wildcards: sector=hh
    affected files:
        xxxx\digipipe\digipipe\store\datasets\demand_heat_region\data\demand_heat_zonal_stats-res-bkg_vg250_federal_states.gpkg
        xxxx\digipipe\digipipe\store\datasets\demand_heat_region\data\demand_heat_zonal_stats-res-bkg_vg250_state.gpkg
        xxxx\digipipe\digipipe\store\datasets\demand_heat_region\data\demand_heat_zonal_stats-res-bkg_vg250_muns_region.gpkg
"xxxx" is usually a path, I just deleted it because of privacy reasons.
(For documentation purposes: we verified that the Python and snakemake versions are OK, see #187.)
The missing files should be created by the upstream rule datasets_demand_heat_region_create_raster_zonal_stats, but it seems that snakemake cannot determine this rule, so something might be wrong with the DAG creation.
Since you removed graphviz (not working on Windows, #202), you are not able to create graphs as described here, are you?
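For illustration only (this is not snakemake's actual implementation): snakemake resolves a requested file by matching it against rule output patterns, roughly like the regex below, so a path-separator mismatch on Windows is one way such a match can silently fail:

```python
import re

# Hypothetical stand-in for a rule output pattern from create.smk,
# with the {sector} wildcard turned into a named regex group.
PATTERN = r"store/datasets/demand_heat_region/data/demand_heat_shares_(?P<sector>\w+)\.json"

def matches(path: str) -> bool:
    """Return True if the requested path matches the output pattern."""
    return re.fullmatch(PATTERN, path) is not None

print(matches("store/datasets/demand_heat_region/data/demand_heat_shares_hh.json"))   # -> True
print(matches(r"store\datasets\demand_heat_region\data\demand_heat_shares_hh.json"))  # -> False
```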
If so, could you please execute
snakemake --rulegraph > graph_rules.txt
and
snakemake --filegraph > graph_files.txt
Does this run without errors, and do the txt files hold data? (Please use the current dev; I had to flip some bits in c9e3981944589a8d81d2574d562a805e9e831598 as they collided with the graph stuff.)
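As a quick sanity check on those two files, assuming --rulegraph and --filegraph emit Graphviz DOT text, a hypothetical helper like this could confirm they hold data:

```python
from pathlib import Path

def looks_like_dot(path: str) -> bool:
    """Return True if the file is non-empty and looks like Graphviz DOT source."""
    text = Path(path).read_text(encoding="utf-8", errors="ignore").strip()
    # DOT graphs start with "digraph <name> {" and end with a closing brace.
    return text.startswith("digraph") and text.endswith("}")

if __name__ == "__main__":
    for f in ("graph_rules.txt", "graph_files.txt"):
        print(f, "ok" if Path(f).is_file() and looks_like_dot(f) else "missing/empty")
```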
Thanks for the quick reply @nesnoj! Yes, I had to remove graphviz because it interrupted the poetry install process, saying:
pygraphviz/graphviz_wrap.c(3020): fatal error C1083: Datei (Include) kann nicht geöffnet werden ["Cannot open include file"]: "graphviz/cgraph.h": No such file or directory
error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.39.33519\\bin\\HostX86\\x64\\cl.exe' failed with exit code 2
Note: This error originates from the build backend, and is likely not a problem with poetry but with pygraphviz (1.11) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "pygraphviz (==1.11)"'.
Tried it with the pip command, which didn't work either, so I installed it this way: https://pygraphviz.github.io/documentation/stable/install.html
I pulled the current dev and executed:
If so, could you please execute snakemake --rulegraph > graph_rules.txt and snakemake --filegraph > graph_files.txt
(Updated) The files are created with content, but the shell returns the same error as in my first post above:
MissingInputException in rule datasets_demand_heat_region_heat_demand_shares
INFO - [Mon Apr 8 17:02:14 2024] rule write_ts:
    input: store/datasets/esys_raw/data/time_series/empty_ts_efficiencies.csv, store/datasets/esys_raw/data/time_series/empty_ts_feedin.csv, store/datasets/esys_raw/data/time_series/empty_ts_load.csv
    output: store/datasets/esys_raw/data/time_series/ts_efficiencies.csv, store/datasets/esys_raw/data/time_series/ts_feedin.csv, store/datasets/esys_raw/data/time_series/ts_load.csv
    jobid: 6
    reason: Missing output files: store/datasets/esys_raw/data/time_series/ts_load.csv, store/datasets/esys_raw/data/time_series/ts_feedin.csv, store/datasets/esys_raw/data/time_series/ts_efficiencies.csv
    resources: tmpdir=C:\Users\NORMAN~1.ZIE\AppData\Local\Temp
INFO - Traceback (most recent call last):
  File "esys/scripts/write_ts.py", line 225, in <module>
    map_over_var_name_ts(
  File "esys/scripts/write_ts.py", line 195, in map_over_var_name_ts
    df_new_data_ts = pd.read_csv(datasets_file_path, usecols=[1], sep=",")
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\util\_decorators.py", line 211, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\util\_decorators.py", line 331, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\io\parsers\readers.py", line 950, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\io\parsers\readers.py", line 605, in _read
    parser = TextFileReader(filepath_or_buffer, **kwds)
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\io\parsers\readers.py", line 1442, in __init__
    self._engine = self._make_engine(f, self.engine)
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\io\parsers\readers.py", line 1735, in _make_engine
    self.handles = get_handle(
  File "C:\Users\Norman.Zielke\AppData\Local\miniconda3\envs\digipipe_windows\lib\site-packages\pandas\io\common.py", line 856, in get_handle
    handle = open(
FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\Norman.Zielke\Eigene_Daten\systemmodellierung\digipipe\digipipe\store\datasets\heatpump_cop\data\heatpump_cop_timeseries.csv'
[Mon Apr 8 17:02:18 2024]
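One way to make this failure mode more obvious would be a guard before the pd.read_csv call in write_ts.py. The helper below is a sketch, not the project's actual code; its name and message are assumptions:

```python
from pathlib import Path

def require_store_file(path: str) -> Path:
    """Raise a descriptive error if a mapped store file is missing.

    Turns a bare FileNotFoundError deep inside pandas into a message
    that points at the store location the user has to provide.
    """
    p = Path(path)
    if not p.is_file():
        raise FileNotFoundError(
            f"{p} is missing - provide it under the store "
            "(e.g. store/datasets/heatpump_cop/data/) before running snakemake"
        )
    return p
```

The failing line in write_ts.py would then become something like pd.read_csv(require_store_file(datasets_file_path), usecols=[1], sep=",").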