Closed: ClaraBuettner closed this pull request 1 month ago.
I installed powerd-data using this branch and started a new calculation for the status2019 scenario. Below you can find the relevant information:

- airflow-port: 8081
- database-name: powerd-data
- database-port: '59731'
Airflow starts but the same error is displayed for both pipelines:

    Broken DAG: [/home/powerd/powerd-run-de/powerd-data/src/egon/data/airflow/dags/pipeline_status_quo.py]
    Traceback (most recent call last):
      File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
      File "/home/powerd/powerd-run-de/venv/lib/python3.8/site-packages/airflow/operators/python.py", line 56, in <module>
        from airflow.utils.file import get_unique_dag_module_name
    ImportError: cannot import name 'get_unique_dag_module_name' from 'airflow.utils.file' (/home/powerd/powerd-run-de/venv/lib/python3.8/site-packages/airflow/utils/file.py)

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/home/powerd/powerd-run-de/powerd-data/src/egon/data/datasets/__init__.py", line 12, in <module>
        from airflow.operators.python_operator import PythonOperator
      File "/home/powerd/powerd-run-de/venv/lib/python3.8/site-packages/airflow/utils/deprecation_tools.py", line 63, in getattr_with_deprecation
        raise ImportError(error_message) from e
    ImportError: Could not import airflow.operators.python.PythonOperator while trying to import airflow.operators.python_operator.PythonOperator.
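For context, newer Airflow releases redirect the deprecated airflow.operators.python_operator module to airflow.operators.python, which is why both paths appear in the traceback. A version-tolerant import along these lines (a sketch, not the project's actual code) avoids the deprecated path entirely:

```python
# Sketch: prefer the current import path and fall back to the
# deprecated location on old Airflow releases; bind None if Airflow
# is absent entirely so the snippet stays importable anywhere.
try:
    from airflow.operators.python import PythonOperator
except ImportError:
    try:
        # Deprecated module path used by old Airflow releases.
        from airflow.operators.python_operator import PythonOperator
    except ImportError:
        PythonOperator = None  # Airflow is not installed here
```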
Thanks for reporting! Somehow I cannot reproduce the problem locally. Does the environment still exist on the server?
Since the problem looked like #218, I merged that branch. If it solves the problem, I will merge the other PR.
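One way to narrow this down on the server is to check which Airflow release the virtualenv actually contains, since get_unique_dag_module_name was only added in more recent Airflow versions. A small diagnostic sketch (assuming it is run with the venv's interpreter):

```python
# Diagnostic sketch: print the installed Airflow version and whether
# the helper named in the traceback exists in airflow.utils.file.
import importlib
import importlib.util

spec = importlib.util.find_spec("airflow")
if spec is None:
    print("airflow is not importable from this interpreter")
else:
    import airflow

    file_mod = importlib.import_module("airflow.utils.file")
    print(
        "airflow", airflow.__version__,
        "| get_unique_dag_module_name present:",
        hasattr(file_mod, "get_unique_dag_module_name"),
    )
```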
Sure. Everything related to this test is still available on at31:

- airflow-port: 8081
- database-name: powerd-data
- database-port: '59731'
- working directory: /home/powerd/powerd-run-de
- virtual environment: /home/powerd/powerd-run-de/venv
- git repository: /home/powerd/powerd-run-de/powerd-data
@CarlosEpia: I am very optimistic that pypsa-eur will now finish successfully. There were some problems with downloading files (inside pypsa-eur). I tried to fix this by copying some files, which is not the best solution (I'm sorry!). In general, the download works fine when I test it locally, so it might be related to the server.
I ran pypsa-eur outside of Airflow to speed up the bug fixing. I think you can now either restart the task "pypsa_eur.prepare_network" or just mark it as successful, since it was already run in a separate process. At least it is now possible to continue running the pipeline. I wish I had been able to do a complete run before I leave, but I hope it also works out like this and you can continue from this point on.
There are several references in the function execute to files that are not available:
The tasks included in the new class RunPypsaEur must be included as a dependency for ElectricalNeighbours.
Some of the missing files are created when eGon100RE is one of the selected scenarios. One file needs to be added to the data bundle and in one other case, I need to have a look at the code. But at least it works when the missing files are added manually.
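Until the data bundle is fixed, a quick pre-check along these lines can surface the missing inputs early instead of failing midway through; the paths below are hypothetical placeholders, not the real file names:

```python
# Sketch with hypothetical paths: list which expected input files
# are missing before the task runs.
from pathlib import Path

required = [
    Path("data_bundle/hypothetical_input.csv"),
    Path("run-pypsa-eur/hypothetical_resource.nc"),
]
missing = [str(p) for p in required if not p.exists()]
print("missing input files:", missing)
```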
> The tasks included in the new class RunPypsaEur must be included as a dependency for ElectricalNeighbours.
That is strange; the prepared network should be enough, and that is already part of the dependencies. Do you remember which error you got?
I copy-pasted the file Snakefile from your session in order to get past this problem: Error: Snakefile "run-pypsa-eur/Snakefile" not found.
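The same kind of guard helps with the Snakefile error: checking for the path from the error message before launching the workflow makes the copy-paste workaround unnecessary. A minimal sketch:

```python
# Sketch: report whether the Snakefile that the run expects exists
# relative to the current working directory.
from pathlib import Path

snakefile = Path("run-pypsa-eur/Snakefile")
print("Snakefile present:", snakefile.exists())
```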
This PR was replaced by the (already merged) PR #293
This branch updates the integration of pypsa-eur (which is now used instead of pypsa-eur-sec). Changes for pypsa-eur are now done within powerd-data, so we do not need a fork from pypsa-eur anymore.
Before merging into the dev-branch, please make sure that

- CHANGELOG.rst was updated.
- The code is formatted using black and isort.
- The Dataset-version is updated when existing datasets are adjusted.
- The branch was merged into the continuous-integration/run-everything-over-the-weekend-branch.
- The workflow was tested in test mode.
- The workflow was tested in Everything mode.