airflow-helm / charts

The User-Community Airflow Helm Chart is the standard way to deploy Apache Airflow on Kubernetes with Helm. Originally created in 2017, it has since helped thousands of companies create production-ready deployments of Airflow on Kubernetes.
https://github.com/airflow-helm/charts/tree/main/charts/airflow
Apache License 2.0
647 stars · 475 forks

Missing kubectl logs of airflow-web in persistent volume #691

Closed · stanvv closed 1 year ago

stanvv commented 1 year ago

Chart Version

8.6.1

Kubernetes Version

Client Version: version.Info{Major:"1", Minor:"26", GitVersion:"v1.26.0", GitCommit:"b46a3f887ca979b1a5d14fd39cb1af43e7e5d12d", GitTreeState:"clean", BuildDate:"2022-12-08T19:58:30Z", GoVersion:"go1.19.4", Compiler:"gc", Platform:"linux/amd64"}
Kustomize Version: v4.5.7
Server Version: version.Info{Major:"1", Minor:"24", GitVersion:"v1.24.6", GitCommit:"c86d003ea699ec4bcffee10ad563a26b63561c0e", GitTreeState:"clean", BuildDate:"2022-12-17T10:31:53Z", GoVersion:"go1.18.6", Compiler:"gc", Platform:"linux/amd64"}

Helm Version

version.BuildInfo{Version:"v3.11.0", GitCommit:"472c5736ab01133de504a826bd9ee12cbe4e7904", GitTreeState:"clean", GoVersion:"go1.18.10"}

Description

We're trying to collect all logging of our Airflow pods for future analysis. For this, we set logs.persistence.enabled=true in our helm command and updated our helm-values.yaml (see the relevant settings under Custom Helm Values below).
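A minimal sketch of the kind of command described, assuming the chart's documented repo alias (airflow-stable) and a release and namespace both named airflow; the release name and values file path are placeholders:

helm repo add airflow-stable https://airflow-helm.github.io/charts
helm upgrade --install airflow airflow-stable/airflow \
  --namespace airflow \
  --version 8.6.1 \
  --values ./helm-values.yaml \
  --set logs.persistence.enabled=true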

This leads to pod logging being written to /opt/airflow/logs/.., which is mounted to a PV (Azure File Share). So far so good!

However, when we run kubectl logs airflow-web -n airflow, we see many log lines that do not appear in any of the log files on our PV. Please find an example below under Relevant Logs.

How can we also write those logs to a path on the PV?

Relevant Logs

Hello from custom entrypoint (baked in docker)
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: /opt/airflow/logs/webserver/access.log /opt/airflow/logs/webserver/error.log
Access Logformat: 
=================================================================
/home/airflow/.local/lib/python3.7/site-packages/airflow/plugins_manager.py:258 RemovedInAirflow3Warning: This decorator is deprecated.
...........
[2023-02-01 09:08:18,364] {app.py:1742} ERROR - Exception on /admin/metrics/ [GET] ....
[2023-02-01 09:06:07,921] {app.py:1850} ERROR - Request finalizing failed with an error while handling an error
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/pool/base
...........

Custom Helm Values

airflow:
  config:
    AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
    AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "azure_logging"
    AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "wasb-airflow-logs"
    AIRFLOW__LOGGING__BASE_LOG_FOLDER: "/opt/airflow/logs/dags"
    AIRFLOW__WEBSERVER__ACCESS_LOGFILE: "/opt/airflow/logs/webserver/access.log"
    AIRFLOW__WEBSERVER__ERROR_LOGFILE: "/opt/airflow/logs/webserver/error.log"
    AIRFLOW__LOGGING__DAG_PROCESSOR_MANAGER_LOG_LOCATION: "/opt/airflow/logs/dags/dag_processor_manager/dag_processor_manager.log"

scheduler:
  logCleanup:
    enabled: False

logs: 
  persistence:
    enabled: True
    storageClass: azurefile

workers:
  logCleanup:
    enabled: False

pgbouncer:
  logDisconnections: 1
  logConnections: 1
  verbose: 1

thesuperzapper commented 1 year ago

@stanvv I think you are confusing the logs persisted under /opt/airflow/logs/ (which come from the Airflow application itself) with the Kubernetes logs (which come from the Pod's STDOUT and STDERR).
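To illustrate the difference (a sketch; the resource names assume a release called airflow, so the web Deployment is airflow-web; adjust to your actual names):

# Airflow application logs, written under /opt/airflow/logs/ and persisted by logs.persistence:
kubectl exec -n airflow deploy/airflow-web -- ls /opt/airflow/logs
# Pod STDOUT/STDERR, which is what `kubectl logs` shows and is NOT written to the PV by default:
kubectl logs -n airflow deploy/airflow-web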

If you want to persist the Pod logs, you might struggle to keep them ALSO accessible within Kubernetes, but some possible approaches are:

  1. Persist them using a normal Kubernetes log-collection process (this will depend on how you run Kubernetes, but should be easy if you run on a public cloud).
  2. Modify the entrypoint of the airflow container image to capture STDOUT and STDERR into a file while also displaying them (a sketch of this follows below).
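For approach 2, a minimal sketch of such an entrypoint, assuming a bash-based wrapper around the upstream apache/airflow image's /entrypoint script and an illustrative file path on the already-mounted logs volume:

#!/usr/bin/env bash
# Illustrative only: mirror this container's STDOUT/STDERR into a file on the
# logs volume while still writing to the console, so both `kubectl logs` and
# the persistent volume see the same output.
set -euo pipefail

LOG_FILE="/opt/airflow/logs/stdout/${HOSTNAME:-airflow}.log"   # assumed path
mkdir -p "$(dirname "${LOG_FILE}")"

# From here on, everything this script (and whatever it exec's) prints goes
# through tee, which appends to the file and echoes to the console.
exec > >(tee -a "${LOG_FILE}") 2>&1

echo "Hello from custom entrypoint (baked in docker)"

# Hand off to the upstream image's entrypoint with the original arguments.
exec /entrypoint "$@"

Note that a node-level log agent (approach 1) avoids baking this into the image, which is usually the simpler option on a managed cloud cluster.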

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had activity in 60 days. It will be closed in 7 days if no further activity occurs.

Thank you for your contributions.


Issues never become stale if any of the following is true:

  1. they are added to a Project
  2. they are added to a Milestone
  3. they have the lifecycle/frozen label