DataDog / datadog-agent

Main repository for Datadog Agent
https://docs.datadoghq.com/
Apache License 2.0

[BUG] Trace working but logs not #27605

Closed StephanyBatista closed 4 months ago

StephanyBatista commented 4 months ago

I have configured everything as described in the documentation, and metrics and traces are working, but I don't see any logs.

Agent Environment: this error occurs on my Linux machine, but also in an Azure App Service container.

docker exec {MY_CONTAINER} agent status
OCI runtime exec failed: exec failed: unable to start container process: exec: "agent": executable file not found in $PATH: unknown

Below is my Dockerfile:

FROM python:3.8-buster

ARG VERSION

RUN mkdir -p /root/.opencensus/.azure/gunicorn

COPY --from=datadog/serverless-init:1 /datadog-init /usr/bin/datadog-init

ADD requirements.txt .
RUN pip install -r requirements.txt

ENTRYPOINT ["/usr/bin/datadog-init"]

RUN mkdir /code
WORKDIR /code
ADD src /code/
ADD . /code/

RUN echo ${VERSION}
ENV DD_VERSION=${VERSION}
ENV DD_SERVICE={MY_SERVICE}
ENV DD_ENV=production
ENV DD_VERSION=test-log-dd2
ENV DD_API_KEY={MY_KEY}
ENV DD_TRACE_ENABLED=true
ENV DD_LOG_LEVEL=debug
ENV DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true
ENV DD_LOGS_ENABLED=true

COPY init.sh /usr/local/bin/

RUN chmod u+x /usr/local/bin/init.sh

CMD ["/usr/local/bin/init.sh"]

And in init.sh:

#!/bin/bash

ddtrace-run gunicorn --worker-tmp-dir /dev/shm --max-requests $GUNICORN_MAX_REQUESTS --bind 0.0.0.0:$PORT -k gevent --chdir src wsgi:app
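As a side note, the script above assumes $GUNICORN_MAX_REQUESTS and $PORT are always set. A slightly more defensive sketch (the 5000/1000 defaults are hypothetical, not from the original setup) could be:

```shell
#!/bin/bash
# Fall back to hypothetical defaults when the variables are unset or empty
PORT="${PORT:-5000}"
GUNICORN_MAX_REQUESTS="${GUNICORN_MAX_REQUESTS:-1000}"
echo "binding 0.0.0.0:${PORT} with --max-requests ${GUNICORN_MAX_REQUESTS}"
```

followed by the same `ddtrace-run gunicorn ...` line as in the original script (ideally via `exec`, so gunicorn receives container signals directly).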

These are my logs when I start the app:

WARNING: Published ports are discarded when using host network mode
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Loading proxy settings
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | 'use_proxy_for_cloud_metadata' is enabled: adding cloud provider URL to the no_proxy list
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Starting logs-agent...
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | logs-agent started
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Adding Datadog Agent Log Source
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Starting to load the configuration
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | WARN | Error loading config: Config File "datadog" Not Found in "[]"
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | 0 Features detected from environment:
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Error loading config: Config File "datadog" Not Found in "[]"
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Using a SyncForwarder with a 5s timeout
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Retry queue storage on disk is disabled
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Starting to load the configuration
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | WARN | Error loading config: Config File "serverless" Not Found in "[/var/task/datadog.yaml]"
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Creating TimeSampler #0
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | 0 Features detected from environment:
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Forwarder started
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer started
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Loaded configuration: /var/task/datadog.yaml
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Setting DefaultEnv to "production" (from 'env' config option)
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | 'telemetry.dogstatsd.aggregator_channel_latency_buckets' is empty, falling back to default values
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | 'telemetry.dogstatsd.listeners_latency_buckets' is empty, falling back to default values
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | 'telemetry.dogstatsd.listeners_channel_latency_buckets' is empty, falling back to default values
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | dogstatsd-udp: 127.0.0.1:8125 successfully initialized
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | DogStatsD will run 10 workers
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | dogstatsd-udp: starting to listen on 127.0.0.1:8125
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd workers and pipelines count: 10 workers, 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Dogstatsd configured to run with 10 workers and 1 pipelines
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | otlp endpoint disabled
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Launching subprocess [/usr/local/bin/init.sh]
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Starting Agent with processor trace buffer of size 0
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Stats writer initialized (climit=20 qsize=83)
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Receiver configured with 6 decoders and a timeout of 1000ms
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Trace writer initialized (climit=100 qsize=1)
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | [pipeline_stats] Creating proxy handler
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | [pipeline_stats] Intake URL %s https://trace.agent.datadoghq.com/api/v0.1/pipeline_stats
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | [pipeline_stats] Creating reverse proxy
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Listener started with 1000 maximum connections.
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | INFO | Listening for traces at http://localhost:8126
2024-07-15 18:41:07 UTC | SERVERLESS_INIT | DEBUG | Starting concentrator
[2024-07-15 18:41:08 +0000] [20] [INFO] Starting gunicorn 19.9.0
[2024-07-15 18:41:08 +0000] [20] [INFO] Listening at: http://0.0.0.0:5000 (20)
[2024-07-15 18:41:08 +0000] [20] [INFO] Using worker: gevent
/usr/local/lib/python3.8/os.py:1023: RuntimeWarning: line buffering (buffering=1) isn't supported in binary mode, the default buffer size will be used
  return io.open(fd, *args, **kwargs)
[2024-07-15 18:41:08 +0000] [25] [INFO] Booting worker with pid: 25
INFO 2024-07-15 18:41:08,345 server register_blueprints 39 : Registering blueprints
INFO 2024-07-15 18:41:08,359 server register_blueprints 48 : Blueprints registered
2024-07-15 18:41:09 UTC | SERVERLESS_INIT | INFO | Container metrics provider discovery process finished
2024-07-15 18:41:10 UTC | SERVERLESS_INIT | DEBUG | Received a Flush trigger
2024-07-15 18:41:10 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: start sending iterable series to the serializer
2024-07-15 18:41:10 UTC | SERVERLESS_INIT | DEBUG | Sending sketches payload : {"metric":"datadog.serverless_agent.enhanced.cold_start","tags":["datadog_init_version:1.2.2","_dd.compute_stats:1","env:production","service:rc-contracts-api-us","version:test-log-dd"],"host":"","interval":10,"points":[{"sketch":{"summary":{"Min":1,"Max":1,"Sum":1,"Avg":1,"Cnt":1}},"ts":1721068860}]}
2024-07-15 18:41:10 UTC | SERVERLESS_INIT | DEBUG | The payload was not too big, returning the full payload
2024-07-15 18:41:10 UTC | SERVERLESS_INIT | DEBUG | The payload was not too big, returning the full payload
2024-07-15 18:41:11 UTC | SERVERLESS_INIT | INFO | Successfully posted payload to "https://7-54-0-app.agent.datadoghq.com/api/v1/series" (202 Accepted), the agent will only log transaction success every 500 transactions
2024-07-15 18:41:11 UTC | SERVERLESS_INIT | DEBUG | SyncForwarder has flushed 1 transactions
2024-07-15 18:41:11 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: stop routine
2024-07-15 18:41:11 UTC | SERVERLESS_INIT | DEBUG | SyncForwarder has flushed 1 transactions
2024-07-15 18:41:12 UTC | SERVERLESS_INIT | DEBUG | Send messages for pipeline logs (msg_count:8, content_size=2792, avg_msg_size=349.00)
2024-07-15 18:41:12 UTC | SERVERLESS_INIT | INFO | No data received
2024-07-15 18:41:13 UTC | SERVERLESS_INIT | DEBUG | Received a Flush trigger
2024-07-15 18:41:13 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: start sending iterable series to the serializer
2024-07-15 18:41:13 UTC | SERVERLESS_INIT | DEBUG | The payload was not too big, returning the full payload
2024-07-15 18:41:13 UTC | SERVERLESS_INIT | DEBUG | SyncForwarder has flushed 1 transactions
2024-07-15 18:41:13 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: stop routine
2024-07-15 18:41:16 UTC | SERVERLESS_INIT | DEBUG | Received a Flush trigger
2024-07-15 18:41:16 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: start sending iterable series to the serializer
2024-07-15 18:41:16 UTC | SERVERLESS_INIT | DEBUG | The payload was not too big, returning the full payload
2024-07-15 18:41:16 UTC | SERVERLESS_INIT | DEBUG | SyncForwarder has flushed 1 transactions
2024-07-15 18:41:16 UTC | SERVERLESS_INIT | DEBUG | Demultiplexer: sendIterableSeries: stop routine
2024-07-15 18:41:19 UTC | SERVERLESS_INIT | DEBUG | Received a Flush trigger

StephanyBatista commented 4 months ago

One more piece of information:

I do see logs when the app returns an internal error (500), for example with this code:

import logging
import flask_restful as restful
from flask_restful import reqparse
from flask import jsonify

from helpers import authentication_required

FORMAT = ('%(asctime)s %(levelname)s [%(name)s] [%(filename)s:%(lineno)d] '
          '[test] '
          '- %(message)s')
logging.basicConfig(format=FORMAT)
log = logging.getLogger(__name__)
log.level = logging.INFO

class GetAllContracts(restful.Resource):
    @authentication_required
    def get(self, **kwargs):
        print('test 343')
        log.info('Hello, World!')

        zero = 0
        um = 1
        divisao = um / zero  # intentional ZeroDivisionError to force a 500 response
        print(divisao)

This is exactly what the log shows when the app is started:
2024-07-15 18:41:11 UTC | SERVERLESS_INIT | INFO | Successfully posted payload to "https://7-54-0-app.agent.datadoghq.com/api/v1/series" (202 Accepted), the agent will only log transaction success every 500 transactions

[screenshot attached]

jcstorms1 commented 4 months ago

Hey @StephanyBatista 👋 Apologies for the inconvenience here but I suspect this is related to a bug introduced in v1.2.3. We recognized the bug and released v1.2.4 based on the previous working version. Since you are using COPY --from=datadog/serverless-init:1 I suspect you pulled in the version with the bug before we recognized it. Could you confirm if you are still missing logs after the update?

https://github.com/DataDog/datadog-agent/issues/27502.
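For readers hitting the same regression: pinning an explicit tag instead of the floating `1` major tag avoids silently pulling a regressed build. A sketch, using the 1.2.4 release mentioned above (verify the current recommended tag before copying):

```dockerfile
# Pin the exact serverless-init tag instead of the floating "1" major tag
COPY --from=datadog/serverless-init:1.2.4 /datadog-init /usr/bin/datadog-init
```

The trade-off is that a pinned tag no longer picks up fixes automatically, so it needs to be bumped deliberately.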

StephanyBatista commented 4 months ago

Hi, thanks for helping me.

Now I'm seeing the logs, but only on the Live Search screen, and they appear as errors rather than info. Do you know what could cause this?

[screenshot attached]

And the log continues with: 2024-07-16 14:29:48 UTC | SERVERLESS_INIT | INFO | Successfully posted payload to "https://7-54-0-app.agent.datadoghq.com/api/v1/series" (202 Accepted), the agent will only log transaction success every 500 transactions

StephanyBatista commented 4 months ago

Reading the documentation, I understood that this only works on Azure containers or GCP, so is this error normal on my local machine? Will it work if I publish to those clouds?

jcstorms1 commented 4 months ago

Hey @StephanyBatista, I'm not sure I understand the error you are seeing. If you continue to have issues with that, feel free to reach out to support. The original reason this issue was opened has been addressed, so I'm going to close it for now. Thanks again for reaching out, and sorry for the inconvenience.