grafana / loki

Like Prometheus, but for logs.
https://grafana.com/loki
GNU Affero General Public License v3.0

Logstash plugin doesn't send logs to Loki #7986

Open ichasco-heytrade opened 1 year ago

ichasco-heytrade commented 1 year ago

Describe the bug: I have added the Loki output plugin to Logstash, but it doesn't seem to send any logs to Loki. Logstash itself is working, because it also sends logs to OpenSearch and that works perfectly.

To Reproduce:
1. Install the Loki output plugin in Logstash (install command sketched below).
2. Configure the output to send logs to Loki.
3. Generate logs.
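For reference, the plugin is normally installed with Logstash's plugin manager; a minimal sketch (the exact path to `logstash-plugin` depends on the installation or image):

    # install the Loki output plugin into an existing Logstash installation
    bin/logstash-plugin install logstash-output-loki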

Expected behavior: Logs sent from Logstash should arrive in Loki.

Environment:

Versions:
- Loki: 2.7.0
- Logstash: 8.4.0 (Docker image: opensearchproject/logstash-oss-with-opensearch-output-plugin)

Configs

Loki:

    auth_enabled: false
    common:
      path_prefix: /var/loki
      replication_factor: 1
      ring:
        instance_addr: 127.0.0.1
        kvstore:
          store: inmemory
      storage:
        filesystem:
          chunks_directory: /var/loki/chunks
          rules_directory: /var/loki/rules
    limits_config:
      enforce_metric_name: false
      max_cache_freshness_per_query: 10m
      reject_old_samples: true
      reject_old_samples_max_age: 168h
      split_queries_by_interval: 15m
    memberlist:
      join_members:
      - loki-memberlist
    query_range:
      align_queries_with_step: true
    schema_config:
      configs:
      - from: "2022-01-11"
        index:
          period: 24h
          prefix: loki_index_
        object_store: filesystem
        schema: v12
        store: boltdb-shipper
    server:
      grpc_listen_port: 9095
      http_listen_port: 3100
    storage_config:
      hedging:
        at: 250ms
        max_per_second: 20
        up_to: 3

Logstash:

    input {
      http {
        port      => 9601
        add_field => { "[@metadata][input-http]" => "" }
      }
    }
    filter {
      if [@metadata][input-http] == "" {
        date {
          match         => [ "date", "UNIX" ]
          remove_field  => [ "date" ]
        }
        mutate {
          remove_field => ["headers","host","_p"]
        }
      }
    }
    output {
      if [cluster] == "development" {
        loki {
          url => "http://loki.monitoring.svc.cluster.local:3100/loki/api/v1/push"
          insecure_skip_verify => true
        }
        opensearch {
          hosts => ["${ELASTIC_HOST}"]
          auth_type => {
            type      => 'basic'
            user      => "${ELASTIC_USERNAME}"
            password  => "${ELASTIC_PASSWORD}"
          }
          index  => "logstash-dev-%{+YYYY.MM.dd}"
          action => "create"
        }
      }
    }

Logs:

Logstash

08:05:45.562 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@insecure_skip_verify = true
08:05:45.566 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@url = "http://loki.monitoring.svc.cluster.local:3100/loki/api/v1/push"
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@id = "1e66f1cde107730d9939ed0d7f0e62e5335cc37b00cd4a592134c21df3788d32"
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@enable_metric = true
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@codec = <LogStash::Codecs::Plain id=>"plain_743803b4-94b5-4e25-a402-f374086b1017", enable_metric=>true, charset=>"UTF-8">
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@workers = 1
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@batch_size = 102400
08:05:45.567 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@batch_wait = 1
08:05:45.568 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@message_field = "message"
08:05:45.568 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@min_delay = 1
08:05:45.568 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@include_fields = []
08:05:45.568 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@max_delay = 300
08:05:45.568 [Converge PipelineAction::Create<main>] DEBUG logstash.outputs.loki - config LogStash::Outputs::Loki/@retries = 10
08:06:14.233 [[main]-pipeline-manager] INFO  logstash.outputs.loki - Loki output plugin {:class=>"LogStash::Outputs::Loki"}

If I send a log with curl from Logstash to Loki, it works, so I suspect the problem is in the Loki output plugin, because Logstash also sends the logs to OpenSearch as expected.
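For reference, the manual push was roughly of this shape (a sketch against Loki's push API; the `job` label and the message are placeholders, and the timestamp must be in nanoseconds):

    curl -s -X POST "http://loki.monitoring.svc.cluster.local:3100/loki/api/v1/push" \
      -H "Content-Type: application/json" \
      --data-raw "{\"streams\":[{\"stream\":{\"job\":\"curl-test\"},\"values\":[[\"$(date +%s%N)\",\"test log line from curl\"]]}]}"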

Thanks

chaudum commented 1 year ago

Hi @ichasco-heytrade, did you try the official Loki Logstash output plugin as well? Do you receive any requests on Loki?

ichasco-heytrade commented 1 year ago

Hi @chaudum. No, I haven't tried Loki's Logstash image, only OpenSearch's image, because that was the output I was already using. And no, I don't receive any requests on Loki. I also ran it in debug mode to look for errors and didn't find anything.

ichasco-heytrade commented 1 year ago

The Loki Logstash Docker image is broken; I can't install the other plugins that I need:

Gem::LoadError: You have already activated clamp 1.3.2, but your Gemfile requires clamp 1.0.1. Prepending `bundle exec` to your command may solve this.
           check_for_activated_spec! at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-2.3.6/lib/bundler/runtime.rb:309
                               setup at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-2.3.6/lib/bundler/runtime.rb:25
                                each at org/jruby/RubyArray.java:1821
                                each at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-2.3.6/lib/bundler/spec_set.rb:136
                                 map at org/jruby/RubyEnumerable.java:886
                               setup at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-2.3.6/lib/bundler/runtime.rb:24
                               setup at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-2.3.6/lib/bundler.rb:151
                              setup! at /usr/share/logstash/lib/bootstrap/bundler.rb:79
  update_logstash_mixin_dependencies at /usr/share/logstash/lib/pluginmanager/install.rb:187
                             execute at /usr/share/logstash/lib/pluginmanager/install.rb:77
                                 run at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-1.3.2/lib/clamp/command.rb:66
                             execute at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-1.3.2/lib/clamp/subcommand/execution.rb:18
                                 run at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-1.3.2/lib/clamp/command.rb:66
                                 run at /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-1.3.2/lib/clamp/command.rb:140
                              <main> at /usr/share/logstash/lib/pluginmanager/main.rb:64

ichasco-heytrade commented 1 year ago

The version of the Loki output plugin I am using is logstash-output-loki (1.1.0).

adityacs commented 1 year ago

@ichasco-heytrade Loki needs at least one label in the stream to accept the data. In your case, it seems the event the loki-output plugin receives is something like this:

"@version" => "1",
 "@timestamp" => 2022-12-30T10:23:09.300Z,
 "message" => "{"request_duration": "20"}"

The loki-output plugin transforms this into the following before sending it to the Loki server:

 {:payload=>"{\"streams\":[{\"stream\":{},\"values\":[[\"1672395335076999936\",\"{\\\"reuqest_duration\\\": \\\"20\\\"}\"]]}]}"}

message and @timestamp get transformed into values. Since there are no other fields without an @ prefix, no labels are added, so the stream is empty.

To fix your issue, add a field, or transform an existing one to remove the @ prefix, perhaps using a filter (see the sketch below).
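A minimal sketch of such a filter, assuming a hypothetical static `job` label (per the explanation above, any field whose name does not start with @ becomes a Loki stream label):

    filter {
      mutate {
        # fields without a leading "@" are turned into Loki stream labels,
        # so add one explicitly (the name and value here are placeholders)
        add_field => { "job" => "logstash" }
        # or expose an existing value under a label-friendly name, e.g.:
        # copy => { "[@metadata][input-http]" => "source" }
      }
    }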

mfhanif commented 1 year ago

Hi, I encounter the same error. I send JHipster application logs from Logstash to Grafana Cloud and get the error below:

[2023-10-16T06:08:21,743][DEBUG][logstash.outputs.loki    ][jhipster-app] failed payload {:payload=>"{\"streams\":[{\"stream\":{\"app_name\":\"sfa-phapros_prod\",\"app_port\":\"8080\",\"level\":\"WARN\",\"level_value\":\"30000\",\"logger_name\":\"org.zalando.problem.spring.common.AdviceTraits\",\"thread_name\":\"XNIO-1 task-10\",\"type\":\"syslog\"},\"values\":[[\"1697436500252000000\",\"Unauthorized: Full authentication is required to access this resource\"]]},{\"stream\":{\"app_name\":\"sfa-phapros_prod\",\"app_port\":\"8080\",\"level\":\"WARN\",\"level_value\":\"30000\",\"logger_name\":\"org.springframework.web.servlet.mvc.method.annotation.ExceptionHandlerExceptionResolver\",\"thread_name\":\"XNIO-1 task-10\",\"type\":\"syslog\"},\"values\":[[\"1697436500252999936\",\"Resolved [org.springframework.security.authentication.InsufficientAuthenticationException: Full authentication is required to access this resource]\"]]}]}"}

Logstash pipeline:

input {
    tcp {
        port => 5000
        type => syslog
        codec => json_lines
    }
}

output {
    elasticsearch {
            cloud_id =>  ...
            api_key =>  ...
            index => "logs-%{+YYYY.MM.dd}"
    }
    stdout {}
    loki {
        url =>  ...
        username =>  ...
        password =>  ...
    }
}

When I send to Elasticsearch, it succeeds. I use the official logstash-output-loki plugin (https://grafana.com/docs/loki/latest/send-data/logstash/). Can anyone help?