logstash-plugins / logstash-integration-kafka

Kafka Integration for Logstash, providing Input and Output Plugins
Apache License 2.0

AWS IAM authentication #178

Open · andsel opened this pull request 1 month ago

andsel commented 1 month ago

Bundles all the libraries required to use AWS IAM authentication as a SASL client.

How to test
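The test pipeline from the PR description is not reproduced here. As a point of reference, a minimal kafka input matching the ConsumerConfig values logged in the verification comment below could look like the following sketch. The sasl_jaas_config value is hidden in the log, so the standard MSK IAM login module is assumed, and the plugin option name for the client callback handler is likewise an assumption (the logged Kafka property is sasl.client.callback.handler.class).

input {
  kafka {
    # Endpoint, topic, and consumer group taken from the ConsumerConfig dump below.
    bootstrap_servers => "boot-etkoetue.c1.kafka-serverless.us-east-1.amazonaws.com:9098"
    topics => ["logstash"]
    group_id => "logstash"
    security_protocol => "SASL_SSL"
    sasl_mechanism => "AWS_MSK_IAM"
    # Assumed: the log hides sasl.jaas.config; this is the standard MSK IAM login module.
    sasl_jaas_config => "software.amazon.msk.auth.iam.IAMLoginModule required;"
    # Assumed option name, mapping to the logged sasl.client.callback.handler.class property.
    sasl_client_callback_handler_class => "software.amazon.msk.auth.iam.IAMClientCallbackHandler"
  }
}
output {
  stdout { codec => rubydebug }
}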

andsel commented 4 weeks ago

Verified on EC2, with an attached IAM policy, connecting to an Amazon MSK cluster. A sketch of such a policy is shown below.
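This sketch uses the documented kafka-cluster IAM actions a consumer needs; the account ID, cluster name, and cluster UUID are placeholders, and the topic and group names match the test pipeline:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:Connect",
        "kafka-cluster:DescribeTopic",
        "kafka-cluster:ReadData",
        "kafka-cluster:DescribeGroup",
        "kafka-cluster:AlterGroup"
      ],
      "Resource": [
        "arn:aws:kafka:us-east-1:<account-id>:cluster/<cluster-name>/<cluster-uuid>",
        "arn:aws:kafka:us-east-1:<account-id>:topic/<cluster-name>/<cluster-uuid>/logstash",
        "arn:aws:kafka:us-east-1:<account-id>:group/<cluster-name>/<cluster-uuid>/logstash"
      ]
    }
  ]
}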

Ran the pipeline as described in the description of this PR.

I was able to create the client and connect with AWS IAM:

[2024-08-23T15:27:18,667][INFO ][logstash.runner          ] Log4j configuration path used is: /home/ubuntu/logstash/logstash-8.15.0/config/log4j2.properties
[2024-08-23T15:27:18,673][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.15.0", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
[2024-08-23T15:27:18,675][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-08-23T15:27:18,678][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-08-23T15:27:18,678][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-08-23T15:27:18,895][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-08-23T15:27:19,606][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-08-23T15:27:20,085][INFO ][org.reflections.Reflections] Reflections took 164 ms to scan 1 urls, producing 138 keys and 481 values
[2024-08-23T15:27:20,382][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-08-23T15:27:20,423][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/home/ubuntu/logstash/input_kafka_pipeline.conf"], :thread=>"#<Thread:0x4d4a40cf /home/ubuntu/logstash/logstash-8.15.0/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-08-23T15:27:20,974][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.55}
[2024-08-23T15:27:20,980][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-08-23T15:27:21,007][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-08-23T15:27:21,048][INFO ][org.apache.kafka.clients.consumer.ConsumerConfig][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] ConsumerConfig values:
    allow.auto.create.topics = true
    auto.commit.interval.ms = 5000
    auto.include.jmx.reporter = true
    auto.offset.reset = latest
    bootstrap.servers = [boot-etkoetue.c1.kafka-serverless.us-east-1.amazonaws.com:9098]
    check.crcs = true
    client.dns.lookup = use_all_dns_ips
    client.id = logstash-0
    client.rack =
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = logstash
    group.instance.id = null
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    internal.throw.on.fetch.stable.offset.unsupported = false
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 50
    reconnect.backoff.ms = 50
    request.timeout.ms = 40000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = class software.amazon.msk.auth.iam.IAMClientCallbackHandler
    sasl.jaas.config = [hidden]
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.connect.timeout.ms = null
    sasl.login.read.timeout.ms = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.login.retry.backoff.max.ms = 10000
    sasl.login.retry.backoff.ms = 100
    sasl.mechanism = AWS_MSK_IAM
    sasl.oauthbearer.clock.skew.seconds = 30
    sasl.oauthbearer.expected.audience = null
    sasl.oauthbearer.expected.issuer = null
    sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
    sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
    sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
    sasl.oauthbearer.jwks.endpoint.url = null
    sasl.oauthbearer.scope.claim.name = scope
    sasl.oauthbearer.sub.claim.name = sub
    sasl.oauthbearer.token.endpoint.url = null
    security.protocol = SASL_SSL
    security.providers = null
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    socket.connection.setup.timeout.max.ms = 30000
    socket.connection.setup.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

[2024-08-23T15:27:21,206][INFO ][org.apache.kafka.common.security.authenticator.AbstractLogin][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] Successfully logged in.
[2024-08-23T15:27:21,325][INFO ][org.apache.kafka.common.utils.AppInfoParser][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] Kafka version: 3.4.1
[2024-08-23T15:27:21,326][INFO ][org.apache.kafka.common.utils.AppInfoParser][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] Kafka commitId: 8a516edc2755df89
[2024-08-23T15:27:21,326][INFO ][org.apache.kafka.common.utils.AppInfoParser][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] Kafka startTimeMs: 1724426841320
[2024-08-23T15:27:21,330][INFO ][org.apache.kafka.clients.consumer.KafkaConsumer][main][faab892c9600277d4182df64a79b8bb11b32e9078457a4dde3333bb1458e042d] [Consumer clientId=logstash-0, groupId=logstash] Subscribed to topic(s): logstash
kssminus commented 3 weeks ago

👍 👍 👍 👍 👍 👍 👍 👍 👍 👍 👍 👍 👍 👍 You are A LIFE SAVER!!