vectordotdev / vector

A high-performance observability data pipeline.
https://vector.dev
Mozilla Public License 2.0

Enable OIDC authentication with the kafka source #21605

Open · rightly opened 1 month ago

rightly commented 1 month ago


Problem

When I use Kafka with the OAUTHBEARER mechanism, I get these errors:

ERROR vector::topology::builder: Configuration error. error=Sink "kafka": creating kafka producer failed: Client config error: Configuration property "sasl.oauthbearer.token.endpoint.url" not supported in this build: OAuth/OIDC depends on libcurl and OpenSSL which were not available at build time sasl.oauthbearer.token.endpoint.url
ERROR vector::topology::builder: Configuration error. error=Sink "kafka": creating kafka producer failed: Client config error: Configuration property "sasl.oauthbearer.client.id" not supported in this build: OAuth/OIDC depends on libcurl and OpenSSL which were not available at build time sasl.oauthbearer.client.id
ERROR vector::topology::builder: Configuration error. error=Sink "kafka": creating kafka producer failed: Client config error: Configuration property "sasl.oauthbearer.method" not supported in this build: OAuth/OIDC depends on libcurl and OpenSSL which were not available at build time sasl.oauthbearer.method

I think there was a patch to enable OAUTHBEARER. How do I fix this?

Configuration

kafka:
    bootstrap_servers: kafka-domain:10992
    librdkafka_options:
      sasl.mechanism: "OAUTHBEARER"
      sasl.oauthbearer.token.endpoint.url: "https://oauth-url"
      sasl.oauthbearer.client.id: "client"
      sasl.oauthbearer.client.secret: "secrets"
      sasl.oauthbearer.scope: "domain"
      sasl.oauthbearer.method: "OIDC"
      ssl.ca.location: "/etc/vector/ca.crt"
      security.protocol: "SASL_SSL"

Version

0.42.0

Debug Output

No response

Example Data

No response

Additional Context

No response

References

No response

jszwedko commented 4 weeks ago

https://github.com/vectordotdev/vector/pull/21103 was intended to enable this, but it looks like there are some more changes needed. I can confirm that I receive the same error with the above configuration.

cc/ @zapdos26 in case you have any thoughts.

zapdos26 commented 4 weeks ago

So, I've been using the one I built for a while now. Let me check what might have changed.

zapdos26 commented 4 weeks ago

Okay, so after testing, it's due to the fact that curl was changed to curl-sys. Apparently curl-sys is not valid for enabling OAuth/OIDC.

Building with curl should work.

jszwedko commented 4 weeks ago

Ah, I see, interesting. This may be a bug in rust-rdkafka or librdkafka, in that including curl-static turns out to be insufficient: statically compiling in libcurl should have made OAuth/OIDC available. We try to avoid dynamic linking in Vector to reduce dependencies on the host system.

It sounds like a workaround would be to build Vector yourself with the curl feature rather than curl-static for rust-rdkafka.
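Concretely, that would mean a change to the rdkafka dependency in Cargo.toml along these lines (illustrative sketch only; the real entry pins a version and lists more features than shown here):

```toml
# Sketch of the Cargo.toml change. "curl" and "curl-static" are real rdkafka
# crate features; the version and the rest of the feature list are placeholders.

# before: static libcurl, with which librdkafka reports OAuth/OIDC as
# unavailable at build time
# rdkafka = { version = "x.y", features = ["tokio", "curl-static"] }

# after: the curl feature instead, so the sasl.oauthbearer.* properties work
rdkafka = { version = "x.y", features = ["tokio", "curl"] }
```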

jszwedko commented 4 weeks ago

Thanks for looking at that quickly @zapdos26 !

rightly commented 4 weeks ago

@zapdos26, @jszwedko Thanks!!

Could you provide me with a package (a build)?

rightly commented 3 weeks ago

@jszwedko I tried to build on RHEL 8, but I got an error. Can you help me?

  1. curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --default-toolchain stable
  2. mkdir -p vector && curl -sSfL --proto '=https' --tlsv1.2 https://api.github.com/repos/vectordotdev/vector/tarball/v0.42.0 | tar xzf - -C vector --strip-components=1
  3. Change curl-static to curl in Cargo.toml
  4. make build

    
    -- output
    Error configuring OpenSSL build:
      Command: cd "/~/src/vector/target/release/build/openssl-sys-c875dce6ad3fcfac/out/openssl-build/build/src" && env -u CROSS_COMPILE AR="ar" CC="cc" RANLIB="ranlib" "perl" "./Configure" "--prefix=/home1/~/vector/target/release/build/openssl-sys-c875dce6ad3fcfac/out/openssl-build/install" "--openssldir=/usr/local/ssl" "no-dso" "no-shared" "no-ssl3" "no-tests" "no-comp" "no-zlib" "no-zlib-dynamic" "--libdir=lib" "no-md2" "no-rc5" "no-weak-ssl-ciphers" "no-camellia" "no-idea" "no-seed" "linux-x86_64" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-g0" "-O3"
      Exit status: exit status: 2
    
    note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
    warning: build failed, waiting for other jobs to finish...
    make: *** [Makefile:195: build] Error 101

jszwedko commented 3 weeks ago

make build builds a development build. Try the make task that builds for your specific platform; for example, this might be make build-x86_64-unknown-linux-gnu. This should set the right flags.
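Something like this, roughly (the exact build-* target names are defined in the repository Makefile):

```sh
# From the root of the Vector checkout, after the Cargo.toml change:
make build-x86_64-unknown-linux-gnu

# With a --target build, the release binary should end up in the standard
# cargo layout, i.e. target/x86_64-unknown-linux-gnu/release/vector
```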

rightly commented 3 weeks ago

@jszwedko I'm sorry. Can I ask one more thing?

I just built with this command (cargo build --release --target x86_64-unknown-linux-gnu --no-default-features --features target-x86_64-unknown-linux-gnu) and started it up with the oauthbearer config.

But it failed with a segmentation fault.

jszwedko commented 3 weeks ago

That's strange 😓 It looks like it might be a bug in librdkafka; I found a related issue: https://github.com/confluentinc/librdkafka/issues/4505. Maybe @zapdos26 has more details about how they got it working in their custom build 🙏.

rightly commented 3 weeks ago

@jszwedko Thanks a lot!

@zapdos26 I'm sorry... Could you let me know how to build Vector so that the kafka component works with OAUTHBEARER? Or could you share an x86_64 Linux binary?

zapdos26 commented 3 weeks ago

So the above error is actually due to the fact that it's trying to dynamically link to OpenSSL 3 instead of 1.1. Make sure you have OpenSSL 1.1 installed.

You could also try something like this: LD_PRELOAD=/usr/lib/libssl.so.1.1 vector
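To see what the binary will actually load, something like this helps (assuming the build ended up dynamically linked against OpenSSL):

```sh
# Which libssl/libcrypto the vector binary resolves at runtime:
ldd "$(command -v vector)" | grep -Ei 'libssl|libcrypto'

# The host's default OpenSSL version:
openssl version

# Workaround from above; the path is just an example, point it at wherever
# libssl.so.1.1 lives on your system:
LD_PRELOAD=/usr/lib/libssl.so.1.1 vector
```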

hayman42 commented 2 weeks ago

@rightly Does it work properly after rebuilding Vector? I faced the same problem and then built a new Vector with OpenSSL 1.1 as instructed above. The libcurl and OpenSSL errors disappeared, but a new one occurs, as follows. This happens even if I specify ssl.ca.location correctly.

2024-11-06T12:55:07.682043Z ERROR source{component_kind="source" component_id=hadoop_app component_type=kafka}:kafka_source: rdkafka::client: librdkafka: Global error: Authentication (Local: Authentication failure): Failed to acquire SASL OAUTHBEARER token: SSL certificate problem: unable to get local issuer certificate    
2024-11-06T12:55:07.682141Z ERROR source{component_kind="source" component_id=hadoop_app component_type=kafka}:kafka_source: vector::internal_events::kafka: Failed to read message. error=Message consumption error: Authentication (Local: Authentication failure) error_code="reading_message" error_type="reader_failed" stage="receiving" internal_log_rate_limit=true

It seems like it's due to rust-rdkafka using an old librdkafka version. rust-rdkafka is based on librdkafka 2.3.0, but in https://github.com/confluentinc/librdkafka/issues/4761 the same thing occurs with 2.4.0. I then confirmed that it works properly with https://github.com/confluentinc/confluent-kafka-python, which is based on librdkafka 2.6.0.

Is there any workaround, or any plan to upgrade the librdkafka version? @jszwedko


I dug further into this issue and it seems like a version upgrade won't fix it. (I tried building rust-rdkafka against librdkafka 2.6.0, used that to build Vector, and the error still occurs.) We could try this approach, https://github.com/confluentinc/librdkafka/issues/3751#issuecomment-1681480868, which overrides oauthbearer_token_refresh_cb, I guess?
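Rough idea of what that could look like with rust-rdkafka (untested sketch; it assumes a rust-rdkafka version that exposes ClientContext::generate_oauth_token and OAuthToken, and the actual token fetch from the OIDC endpoint is left as a placeholder):

```rust
use std::error::Error;

use rdkafka::client::OAuthToken;
use rdkafka::config::ClientConfig;
use rdkafka::consumer::{ConsumerContext, StreamConsumer};
use rdkafka::ClientContext;

// Context that supplies OAUTHBEARER tokens ourselves instead of relying on
// librdkafka's built-in OIDC support (which needs libcurl at build time).
struct OidcTokenContext;

impl ClientContext for OidcTokenContext {
    // Ask rust-rdkafka to call generate_oauth_token whenever librdkafka
    // needs a fresh token.
    const ENABLE_REFRESH_OAUTH_TOKEN: bool = true;

    fn generate_oauth_token(
        &self,
        _oauthbearer_config: Option<&str>,
    ) -> Result<OAuthToken, Box<dyn Error>> {
        // Placeholder: do the client-credentials request against the OIDC
        // token endpoint here and return the access token plus its lifetime.
        let token = fetch_token_from_oidc_provider()?;
        Ok(OAuthToken {
            token,
            principal_name: "vector".to_string(), // placeholder principal
            lifetime_ms: 300_000,                 // placeholder lifetime
        })
    }
}

// StreamConsumer requires a ConsumerContext; the default behavior is fine.
impl ConsumerContext for OidcTokenContext {}

// Placeholder for the actual HTTP request to the token endpoint.
fn fetch_token_from_oidc_provider() -> Result<String, Box<dyn Error>> {
    unimplemented!("POST to the OIDC token endpoint and return the access token")
}

fn build_consumer() -> Result<StreamConsumer<OidcTokenContext>, Box<dyn Error>> {
    let consumer = ClientConfig::new()
        .set("bootstrap.servers", "kafka-domain:10992")
        .set("security.protocol", "SASL_SSL")
        .set("sasl.mechanism", "OAUTHBEARER")
        .create_with_context(OidcTokenContext)?;
    Ok(consumer)
}
```

Vector itself would have to wire something like this into its kafka source/sink rather than passing the sasl.oauthbearer.* options straight through to librdkafka.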

rightly commented 1 week ago

@hayman42 I also faced the same issue, but I put it off because I didn't have enough time.