casssoft opened this issue 3 years ago
/cc @aslom
This issue is stale because it has been open for 90 days with no activity. It will automatically close after 30 more days of inactivity. Reopen the issue with /reopen. Mark the issue as fresh by adding the comment /remove-lifecycle stale.
/remove-lifecycle stale
I believe the eventing-kafka-broker can easily support this use case with a simple flag that skips topic creation and uses the existing topic instead. WDYT @pierDipi?
Yeah, we can support it. We just need to decide the API surface and then it could even be a good first contribution.
@pierDipi can you open an issue? Whatever API is fine for me, even an annotation
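To make the annotation idea concrete, here is a hypothetical sketch of what such a Broker manifest could look like. The annotation name `kafka.eventing.knative.dev/external.topic` and the topic/ConfigMap names are illustrative only; the actual API surface was still undecided at this point in the thread.

```yaml
# Hypothetical sketch: a Kafka Broker pointed at a pre-existing,
# externally managed topic, so the controller does not create one.
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: existing-topic-broker
  annotations:
    eventing.knative.dev/broker.class: Kafka
    # Illustrative annotation: use this topic and skip auto-creation.
    kafka.eventing.knative.dev/external.topic: my-preexisting-topic
spec:
  config:
    apiVersion: v1
    kind: ConfigMap
    name: kafka-broker-config
    namespace: knative-eventing
```

An annotation like this keeps the Broker spec itself unchanged, which is why it could work as a small, backwards-compatible first contribution.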
cc @itsmurugappan, this should be related to what you were sharing on #eventing-delivery regarding using a user-provided topic.
I opened an issue https://github.com/knative-sandbox/eventing-kafka-broker/issues/977
Yes @pierDipi, this is what kafka-topic-channel addresses, and this is how I defined the channel spec: https://github.com/Optum/kafka-topic-channel/blob/main/samples/pingsourceevents/channel-cm.yaml. The Broker works the same way.
/remove-lifecycle stale
/remove-lifecycle stale /triage accepted
Problem
In my cluster I already have a Kafka topic managed by a non-Knative system. I want a Broker/Trigger setup to read events off that Kafka topic and filter them to different services, but ideally without the overhead of an extra Kafka topic that exists only to copy events into the Broker system.
This is a follow up from https://github.com/knative-sandbox/eventing-kafka/issues/215#issuecomment-733891085
The reason I want to use the Broker/Trigger setup is that it supports event filtering. It also seems like the best way to use Knative Eventing when you have a big bucket of events that you want to separate out to multiple Knative services.
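The filtering half of this already exists in the Trigger API. A sketch of the desired end state, assuming a Broker named `existing-topic-broker` backed by the externally managed topic (the event type and Service name are placeholders):

```yaml
# Route only matching events from the Broker to one consumer service.
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: orders-trigger
spec:
  broker: existing-topic-broker
  filter:
    attributes:
      # Deliver only CloudEvents with this type attribute.
      type: com.example.order.created
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-processor
```

Multiple Triggers with different filters against the same Broker would fan the "big bucket of events" out to multiple services; the open question in this issue is only how the Broker binds to the existing topic.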
Here are the best ways I know of to tackle this:
Persona: Event consumer
Exit Criteria: A clear path for this use case.
(Sorry that this is vague, part of the reason for this issue is to move the discussion out of issue 215 and hopefully help guide others in a similar position.)
Additional context (optional): See the discussion ending with this comment: https://github.com/knative-sandbox/eventing-kafka/issues/215#issuecomment-733891085
@pierDipi Let me know if you had something else in mind when you said to create an issue.
Also I really appreciate the effort to guide me in the right direction!