camunda-community-hub / kafka-connect-zeebe

Kafka Connect for Zeebe.io
Apache License 2.0

Community Extension Lifecycle: Incubating Compatible with: Camunda Platform 8

:warning: This is not the official Kafka connector for Camunda 8. The official Kafka Producer and Kafka Consumer connectors are found here. This project uses Kafka Connect from Confluent.

kafka-connect-zeebe

This Kafka Connect connector for Zeebe can do two things:

* Activate Zeebe jobs and publish them as records on a Kafka topic (source)
* Consume records from a Kafka topic and correlate them as messages to Zeebe (sink)

It can work with Camunda Platform 8 SaaS or self-managed.

Overview

See this blog post for some background on the implementation.

Examples and walk-through

Examples

The following video walks you through an example connecting to Camunda Platform 8 - SaaS:

Walkthrough

Installation and quickstart

You will find information on how to build the connector and how to run Kafka and Zeebe to get started quickly here:

Installation

Connectors

The plugin comes with two connectors, a source and a sink connector.

Source connector

The source connector activates Zeebe jobs, publishes them as Kafka records, and completes them once they have been committed to Kafka.

Sink connector

In a workflow model you can wait for certain events by name (extracted from the payload by messageNameJsonPath):

Overview

The sink connector consumes Kafka records and publishes messages constructed from those records to Zeebe. This uses Zeebe's message correlation features: for example, if no matching workflow instance is found, the message is buffered for its time-to-live (TTL) and then discarded. You could simply ingest all records from a Kafka topic and check whether they correlate to something in Zeebe.
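As a sketch, a sink connector configuration might look like the following (the connector class and property names here are illustrative; consult the sample sink properties in this repository for the exact keys):

```json
{
  "name": "zeebe-sink",
  "config": {
    "connector.class": "io.zeebe.kafka.connect.ZeebeSinkConnector",
    "topics": "payment-confirmations",
    "message.path.messageName": "$.eventType",
    "message.path.correlationKey": "$.orderId",
    "message.path.ttl": "$.ttl"
  }
}
```

The JSON paths tell the connector where in each record value to find the message name, the correlation key, and the TTL.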

Configuration

In order to communicate with the Zeebe workflow engine, the connector has to create a Zeebe client.

Camunda SaaS Properties

If you want to connect to Camunda SaaS, you can use these properties:
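For example (key names and values are assumptions sketched from the Zeebe client's cloud settings; check the sample properties for the exact spelling):

```properties
# Hypothetical values identifying your Camunda SaaS cluster and API client
zeebe.client.cloud.clusterId=my-cluster-id
zeebe.client.cloud.clientId=my-client-id
zeebe.client.cloud.clientSecret=my-client-secret
```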

If you want to connect to an endpoint other than the public SaaS endpoint, you can additionally specify:

Zeebe Broker Properties

If you want to connect to a Zeebe broker hosted yourself (e.g. running on localhost), use these properties:
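For a local broker this could look like the following sketch (key names are assumptions; the address and port are the Zeebe defaults):

```properties
# Connect to a local Zeebe gateway without TLS
zeebe.client.broker.gateway-address=localhost:26500
zeebe.client.security.plaintext=true
```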

Common Configuration

The Zeebe client and its job workers can be configured via system properties understood by the Zeebe Java client. Other typical properties are:

You can find sample properties for the source connector here.

Sink

The connector supports schemas, but only JSON. It uses JSON paths to extract certain properties from the JSON data:
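For instance, assuming the sink is configured to read the message name from `$.eventType` and the correlation key from `$.orderId` (hypothetical field names), a record value such as:

```json
{
  "eventType": "OrderPaid",
  "orderId": "order-4711",
  "amount": 99.95
}
```

would be published to Zeebe as a message named `OrderPaid` with correlation key `order-4711`.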

You can find sample properties for the sink connector here.

Source

Similar to receiving a message, the process can also create records. In your BPMN process model you can add a ServiceTask with a configurable task type which will create a record on the configured Kafka topic:

Overview

Under the hood, the connector will create one job worker that publishes records to Kafka. The record value is a JSON representation of the job itself, the record key is the job key.
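The exact shape depends on the Zeebe client version, but since the record value is a JSON representation of the job, it looks roughly like this sketch (field names and values are illustrative):

```json
{
  "key": 123456,
  "type": "sendPayment",
  "processInstanceKey": 7890,
  "variables": {
    "orderId": "order-4711"
  }
}
```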

Filtering Variables

You can filter the variables being sent to Kafka by adding a configuration option "job.variables" to your source properties. It must contain a comma-separated list of variables to pass to Kafka.

If this property is not present, then all variables in the scope will be sent to Kafka by default.

{
  "name": "ping",
  "config": {
    ...
    "job.variables": "a, b, andSomeVariableC",
    ...
  }
}
Configuring Error Handling of Kafka Connect, e.g. Logging or Dead Letter Queues

Kafka Connect allows you to configure what happens if a message cannot be processed. A great explanation can be found in Kafka Connect Deep Dive – Error Handling and Dead Letter Queues. This of course also applies to this connector.
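For example, Kafka Connect's standard error-handling options can be added to the sink connector's configuration (the DLQ topic name here is just an example):

```properties
# Tolerate conversion/processing errors instead of failing the task
errors.tolerance=all
# Log failed records, including the message content
errors.log.enable=true
errors.log.include.messages=true
# Route failed records to a dead letter queue topic
errors.deadletterqueue.topic.name=zeebe-sink-dlq
errors.deadletterqueue.context.headers.enable=true
```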

Remote Debugging During Development

To ease development, you can add this environment variable to Kafka Connect: "JAVA_TOOL_OPTIONS": "-agentlib:jdwp=transport=dt_socket,address=*:5005,server=y,suspend=n"

Then attach a remote debugger from your IDE to port 5005.
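If you run Kafka Connect via Docker Compose, this could look like the following sketch (the service name and compose layout are assumptions):

```yaml
connect:
  environment:
    JAVA_TOOL_OPTIONS: "-agentlib:jdwp=transport=dt_socket,address=*:5005,server=y,suspend=n"
  ports:
    - "5005:5005"  # expose the JDWP debug port to the host
```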

Confluent Hub

This project is set up to be released on Confluent Hub.

When