oscerd opened 4 years ago
I've already installed one of the camel-kafka-connectors into my Docker image, and I can share the code.
My example uses camel-http-kafka-connector.
1. First, we need to create the files described below and put them into one folder.
Dockerfile:
```dockerfile
# The image from Docker Hub
FROM confluentinc/cp-kafka-connect:6.0.2

# ------------------------
# Installing Apache Camel Kafka Connectors
# ------------------------
# Getting started with Camel connector packages:
# https://camel.apache.org/camel-kafka-connector/0.11.0/user-guide/getting-started/getting-started-with-packages.html
ARG CAMEL_PLUGIN_PATH="/home/appuser/camel-kafka-connectors"
# An example of the version list per Camel package:
# https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-http-kafka-connector/
ARG CAMEL_REPOSITORY_BASE_URL="https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector"
# See the compatibility matrix: https://camel.apache.org/camel-kafka-connector/0.11.0/user-guide/camel-compatibility-matrix.html
ARG CAMEL_PACKAGE_VERSION="0.6.1"

RUN mkdir -p "${CAMEL_PLUGIN_PATH}" \
 && wget -qO- "${CAMEL_REPOSITORY_BASE_URL}/camel-http-kafka-connector/${CAMEL_PACKAGE_VERSION}/camel-http-kafka-connector-${CAMEL_PACKAGE_VERSION}-package.tar.gz" | tar -C "${CAMEL_PLUGIN_PATH}" -xz

ENV CONNECT_PLUGIN_PATH="/usr/share/java,${CAMEL_PLUGIN_PATH}"
```
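Before building, it can help to double-check that the package for your chosen version actually exists in the Maven repository. This sketch composes the download URL the same way the `RUN` step above does (the variable names mirror the Dockerfile `ARG`s; the `curl` probe at the end is optional):

```sh
# Compose the package URL exactly as the Dockerfile's RUN step does.
CAMEL_REPOSITORY_BASE_URL="https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector"
CAMEL_PACKAGE_VERSION="0.6.1"
CONNECTOR_NAME="camel-http-kafka-connector"
URL="${CAMEL_REPOSITORY_BASE_URL}/${CONNECTOR_NAME}/${CAMEL_PACKAGE_VERSION}/${CONNECTOR_NAME}-${CAMEL_PACKAGE_VERSION}-package.tar.gz"
echo "${URL}"
# Optionally verify the artifact exists without downloading it:
# curl -sfI "${URL}" >/dev/null && echo "artifact found"
```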
connector.json:
```json
{
  "name": "http-sink-connector",
  "config": {
    "tasks.max": "1",
    "topics": "topic1",
    "errors.tolerance": "all",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "errors.deadletterqueue.topic.name": "dlq_topic1",
    "errors.deadletterqueue.topic.replication.factor": "1",
    "connector.class": "org.apache.camel.kafkaconnector.http.CamelHttpSinkConnector",
    "camel.sink.path.httpUri": "echo-server/test",
    "camel.sink.endpoint.httpMethod": "PUT"
  }
}
```
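A malformed config only fails once it hits the Connect REST API, so it can save a round-trip to check the JSON locally first. A minimal sketch (assumes `python3` is on the PATH; it writes a trimmed copy of the example config to `/tmp` so the snippet is self-contained, standing in for the `connector.json` you saved above):

```sh
# Write the example connector config, then verify it parses and names the
# mandatory Kafka Connect fields ("name" and "connector.class").
cat > /tmp/connector.json <<'EOF'
{
  "name": "http-sink-connector",
  "config": {
    "tasks.max": "1",
    "topics": "topic1",
    "connector.class": "org.apache.camel.kafkaconnector.http.CamelHttpSinkConnector",
    "camel.sink.path.httpUri": "echo-server/test",
    "camel.sink.endpoint.httpMethod": "PUT"
  }
}
EOF
python3 - <<'EOF'
import json
cfg = json.load(open("/tmp/connector.json"))
assert cfg["name"], "connector needs a name"
assert "connector.class" in cfg["config"], "config needs connector.class"
print("config OK:", cfg["name"])
EOF
```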
docker-compose.yml:
```yaml
version: '3.4'
services:
  kafka:
    image: confluentinc/cp-kafka:6.0.1
    environment:
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: 'PLAINTEXT:PLAINTEXT'
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://kafka:9092'
      KAFKA_DEFAULT_REPLICATION_FACTOR: 1
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_OFFSETS_TOPIC_NUM_PARTITIONS: 3
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_NUM_PARTITIONS: 3
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_NUM_PARTITIONS: 3
    tmpfs:
      - "/var/lib/kafka/data"
    logging:
      driver: none
  kafka-connect:
    build: "."
    environment:
      CONNECT_BOOTSTRAP_SERVERS: 'kafka:9092'
      CONNECT_GROUP_ID: 'connect-group'
      CONNECT_CONFIG_STORAGE_TOPIC: '__connect-configs'
      # The value -1 means "take the options from the Kafka broker config":
      # https://docs.confluent.io/home/connect/userguide.html#using-ak-broker-default-topic-settings
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: -1
      CONNECT_OFFSET_STORAGE_PARTITIONS: -1
      CONNECT_OFFSET_STORAGE_TOPIC: '__connect-offsets'
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: -1
      CONNECT_STATUS_STORAGE_TOPIC: '__connect-statuses'
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: -1
      CONNECT_STATUS_STORAGE_PARTITIONS: -1
      CONNECT_KEY_CONVERTER: 'org.apache.kafka.connect.storage.StringConverter'
      CONNECT_VALUE_CONVERTER: 'org.apache.kafka.connect.storage.StringConverter'
      # CONNECT_REST_ADVERTISED_HOST_NAME: 'kafka-connect'
      CONNECT_REST_ADVERTISED_HOST_NAME: 'localhost'
      CONNECT_REST_PORT: 8083
      CONNECT_LOG4J_LOGGERS: 'org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR'
    ports:
      - "8083:8083"
  zookeeper:
    image: confluentinc/cp-zookeeper:6.0.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    tmpfs:
      - "/var/lib/zookeeper/data"
      - "/var/lib/zookeeper/log"
    logging:
      driver: none
  # --------------- Helpers -----------------
  kaf-ui:
    image: obsidiandynamics/kafdrop:3.23.0
    ports:
      - "9000:9000"
    environment:
      KAFKA_BROKERCONNECT: 'kafka:9092'
    logging:
      driver: none
  echo-server:
    image: mendhak/http-https-echo:22
    environment:
      HTTP_PORT: 80
```
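If kafka-connect comes up before the broker it will retry on its own, but you can also make Compose aware of readiness. A hypothetical fragment for the `kafka-connect` service (file format 3.4 supports `healthcheck`; the `curl` probe assumes the tool is present in the image, so adjust to your setup):

```yaml
  kafka-connect:
    depends_on:
      - kafka
    healthcheck:
      test: ["CMD-SHELL", "curl -sf http://localhost:8083/ || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10
```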
2. Second, to run this example we should follow these steps:
Open a terminal window, build the kafka-connect image with camel-http-kafka-connector inside it, and start the Docker services:
```sh
docker-compose up
```
Open one more terminal window. Because this example starts kafka-connect in distributed mode, we should create the connector via the REST API:
```sh
curl -H "Accept: application/json" -H "Content-Type: application/json" -X POST --data @connector.json http://localhost:8083/connectors
```
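Re-running that POST fails once the connector already exists. An alternative worth knowing is the idempotent `PUT /connectors/<name>/config` endpoint, which creates the connector on the first call and updates it afterwards; note it expects only the inner `config` object, not the `{"name": ..., "config": ...}` wrapper. A sketch of the extraction (assumes `python3`; a trimmed stand-in for `connector.json` is written to `/tmp` so the snippet is self-contained):

```sh
# Stand-in for the connector.json created in step 1:
cat > /tmp/connector-example.json <<'EOF'
{"name": "http-sink-connector",
 "config": {"connector.class": "org.apache.camel.kafkaconnector.http.CamelHttpSinkConnector"}}
EOF
# PUT /connectors/<name>/config expects only the inner "config" object:
python3 -c 'import json, sys; json.dump(json.load(open("/tmp/connector-example.json"))["config"], sys.stdout)' > /tmp/config-only.json
# Idempotent create-or-update (safe to re-run):
# curl -H "Content-Type: application/json" -X PUT --data @/tmp/config-only.json \
#      http://localhost:8083/connectors/http-sink-connector/config
```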
And produce a message into Kafka:
```sh
docker-compose exec kafka /bin/sh -c 'echo YOUR_MESSAGE | kafka-console-producer --broker-list localhost:9092 --topic topic1'
```
Why is this not merged?
A comment? A comment cannot be merged.
This comment could be added to the docs with a pull request, but it would have to be tested with the latest version first, and we don't have enough people to cover that too.
Oh no, I meant adding these instructions to the docs 😉. No rush. I can create a pull request with the Salesforce connector once I have it working. I thought there was some other official way of using it, and that is why this Docker setup never got added to the docs.
Let's take some information from #95