Apache Camel Kafka Connector Examples
https://camel.apache.org

Kamelet with id cassandra-sink not found in locations: classpath:/kamelets #336

Closed: nathanhagemann closed this issue 2 years ago

nathanhagemann commented 2 years ago

First, thank you so much for creating these excellent Camel libraries.

Second, this may not be the right place to put this question. I don't have an issue with the connector. My inexperience is the real issue.

Background: I want to sink a JSON message from a Kafka (open-source) topic to a Cassandra (open-source) table with Kafka Connect.

Kafka version: 2.13-3.0.0 (Scala: 2.13, Kafka: 3.0.0)
IP/ports: 10.188.5.86:9092,10.188.5.86:9093,10.188.5.86:9094
Topic: pie (./bin/kafka-topics.sh --bootstrap-server 10.188.5.86:9092,10.188.5.86:9093,10.188.5.86:9094 --create --replication-factor 1 --partitions 1 --topic pie)

Simple JSON pie data:

{"Type:":"Apple","Invented":1381}
{"Type:":"Pecan","Invented":1870}
{"Type:":"Cherry","Invented":1500}

Cassandra version: 4.0.3
IP: 10.66.16.10
Port: 9042
User: cameldevloader
Pass: newpassword

USE dev;
CREATE TABLE pie(type varchar, invented double, PRIMARY KEY (type));
INSERT INTO pie (type, invented) VALUES ('Chess',1750);
SELECT * FROM pie;

yields: Chess 1750.0

Kafka Connect

Using this Camel connector: https://camel.apache.org/camel-kafka-connector/1.0.x/reference/connectors/camel-cassandra-sink-kafka-sink-connector.html

Downloaded from https://repo.maven.apache.org/maven2/org/apache/camel/kafkaconnector/camel-cassandra-sink-kafka-connector/1.0.0/camel-cassandra-sink-kafka-connector-1.0.0-package.tar.gz and copied camel-cassandra-sink-kafka-connector-1.0.0-package.tar.gz to the Kafka server at /root/software/camel-cassandra-sink-kafka-connector-1.0.0-package.tar.gz.

On my Kafka server, in the /root/connectors/ directory, I unpacked it: tar -xvzf ../software/camel-cassandra-sink-kafka-connector-1.0.0-package.tar.gz

Updated /root/kafka/config/connect-standalone.properties:

######################################################
# (Apache Software Foundation license header omitted)

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=10.188.5.86:9092,10.188.5.86:9093,10.188.5.86:9094

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
plugin.path=/root/connectors/
######################################################

Created (from docs/examples) /root/myconnectorproperties/CamelCassandraPieSink.properties:

######################################################
## (Apache Software Foundation license header omitted)

name=CamelCassandra-sinkSinkConnector
connector.class=org.apache.camel.kafkaconnector.cassandrasink.CamelCassandrasinkSinkConnector
tasks.max=1

# use the kafka converters that better suit your needs, these are just defaults:
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

# comma separated topics to get messages from
topics=pie

# mandatory properties (for a complete properties list see the connector documentation):

# Hostname(s) cassandra server(s). Multiple hosts can be separated by comma. Example: localhost
camel.kamelet.cassandra-sink.connectionHost=10.66.16.10
# Port number of cassandra server(s) Example: 9042
camel.kamelet.cassandra-sink.connectionPort=9042
# user
camel.kamelet.cassandra-sink.username=cameldevloader
# password
camel.kamelet.cassandra-sink.password=newpassword
# Keyspace to use Example: customers
camel.kamelet.cassandra-sink.keyspace=dev
# The query to execute against the Cassandra cluster table
camel.kamelet.cassandra-sink.query=INSERT INTO pie (type, invented) VALUES (?,?)
######################################################

Connect command: /kafka # ./bin/connect-standalone.sh /root/kafka/config/connect-standalone.properties /root/myconnectorproperties/CamelCassandraPieSink.properties

Error: Kamelet with id cassandra-sink not found in locations: classpath:/kamelets

Full log attached: pie_result.txt
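
One quick sanity check for this error is whether the kamelet definition is on the connector classpath at all. A minimal sketch, assuming the 1.0.0 package ships it as kamelets/cassandra-sink.kamelet.yaml inside one of its jars and that unzip is installed:

for jar in /root/connectors/camel-cassandra-sink-kafka-connector/*.jar; do
  # list each jar's entries and look for the kamelet resource
  unzip -l "$jar" 2>/dev/null | grep -q 'kamelets/cassandra-sink' && echo "kamelet found in $jar"
done

If nothing is printed, the runtime has nothing to resolve under classpath:/kamelets, which would match the error above.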

nathanhagemann commented 2 years ago

Also tried with CamelCassandraQLSinkConnector:

######################################
name=CamelCassandraQLSinkConnector
topics=pie
tasks.max=1
connector.class=org.apache.camel.kafkaconnector.cql.CamelCqlSinkConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

camel.sink.path.hosts=10.66.16.10
camel.sink.path.port=9042
camel.sink.path.keyspace=dev
camel.sink.endpoint.cql=INSERT INTO pie (type, invented) VALUES (?,?)
camel.sink.endpoint.username=cameldevloader
camel.sink.endpoint.password=newpassword
######################################

This works, but all of the data ends up in the first column, like this:

type                             | invented
{"type":"Pecan","invented":1870} | NaN

oscerd commented 2 years ago

You should pass your JSON file line by line; as far as I can see, you're passing the full JSON as a single record in your Kafka topic.
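
For illustration, a minimal sketch with the stock console producer, where every line read from stdin becomes a separate Kafka record (broker address taken from the setup above):

./bin/kafka-console-producer.sh --bootstrap-server 10.188.5.86:9092 --topic pie <<'EOF'
{"Type:":"Apple","Invented":1381}
{"Type:":"Pecan","Invented":1870}
{"Type:":"Cherry","Invented":1500}
EOF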

oscerd commented 2 years ago

For the 1.0.x version, can you please report the content of your plugin.path?

nathanhagemann commented 2 years ago

Thank you for your help oscerd,

When I pass in ["Pecan","1870"] as my message I just get ["Pecan","1870"] in the first column of the Cassandra table. When I pass in "pecan","1871" as the message I get "pecan","1871" in the first column of the Cassandra table. What would passing the Json file line by line look like?

for 1.0.x I'm using this plugin.path=/root/connectors/

Here is my connect-standalone.properties:

#######################################################################
# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=10.188.5.86:9092,10.188.5.86:9093,10.188.5.86:9094

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
plugin.path=/root/connectors/
############################################################

with this CamelCassandraPieQLSink.properties:

############################################################
name=CamelCassandraQLSinkConnector
topics=pie
tasks.max=1
connector.class=org.apache.camel.kafkaconnector.cql.CamelCqlSinkConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter

camel.sink.path.hosts=10.66.16.10
camel.sink.path.port=9042
camel.sink.path.keyspace=dev
camel.sink.endpoint.cql=INSERT INTO pie (type, invented) VALUES (?,?)
camel.sink.endpoint.username=cameldevloader
camel.sink.endpoint.password=newpassword
############################################################

At /root/connectors/ I have: camel-cassandra-sink-kafka-connector and camel-cql-kafka-connector

Both contain many jar files, here's what is in camel-cql-kafka-connector: HdrHistogram-2.1.12.jar jackson-dataformat-avro-2.12.2.jar LICENSE.txt jackson-datatype-jdk8-2.12.2.jar NOTICE.txt java-driver-core-4.11.1.jar README.adoc java-driver-query-builder-4.11.1.jar annotations-13.0.jar java-driver-shaded-guava-25.1-jre-graal-sub-1.jar apicurio-registry-common-1.3.2.Final.jar javax.annotation-api-1.3.2.jar apicurio-registry-rest-client-1.3.2.Final.jar jboss-jaxrs-api_2.1_spec-2.0.1.Final.jar apicurio-registry-utils-converter-1.3.2.Final.jar jcip-annotations-1.0-1.jar apicurio-registry-utils-serde-1.3.2.Final.jar jctools-core-3.3.0.jar asm-9.1.jar jffi-1.3.1-native.jar asm-analysis-9.1.jar jffi-1.3.1.jar asm-commons-9.1.jar jnr-a64asm-1.0.0.jar asm-tree-9.1.jar jnr-constants-0.10.1.jar asm-util-9.1.jar jnr-ffi-2.2.2.jar avro-1.10.2.jar jnr-posix-3.1.5.jar camel-api-3.11.5.jar jnr-x86asm-1.0.2.jar camel-base-3.11.5.jar json-20090211.jar camel-base-engine-3.11.5.jar jsr305-3.0.2.jar camel-cassandraql-3.11.5.jar kafka-clients-2.8.0.jar camel-core-engine-3.11.5.jar kotlin-reflect-1.3.20.jar camel-core-languages-3.11.5.jar kotlin-stdlib-1.3.20.jar camel-core-model-3.11.5.jar kotlin-stdlib-common-1.3.20.jar camel-core-processor-3.11.5.jar lz4-java-1.7.1.jar camel-core-reifier-3.11.5.jar medeia-validator-core-1.1.1.jar camel-cql-kafka-connector-0.11.5.jar medeia-validator-jackson-1.1.1.jar camel-direct-3.11.5.jar metrics-core-4.1.19.jar camel-jackson-3.11.5.jar native-protocol-1.5.0.jar camel-kafka-3.11.5.jar netty-buffer-4.1.66.Final.jar camel-kafka-connector-0.11.5.jar netty-codec-4.1.66.Final.jar camel-main-3.11.5.jar netty-common-4.1.66.Final.jar camel-management-api-3.11.5.jar netty-handler-4.1.66.Final.jar camel-seda-3.11.5.jar netty-resolver-4.1.66.Final.jar camel-support-3.11.5.jar netty-transport-4.1.66.Final.jar camel-util-3.11.5.jar okhttp-4.8.1.jar commons-compress-1.20.jar okio-2.7.0.jar config-1.4.1.jar protobuf-java-3.13.0.jar connect-json-2.6.0.jar reactive-streams-1.0.3.jar converter-jackson-2.9.0.jar retrofit-2.9.0.jar esri-geometry-api-1.2.1.jar slf4j-api-1.7.30.jar jackson-annotations-2.12.3.jar snappy-java-1.1.8.1.jar jackson-core-2.12.3.jar spotbugs-annotations-3.1.12.jar jackson-core-asl-1.9.12.jar zstd-jni-1.4.9-1.jar jackson-databind-2.12.3.jar

Thanks again for helping me.

nathanhagemann commented 2 years ago

Also, each of these lines is entered as a separate message:

{"Type:":"Apple","Invented":1381}
{"Type:":"Pecan","Invented":1870}
{"Type:":"Cherry","Invented":1500}

oscerd commented 2 years ago

Thank you for your help oscerd,

When I pass in ["Pecan","1870"] as my message I just get ["Pecan","1870"] in the first column of the Cassandra table. When I pass in "pecan","1871" as the message I get "pecan","1871" in the first column of the Cassandra table. What would passing the Json file line by line look like?

Not sure if you noticed that you're using the TYPE field as primary key in your table definition

CREATE TABLE pie(type varchar, invented double, PRIMARY KEY (type));

In this case, when you pass Pecan the second time with 1871 as the invented field, the original row (the one with 1870 as the invented field) will be overwritten.

So you can change the definition in this way

CREATE TABLE pie(type varchar, invented double, PRIMARY KEY (type, invented));

Or add an id

CREATE TABLE pie(id UUID PRIMARY KEY, type varchar, invented double);
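
For reference, the overwrite is just standard CQL upsert behaviour on the original table definition, e.g. (values taken from this thread; invented is a double):

INSERT INTO pie (type, invented) VALUES ('Pecan', 1870);
INSERT INTO pie (type, invented) VALUES ('Pecan', 1871);
-- the second INSERT silently replaces the row for the same primary key
SELECT * FROM pie WHERE type = 'Pecan';   -- returns a single row: Pecan | 1871.0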


nathanhagemann commented 2 years ago

Good suggestion; I tried adding in a pieid. I'm still getting the JSON unparsed: when I produce {"pietype":"pecan","pieinvented":1871} into the Kafka topic, the whole JSON lands unparsed in the pietype field and nothing lands in pieinvented. pieid looks fine because it's hardcoded to now() in the properties. I'm attaching my files. Thank you.

Attachments: CamelCassandraPieQLSink.properties.txt, connect-standalone.properties.txt, pie_console_producer, pie_cassandra_data, pie_result.txt

oscerd commented 2 years ago

The problem is the input you're passing: if you want to pass parameters to the query, you need to provide an array.

So in your case the message should be something like

["Pecan", 1871]

nathanhagemann commented 2 years ago

Thank you Oscerd, I tried this but I'm still getting all of the data in one column on the Cassandra side.

Attachments: pie_list_console_producer, pie_list_cassandra_data

oscerd commented 2 years ago

You need to specify the Content-Type as application/json. You're probably sending it as a plain string.

oscerd commented 2 years ago

I need to reproduce this; I don't have much time these days.

nathanhagemann commented 2 years ago

Thank you, appreciate your idea. I’ll try that.



nathanhagemann commented 2 years ago

Where do I specify the content type? When I create the topic (kafka-topics.sh), when I produce the message (kafka-console-producer), or in connect-standalone.properties or my CamelCassandraPieQLSink.properties? I've been googling it and all I can find are people specifying it in the headers, but they aren't doing what we're doing.

nathanhagemann commented 2 years ago

I tried changing the key.converter and value.converter in CamelCassandraPieQLSink.properties from org.apache.kafka.connect.storage.StringConverter to org.apache.kafka.connect.json.JsonConverter, but it didn't work.

oscerd commented 2 years ago

You need to pass a header with the payload. To do that I usually use kafkacat:

echo '["Pecan", 1871]' | ./kafkacat -P -b localhost:9092 -t mytopic -H "CamelHeader.Content-Type=application/json"
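
If kafkacat is not available, a rough equivalent can be sketched with the plain Java kafka-clients producer API (the class name here is made up, broker and topic are the ones from this thread, and the header key mirrors the one the kafkacat command sets):

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PieProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "10.188.5.86:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // the value is the JSON array of query parameters, as suggested above
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("pie", "[\"Pecan\", 1871]");
            // same header the kafkacat command sets with -H
            record.headers().add("CamelHeader.Content-Type",
                    "application/json".getBytes(StandardCharsets.UTF_8));
            producer.send(record);
        }
    }
}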

nathanhagemann commented 2 years ago

Ah, I don't have kafkacat. I did have some luck with a regular Camel route, though.

pom.xml =====================================start

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.6.6</version>
    </parent>
    <groupId>com.camel.route</groupId>
    <artifactId>kafka-to-cassandra-example</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>kafka-to-cassandra-example</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>1.8</java.version>
        <camel.version>3.14.0</camel.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-spring-boot-starter</artifactId>
            <version>${camel.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-jackson-starter</artifactId>
            <version>${camel.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-kafka-starter</artifactId>
            <version>${camel.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-cassandraql-starter</artifactId>
            <version>${camel.version}</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

pom.xml =====================================end

application.properties=============================start

# Kafka Camel configuration
logging.level.org.springframework: DEBUG
camel.component.kafka.groupId=consumer1
camel.component.kafka.brokers=10.188.5.86:9092,10.188.5.86:9093,10.188.5.86:9094

# Cassandra
cassandra.username=cameldevloader
cassandra.password=newpassword
cassandra.ip=10.66.16.10
cassandra.port=9042
cassandra.keyspace=dev

application.properties=============================end

PieRoute.java===================================start

package com.camel.route.kafkatocassandraexample.route;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.springframework.stereotype.Component;

@Component
public class PieRoute extends RouteBuilder {

private String cql = "INSERT INTO pie (\r\n"
        + "pieid,\r\n"
        + "pietype,\r\n"
        + "pieinvented"
        + ") VALUES (\r\n"
        + "now(),\r\n"
        + "'${body[pietype]}',\r\n"
        + "${body[pieinvented]});";

@Override
public void configure() throws Exception {
    //final String mysql = "insert into car (carnm) values ('Lexus');";
    from("kafka:pie"
            + "?brokers={{camel.component.kafka.brokers}}"
            + "&groupId={{camel.component.kafka.groupId}}"
            + "&seekTo=beginning"
            )

    .unmarshal().json(JsonLibrary.Jackson)
    .setHeader("CamelCqlQuery", simple(cql))
    .setBody().simple("${null}")
    .process(new Processor() {
        public void process(Exchange exchange) throws Exception {
            String mycql = exchange.getMessage().getHeader("CamelCqlQuery").toString();
            // if pieinvented is not provided, add null keyword to insert statement
            mycql = mycql.replace("\r\n)","\r\nnull)");
            exchange.getIn().setHeader("CamelCqlQuery", mycql);
            //System.out.println(mycql);
        }
     })
    .to("cql://{{cassandra.ip}}/{{cassandra.keyspace}}"
            + "?username={{cassandra.username}}"
            + "&password={{cassandra.password}}");
}

}

PieRoute.java===================================end
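
A minimal way to run the route, assuming a standard Spring Boot Maven setup with Maven and a JDK on the PATH:

# from the project root
mvn spring-boot:run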

Attachments: kafka-pie, cassandra-pie

Hope this helps someone.

nathanhagemann commented 2 years ago

Really appreciate all the help and suggestions!