I'm defining a Kafka connection like this:

```sql
CREATE OR REPLACE CONNECTION MY_CONNECTION TO '' USER ''
IDENTIFIED BY 'BOOTSTRAP_SERVERS=<host:port>;SCHEMA_REGISTRY_URL=<url>;SSL_ENABLED=true;...';
```
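For reference, a filled-in version might look like the following; the broker and registry addresses here are hypothetical, and any further properties go into the same semicolon-separated string:

```sql
CREATE OR REPLACE CONNECTION MY_CONNECTION TO '' USER ''
IDENTIFIED BY 'BOOTSTRAP_SERVERS=kafka01.example.com:9093;SCHEMA_REGISTRY_URL=https://registry.example.com:8081;SSL_ENABLED=true';
```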
Then I run imports with only the connection as a parameter:
```sql
IMPORT INTO my_schema.my_table (json_doc_col, kafka_partition, kafka_offset)
FROM SCRIPT ETL.KAFKA_CONSUMER WITH
  CONNECTION_NAME = 'MY_CONNECTION'
  TOPIC_NAME      = 'my.topic'
  TABLE_NAME      = 'MY_SCHEMA.MY_TABLE'
  AS_JSON_DOC     = 'true'
;
```
As of the 1.0 release, though, I get an error when doing this:
```
VM error: F-UDF-CL-LIB-1127: F-UDF-CL-SL-JAVA-1002: F-UDF-CL-SL-JAVA-1013:
com.exasol.ExaUDFException: F-UDF-CL-SL-JAVA-1080: Exception during run
com.exasol.cloudetl.kafka.KafkaConnectorException: SCHEMA_REGISTRY_URL must be provided for record type 'avro'
com.exasol.cloudetl.kafka.deserialization.AvroDeserialization$.getSingleColumnJsonDeserializer(AvroDeserialization.scala:29)
com.exasol.cloudetl.kafka.KafkaTopicDataImporter$.run(KafkaTopicDataImporter.scala:52)
com.exasol.cloudetl.kafka.KafkaTopicDataImporter.run(KafkaTopicDataImporter.scala)
com.exasol.ExaWrapper.run(ExaWrapper.java:196)
```
If I also add the SCHEMA_REGISTRY_URL parameter to the IMPORT statement itself, it works, so a workaround is available.
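For anyone hitting the same issue, this is what the workaround looks like: the same IMPORT as above, with the registry URL repeated as a statement parameter (substitute your own `<url>`):

```sql
IMPORT INTO my_schema.my_table (json_doc_col, kafka_partition, kafka_offset)
FROM SCRIPT ETL.KAFKA_CONSUMER WITH
  CONNECTION_NAME     = 'MY_CONNECTION'
  TOPIC_NAME          = 'my.topic'
  TABLE_NAME          = 'MY_SCHEMA.MY_TABLE'
  AS_JSON_DOC         = 'true'
  SCHEMA_REGISTRY_URL = '<url>'
;
```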