-
This fix solves the CLOB problem, but BLOB columns still report the same error.
![image](https://user-images.githubusercontent.com/13796957/96397611-68a4e580-11fc-11eb-91af-77674b329074.png)
2020-1…
-
Hi
I'm getting this error when trying to sink topic data to Microsoft SQL Server. Any ideas on what to try to work around this, please?
io.confluent.connect.jdbc.sink.JdbcSinkTask put - Write of 500 records…
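When a whole batch fails like this, it can help to compare against a minimal working sink configuration and to turn on error logging so the worker log shows the underlying SQLException for each failed batch. The following is a sketch only: the connector name, topic, host, and credentials are placeholders, and the `connection.url` assumes the Microsoft JDBC driver is available on the plugin path.

```
{
  "name": "mssql-sink-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "my-topic",
    "connection.url": "jdbc:sqlserver://mssql-host:1433;databaseName=mydb",
    "connection.user": "connect_user",
    "connection.password": "connect_password",
    "auto.create": "true",
    "insert.mode": "insert",
    "pk.mode": "none",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true"
  }
}
```

With `errors.log.enable=true`, the log usually narrows a "Write of 500 records failed" down to a schema, permission, or connectivity problem.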
-
Hi,
While I am trying to load data using bulk mode, tables are not getting created, and it throws the following error. I was using the following property:
table.name.format=${topic}
WARN Write of 201 r…
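Assuming this is the JDBC sink (`table.name.format` and automatic table creation are sink-side settings), a minimal sketch with placeholder names and URL looks like this; note that the connector only creates missing tables when `auto.create` is enabled:

```
name=my-jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=my_topic
connection.url=jdbc:mysql://localhost:3306/mydb
table.name.format=${topic}
auto.create=true
auto.evolve=true
```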
-
When creating a JDBC Sink connector using postgres with the following config
```
{
"name":"pg-sink-connector",
"config":{
"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
…
```
-
> Here is the connector configuration.
```
name=tx_user_nw_source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/?user=user&password=password…
```
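As a side note, credentials do not have to be embedded in the URL: the JDBC connector also accepts them as dedicated properties, which keeps the URL clean and avoids escaping issues. A sketch with the same placeholder values:

```
connection.url=jdbc:mysql://localhost:3306/
connection.user=user
connection.password=password
```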
-
Following on from https://github.com/confluentinc/ksql/issues/2250, there seems to be a problem with the JDBC Sink connector.
The topic's data is written from KSQL, with an Avro value and binary k…
rmoff updated 5 years ago
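Since the key here is binary rather than Avro, one common workaround is to override the converters on the connector so only the value goes through Avro deserialization. This is a sketch: the converter choice depends on what the key bytes actually are, and the Schema Registry URL is a placeholder.

```
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```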
-
### Describe the bug
Hey team, we are using ClickHouse, [JDBC](https://www.confluent.io/hub/confluentinc/kafka-connect-jdbc) and [BigQuery](https://www.confluent.io/hub/wepay/kafka-connect-bigquery) …
-
My MongoDB source configuration is:
```
{
"name": "MongoDbSourceConnector",
"config": {
"connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
"tasks.max": "1",
"ke…
```
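For comparison, a minimal Debezium MongoDB source configuration might look like the sketch below. The host and prefix are placeholders, and the property names vary by Debezium version (recent releases use `mongodb.connection.string` and `topic.prefix`; older ones use `mongodb.hosts` and `mongodb.name`), so check the docs for the version actually deployed.

```
{
  "name": "MongoDbSourceConnector",
  "config": {
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "tasks.max": "1",
    "mongodb.connection.string": "mongodb://mongo-host:27017/?replicaSet=rs0",
    "topic.prefix": "mongo"
  }
}
```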
-
Hello all,
I'm trying to transfer my data from MSSQL to Apache Kafka. To do that, I created a docker-compose file to deploy ksqlDB to my virtual machine.
I also configured connect.propertie…
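For reference, a minimal standalone `connect.properties` typically covers the broker address, the converters, and where to keep offsets. This is only a sketch with placeholder host names and paths, not the poster's actual file:

```
bootstrap.servers=kafka-broker:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=/usr/share/java
```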
-
I am getting the following issue while adding a new MySQL connector in Kafka Confluent. The log details are as follows:
The last packet sent successfully to the server was 0 milliseconds ago. The driv…
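"The last packet sent successfully to the server was 0 milliseconds ago" is MySQL's generic communications-link failure, which usually means the Connect worker could not reach the database at all, so network reachability from the worker to the MySQL host is worth checking first. The connector's built-in retry settings can also smooth over transient failures; these are real kafka-connect-jdbc properties, but the values below are only illustrative:

```
connection.attempts=5
connection.backoff.ms=10000
```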