sanjuthomas / kafka-connect-gcp-bigtable

Kafka Sink Connect to GCP Bigtable - https://www.confluent.io/hub/sanjuthomas/kafka-connect-gcp-bigtable
http://sanjuthomas.com
MIT License


Kafka Sink Connect Google Cloud (GCP) Bigtable

This is a sink-only Kafka Connect connector that streams messages from Apache Kafka to Google Cloud Platform (GCP) Bigtable, a wide-column store.

What is Apache Kafka

Apache Kafka is an open-source stream-processing platform developed by the Apache Software Foundation and written in Scala and Java. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds. Please see the Apache Kafka home page for more details.

What is Google Cloud Bigtable

Bigtable is a compressed, high-performance, proprietary data storage system built on Google File System, Chubby Lock Service, SSTable, and a few other Google technologies. On May 6, 2015, a public version of Bigtable was made available as a service in the Google Cloud Platform. For more details, please refer to the GCP Bigtable home page.

High Level Architecture

This project uses the bigtable-client-core library (no HBase dependency) to stream data to GCP Bigtable. bigtable-client-core internally uses the gRPC framework to talk to GCP Bigtable.

[Architecture diagram: Kafka Connect GCP Bigtable]
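To illustrate the write path, the sketch below shows a single-cell write through bigtable-client-core 1.x, which is roughly the kind of call the connector issues per sink record. It is a minimal sketch, not the connector's actual code; the project, instance, table, column family, qualifier, and row key values are placeholders.

```java
import com.google.bigtable.v2.MutateRowRequest;
import com.google.bigtable.v2.Mutation;
import com.google.cloud.bigtable.config.BigtableOptions;
import com.google.cloud.bigtable.grpc.BigtableDataClient;
import com.google.cloud.bigtable.grpc.BigtableSession;
import com.google.protobuf.ByteString;

public class BigtableWriteSketch {

  public static void main(String[] args) throws Exception {
    // Placeholder project and instance ids.
    BigtableOptions options = BigtableOptions.builder()
        .setProjectId("my-gcp-project")
        .setInstanceId("my-bigtable-instance")
        .build();

    // BigtableSession opens gRPC channels to the Bigtable service.
    try (BigtableSession session = new BigtableSession(options)) {
      BigtableDataClient client = session.getDataClient();

      // A single SetCell mutation: column family "data", qualifier "message".
      Mutation setCell = Mutation.newBuilder()
          .setSetCell(Mutation.SetCell.newBuilder()
              .setFamilyName("data")
              .setColumnQualifier(ByteString.copyFromUtf8("message"))
              .setTimestampMicros(System.currentTimeMillis() * 1000)
              .setValue(ByteString.copyFromUtf8("hello from kafka")))
          .build();

      // Fully qualified table name: projects/<project>/instances/<instance>/tables/<table>.
      MutateRowRequest request = MutateRowRequest.newBuilder()
          .setTableName("projects/my-gcp-project/instances/my-bigtable-instance/tables/demo-table")
          .setRowKey(ByteString.copyFromUtf8("row-key-1"))
          .addMutations(setCell)
          .build();

      client.mutateRow(request);
    }
  }
}
```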

Prerequisites

You need Apache ZooKeeper and Apache Kafka installed and running on your machine. Please refer to the respective project sites to download and start ZooKeeper and Kafka. You also need Java 11 or above.
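For a local setup, the standard Kafka quickstart commands are enough to bring up ZooKeeper and a broker; run each in its own terminal from the Kafka installation directory.

```sh
# Start ZooKeeper, then the Kafka broker, using the configs shipped with Kafka.
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
```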

Tested Software Versions

| Software | Version | Note |
| --- | --- | --- |
| Java | 11 | Tested using Java 11. |
| Kafka | 3.3.1 | Tested using kafka_2.13-3.3.1; should work with older versions. |
| bigtable-client-core | 1.27.1 | |
| Kafka connect-api | 3.3.1 | |
| grpc-netty-shaded | 1.51.0 | |

Configurations

Please refer to the project Wiki.
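As a rough orientation, the connector is registered with the standard Kafka Connect sink properties. The connector.class value and file name below are illustrative placeholders, so take the exact keys and class name from the Wiki.

```properties
# bigtable-sink.properties (illustrative; see the Wiki for the authoritative keys)
name=bigtable-sink
connector.class=com.sanjuthomas.kafka.connect.gcp.BigtableSinkConnector
tasks.max=1
topics=demo-topic
```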

Constraints

The current configuration system supports streaming messages from a given topic to a single table. You can subscribe to any number of topics, but each topic maps to one and only one table. For example, if you subscribe to a topic named demo-topic, you should have a YAML file named demo-topic.yml; that file contains all the configuration required to transform and write the data into Bigtable.
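A hypothetical demo-topic.yml is sketched below only to show the idea of a per-topic mapping file; the key names are illustrative, not the connector's real schema, so use the Wiki for the actual keys.

```yml
# demo-topic.yml - hypothetical sketch; the actual keys are documented in the Wiki.
table: demo-table        # destination Bigtable table for this topic
keyQualifiers:           # message fields used to build the row key
  - id
families:                # column family the remaining fields are written to
  - data
```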

How to build the artifact

Please refer to the project Wiki.
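The exact build steps are in the Wiki; since the artifact is published to Maven Central, a plain Maven build along these lines is the likely path.

```sh
# Build the connector jar (assumes Maven and Java 11 on the path).
mvn clean package
```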

How to deploy the connector

Please refer to the project Wiki.
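The Wiki has the authoritative steps; in general, deploying a Kafka Connect plugin means copying the built jar into a directory listed in the worker's plugin.path. The paths and jar name below are placeholders.

```sh
# Copy the built jar into a plugin directory (placeholder paths and jar name).
mkdir -p /opt/kafka/plugins/kafka-connect-gcp-bigtable
cp target/kafka-connect-gcp-bigtable-*.jar /opt/kafka/plugins/kafka-connect-gcp-bigtable/

# Then point the Connect worker at the plugin directory, e.g. in the worker properties:
# plugin.path=/opt/kafka/plugins
```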

How to start the connector in stand-alone mode

Please refer to the project Wiki.
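The Wiki covers this in detail; the standard Kafka Connect stand-alone launch looks like the following, where bigtable-sink.properties is the connector properties file sketched above (the file name is a placeholder).

```sh
# Start a stand-alone Connect worker with the Bigtable sink connector.
bin/connect-standalone.sh config/connect-standalone.properties bigtable-sink.properties
```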

Questions

Either create an issue in this project or send an email to bt@sanju.org. Thanks!

License

This project is licensed under the MIT License.