tchiotludo / akhq

Kafka GUI for Apache Kafka to manage topics, topics data, consumers group, schema registry, connect and more...
https://akhq.io/
Apache License 2.0

Publish with raw Avro #1077

Open mmatviienko opened 2 years ago

mmatviienko commented 2 years ago

I'm trying to configure Avro deserialisation without a schema registry, as implemented in https://github.com/tchiotludo/akhq/pull/843. However, it does not seem to work. I can see that the avro folder with the schemas is in the container, but in the UI the message is still shown as encoded, and I can't select my schemas when trying to publish to the topic.

Here is my docker-compose:

version: '3.1'

services:
  postgres:
    image: postgres
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: admin
      POSTGRES_DB: contest

  akhq:
    image: tchiotludo/akhq
    volumes:
      - ./src/main/resources/avro:/app/avro_schemas
    depends_on:
      - broker
    environment:
      AKHQ_CONFIGURATION: |
        akhq:
          clients-defaults:
            consumer:
              properties:
                default.api.timeout.ms: 60000
          connections:
            docker-kafka-server:
              properties:
                bootstrap.servers: "broker:29092"
              deserialization:
                avro-raw:
                  schemas-folder: "/app/avro_schemas"
                  topics-mapping:
                    - topic-regex: "backend.*"
                      value-schema-file: "ContestUpdateNotification.avsc"
    ports:
      - '9093:8080'
    links:
      - broker

  zookeeper:
    image: confluentinc/cp-zookeeper:7.0.1
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-kafka:7.0.1
    volumes:
      - ./src/main/resources/avro:/app/avro_schemas
    container_name: broker
    ports:
      # To learn about configuring Kafka for access across networks see
      # https://www.confluent.io/blog/kafka-client-cannot-connect-to-broker-on-aws-on-docker-etc/
      - "9092:9092"
    depends_on:
      - zookeeper
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_INTERNAL:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092,PLAINTEXT_INTERNAL://broker:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1

  init-kafka:
    image: confluentinc/cp-kafka:7.0.1
    depends_on:
      - broker
    entrypoint: [ '/bin/sh', '-c' ]
    command: |
      "
      # blocks until kafka is reachable
      kafka-topics --bootstrap-server broker:29092 --list

      echo -e 'Creating kafka topics'
      kafka-topics --bootstrap-server broker:29092 --create --if-not-exists --topic backend.game_events.v1 --replication-factor 1 --partitions 1

      echo -e 'Successfully created the following topics:'
      kafka-topics --bootstrap-server broker:29092 --list
      "

And here is the UI:

[screenshot of the AKHQ UI]

And here is the akhq container:

[screenshot of the akhq container]

tchiotludo commented 2 years ago

@bgranvea, maybe you could help here?

bgranvea commented 2 years ago

I've tested your docker-compose (I just replaced the schema with my own .avsc) and the message is displayed decoded.

My guess is that you are not producing the message in the "raw Avro" format expected here. See the documentation at https://akhq.io/docs/configuration/avro.html and an example in https://github.com/tchiotludo/akhq/blob/dev/src/test/java/org/akhq/utils/AvroToJsonDeserializerTest.java (the toByteArray method).
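
For illustration, here is a minimal sketch of a producer that writes messages in that raw format: plain Avro binary without the Confluent wire-format envelope (no magic byte, no schema id), sent with a plain byte-array serializer instead of KafkaAvroSerializer. The schema fields (id, status) are hypothetical placeholders; the topic name and broker address are taken from the compose file above.

import java.io.ByteArrayOutputStream;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RawAvroProducer {

    // Hypothetical schema for illustration; the real one would come from ContestUpdateNotification.avsc
    private static final String SCHEMA_JSON = "{"
            + "\"type\":\"record\",\"name\":\"ContestUpdateNotification\","
            + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"},"
            + "{\"name\":\"status\",\"type\":\"string\"}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "42");
        record.put("status", "STARTED");

        // Plain Avro binary encoding: no Confluent magic byte, no schema registry id
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");

        // Send the raw Avro bytes to the topic created by init-kafka in the compose file
        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("backend.game_events.v1", "42", out.toByteArray())).get();
        }
    }
}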

As far as I remember, you can't produce this type of message directly from the UI. "Produce to topic" with a schema selection is a different feature that uses Avro with a schema registry.

tchiotludo commented 2 years ago

Makes sense. PRs are welcome for this, in order to display the raw Avro schemas in the produce drop-down and to publish with raw Avro.