Read into the pbac implementation by jgoltz, which uses a proxy-based setup to enforce the access control policies. Unfortunately, the repository is undocumented and rather fragmented, so extracting anything useful for our project will take more effort.
Unfortunately, the Envoy implementation currently only supports collecting metrics, not modifying messages.
Another option for modifying messages before they reach a Kafka broker seems to be a connector implementation leveraging the Kafka Connect framework. This way we could build a 'preprocessing' step into our producer, possibly filtering / transforming our messages.
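Kafka Connect supports this kind of preprocessing through Single Message Transforms (SMTs) configured on a connector. As a sketch of what such a filtering step could look like (the transform/predicate names `dropHeartbeats` / `isHeartbeat` and the topic pattern are made up for illustration; `Filter` and `TopicNameMatches` are built-in classes):

```properties
# Drop all records whose topic matches the predicate, before they leave the connector
transforms=dropHeartbeats
transforms.dropHeartbeats.type=org.apache.kafka.connect.transforms.Filter
transforms.dropHeartbeats.predicate=isHeartbeat

predicates=isHeartbeat
predicates.isHeartbeat.type=org.apache.kafka.connect.transforms.predicates.TopicNameMatches
predicates.isHeartbeat.pattern=heartbeats.*
```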
I tried to follow the setup guides from Confluent for a Docker-based Kafka Connect instance, but only ran into problems which I could not properly debug.
I experimented with various minimal versions of the cp-based container images, but could not yet get the Connect instance to work.
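For future reference, a minimal single-node Connect worker in docker-compose roughly needs the following (this is a sketch, not my exact setup; the image tag, the `broker` hostname, and the topic names are assumptions — the replication factors of 1 are what tripped me up, since the defaults of 3 fail on a single broker):

```yaml
connect:
  image: confluentinc/cp-kafka-connect:7.4.0
  depends_on:
    - broker
  ports:
    - "8083:8083"
  environment:
    CONNECT_BOOTSTRAP_SERVERS: broker:9092
    CONNECT_REST_ADVERTISED_HOST_NAME: connect
    CONNECT_GROUP_ID: connect-cluster
    # internal topics the worker stores its state in
    CONNECT_CONFIG_STORAGE_TOPIC: _connect-configs
    CONNECT_OFFSET_STORAGE_TOPIC: _connect-offsets
    CONNECT_STATUS_STORAGE_TOPIC: _connect-status
    # must be 1 on a single-broker dev cluster
    CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
    CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.storage.StringConverter
```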
Many hours later, after learning some details and incompatibilities of the open-core structure of the Confluent products, I finally managed to get a connector running in Docker, reading from a file and modifying the content before passing it to the Kafka cluster. Phew.
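A connector config along these lines does the file-reading and modification (a sketch, not my exact config: the connector name, file path, topic, and the `HoistField` transform wrapping each line into a keyed map are illustrative):

```json
{
  "name": "file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/input.txt",
    "topic": "file-input",
    "transforms": "wrap",
    "transforms.wrap.type": "org.apache.kafka.connect.transforms.HoistField$Value",
    "transforms.wrap.field": "line"
  }
}
```

Registered against the Connect worker's REST API, e.g. `curl -X POST -H "Content-Type: application/json" -d @file-source.json http://localhost:8083/connectors`.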
Debugging the docker-compose setup: when started up for the first time, the output-streams topic can't be used by the streams application.
I suspected a startup race condition here, as the broker is configured to allow auto-creation of new topics and the problem should therefore not come up - in theory.
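One way to take auto-creation (and the race) out of the equation would be a small init service that creates the topic explicitly before the streams application starts; a sketch, assuming a `broker` service on port 9092 (image tag, topic name, and partition count are illustrative):

```yaml
init-topics:
  image: confluentinc/cp-kafka:7.4.0
  depends_on:
    - broker
  # create the topic idempotently, then exit
  command: >
    bash -c "kafka-topics --bootstrap-server broker:9092 --create --if-not-exists
    --topic output-streams --partitions 1 --replication-factor 1"
```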
This is the development diary for the ADSP Project by @overflw.