Open Krith-man opened 1 year ago

Hello, this repo is very helpful, but it is 4 years old. Is embedding the model still the recommended way to do ML inference with Kafka Streams?

Yes. Absolutely.

Embedding an analytic model is the appropriate way to do reliable model scoring with low latency.

Some model servers also add native Kafka interfaces (see, e.g., https://www.kai-waehner.de/blog/2020/10/27/streaming-machine-learning-kafka-native-model-server-deployment-rpc-embedded-streams/). This is another good option for some use cases, but it is not as robust or as fast as an embedded model.
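For context, the embedded-model pattern discussed in this thread means the model lives inside the stream-processing JVM, so scoring is a local method call rather than an RPC to a model server. A minimal sketch of that idea, using a hypothetical hand-rolled `LogisticScorer` class (not from any real repo or library) standing in for a real model:

```java
// Sketch of the "embedded model" pattern: the model is an in-process object,
// so scoring a record is a plain method call with no network hop.
// LogisticScorer and its weights are hypothetical illustration only; in a
// Kafka Streams topology you would invoke scorer.score(...) inside a
// stateless operation such as stream.mapValues(...).
public class EmbeddedModelSketch {

    /** A tiny logistic-regression scorer standing in for a real trained model. */
    static final class LogisticScorer {
        private final double[] weights;
        private final double bias;

        LogisticScorer(double[] weights, double bias) {
            this.weights = weights;
            this.bias = bias;
        }

        /** Pure, in-process scoring: dot product plus sigmoid, no RPC. */
        double score(double[] features) {
            double z = bias;
            for (int i = 0; i < weights.length; i++) {
                z += weights[i] * features[i];
            }
            return 1.0 / (1.0 + Math.exp(-z)); // sigmoid
        }
    }

    public static void main(String[] args) {
        LogisticScorer scorer = new LogisticScorer(new double[] {0.8, -0.4}, 0.1);
        // In Kafka Streams this call would sit inside
        // stream.mapValues(value -> scorer.score(extractFeatures(value))).
        double p = scorer.score(new double[] {1.0, 2.0});
        System.out.println(p > 0.5 ? "positive" : "negative"); // prints "positive"
    }
}
```

Because the scorer is just an object on the heap, per-record latency is bounded by the model's compute, not by network round trips; the trade-off is that the model must be loaded (and redeployed) with the streams application itself.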