twashing opened this issue 6 years ago
Also, is there an example of multiple calls to submit-job for separate but overlapping workflows (core.async or Kafka)? That is: i) an input core.async channel (or Kafka topic) feeds an Onyx processing function, which ii) outputs to a channel (or topic), which iii) then serves as the input to a downstream Onyx processing function.
I'm trying and failing (with both Kafka and core.async), and I don't see any examples here. The examples show only one output path in the workflow, e.g. :inputA -> :processB -> :outputC, where the node in the middle is just a processing function.
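For what it's worth, a chained setup like the one described might be sketched as below. This is a hypothetical outline, not a known-working configuration: all task names, the intermediate topic, and the catalogs/lifecycles are assumptions; only `onyx.api/submit-job` and the balanced task scheduler are from Onyx's documented API.

```clojure
;; Hypothetical sketch: two separate jobs, each submitted with its own
;; call to submit-job, chained through one shared Kafka topic.

;; Job 1: reads a :commands topic, writes an :events topic.
(def workflow-1
  [[:read-commands :process-commands]
   [:process-commands :write-events]])

;; Job 2: reads that same :events topic as its input.
(def workflow-2
  [[:read-events :process-events]
   [:process-events :write-results]])

;; Two independent submissions (catalogs/lifecycles elided and assumed):
;; (onyx.api/submit-job peer-config
;;   {:workflow workflow-1 :catalog catalog-1 :lifecycles lifecycles-1
;;    :task-scheduler :onyx.task-scheduler/balanced})
;; (onyx.api/submit-job peer-config
;;   {:workflow workflow-2 :catalog catalog-2 :lifecycles lifecycles-2
;;    :task-scheduler :onyx.task-scheduler/balanced})
```

The overlap lives entirely in Kafka: job 1's output task and job 2's input task would simply name the same topic in their catalog entries.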
Hey there,
I'm playing with a recent onyx-kafka plugin ("0.10.0.0-SNAPSHOT"). For some reason, I can't get Onyx to write to an output Kafka topic. The code is in this git sample project, in the cloud-orchestration-2 branch. I'm using the very simple workflow below, where :process-commands is just Clojure's identity function. My setup uses docker-compose; a simple workflow to stand everything up might look like the following.
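As a point of comparison, a minimal workflow plus Kafka output catalog entry might look like the sketch below. Task names, topic name, hosts, and the serializer are assumptions; the plugin keyword `:onyx.plugin.kafka/write-messages` and the `:kafka/*` keys are from the onyx-kafka plugin's documented catalog format.

```clojure
;; Hypothetical minimal workflow: Kafka in -> identity fn -> Kafka out.
(def workflow
  [[:read-commands :process-commands]
   [:process-commands :write-commands]])

(defn process-commands [segment]
  ;; identity: pass each segment through unchanged
  segment)

;; Catalog entry for the Kafka output task (onyx-kafka style; values assumed).
(def write-commands-entry
  {:onyx/name :write-commands
   :onyx/plugin :onyx.plugin.kafka/write-messages
   :onyx/type :output
   :onyx/medium :kafka
   :kafka/topic "commands-output"        ;; assumed topic name
   :kafka/zookeeper "zookeeper:2181"     ;; assumed docker-compose hostname
   :kafka/serializer-fn :my.ns/serialize ;; hypothetical serializer fn
   :onyx/batch-size 10
   :onyx/doc "Writes segments to a Kafka topic"})
```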
Now, the docker service logs for app and kafka look fine, but the zookeeper logs keep giving me errors like KeeperErrorCode = NoNode for /onyx/dev/... when Onyx tries to write to Kafka. Googling this gave me some leads, but no answers. Producing to and consuming from the topics using plain Kafka tools works fine. I've set the number of peers for this job to 1, since there's only one machine running the job, but somehow I must have configured Onyx incorrectly.
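One thing that may be worth double-checking (an assumption about the setup, not a diagnosis): by default Onyx dedicates one virtual peer to each task, so a three-task workflow like input -> process -> output needs at least three virtual peers even on a single machine. Starting them might look roughly like this, using Onyx's documented `start-peer-group`/`start-peers` API:

```clojure
;; A 3-task workflow needs at least 3 virtual peers by default,
;; even when everything runs on one machine.
(def n-peers 3) ;; >= number of tasks in the workflow

(def peer-group (onyx.api/start-peer-group peer-config))
(def peers (onyx.api/start-peers n-peers peer-group))
```

With fewer peers than tasks, the job sits unscheduled rather than failing loudly, which can look a lot like "nothing is written to the output topic."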
I feel I've overlooked some small detail that I can't put my finger on, probably something obvious to someone who sees it. Any ideas? Zookeeper's full error log is in this pastebin link.