Open ronoaldo opened 1 year ago
I can confirm that the same should have worked for both, but I don't have the experience with Xlang to understand what's going wrong with what you have here.
The logs indicate that both the Go and the Java containers are starting up, but then the workers simply crash/die.
It would also be useful to confirm whether the Python pipeline and the Go pipeline are uploading the same Java artifacts. Dumps of the pipeline protos (which should also be uploaded to the GCS folders) would be valuable as well.
I don't see workitems actually executing in the worker log you provided.
Can you check other logs (for example, kubelet) to see if there are any errors related to pipeline setup?
What happened?
As mentioned in #22931, I am testing xlang support for the Go SDK. I was initially doing it wrong by using the Go direct runner, but then moved on to testing with the Dataflow runner.
I noticed that on the Dataflow Runner, I could call a custom Java PTransform from Python:
However, I could not call the same PTransform from Go. To submit the job, I started the Java expansion service as I did for the Python run, and called my Go pipeline. I can see that the expansion service is called from Go and that the Go runtime stages the .jar files into Cloud Storage, but then the pipeline fails:
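For reference, the Go side of the call looks roughly like the following. This is a minimal sketch, not my exact pipeline: the URN, payload, expansion service address, and output type below are placeholders.

```go
// Sketch of invoking an external (Java) PTransform from the Go SDK via a
// running expansion service. All identifiers marked as placeholders are
// hypothetical, not the real values from my pipeline.
package main

import (
	"context"
	"flag"

	"github.com/apache/beam/sdks/v2/go/pkg/beam"
	"github.com/apache/beam/sdks/v2/go/pkg/beam/core/typex"
	"github.com/apache/beam/sdks/v2/go/pkg/beam/core/util/reflectx"
	"github.com/apache/beam/sdks/v2/go/pkg/beam/x/beamx"
)

func main() {
	flag.Parse()
	beam.Init()

	p := beam.NewPipeline()
	s := p.Root()

	in := beam.Create(s, "a", "b", "c")

	// beam.CrossLanguage contacts the expansion service (started separately,
	// e.g. `java -jar <expansion-service>.jar 8097`) and expands the external
	// transform into the pipeline proto before submission.
	outs := beam.CrossLanguage(s,
		"my:custom:javatransform:v1", // URN registered by the Java service (placeholder)
		nil,                          // configuration payload, if the transform needs one
		"localhost:8097",             // expansion service address (placeholder)
		beam.UnnamedInput(in),
		beam.UnnamedOutput(typex.New(reflectx.String)),
	)
	_ = outs[beam.UnnamedOutputTag()]

	if err := beamx.Run(context.Background(), p); err != nil {
		panic(err)
	}
}
```

Submitting this with `--runner=dataflow` is where I see the jars staged to Cloud Storage before the job fails.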
The only error message available is:
Here are the full pipeline log and worker logs, downloaded from Logs Explorer.
Is this an actual bug or an unsupported workflow? I am trying to follow the Multi-language pipelines section of the Beam Programming Model docs: exposing the Java PTransform (which I assume is done correctly, since calling it from Python works) and consuming it from Go, where I'm not sure whether I missed any important steps.
Issue Priority
Priority: 2
Issue Component
Component: sdk-go