Closed davidscottturner closed 4 years ago
Calling #get on the ValueProvider resolves the value, so if you do this at pipeline construction time, as you're doing above, you get the errors.
Only transforms / IO connectors whose public API takes in ValueProvider support having those parameters supplied at template execution time.
See https://stackoverflow.com/questions/43992120/valueprovider-issue for more details.
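To make the timing issue concrete, here is a minimal, self-contained sketch of why calling get() during pipeline construction fails for a template. The ValueProvider and RuntimeValueProvider types below are simplified stand-ins written for illustration, not Beam's real org.apache.beam.sdk.options classes:

```java
// Simplified stand-in for Beam's ValueProvider, to illustrate resolution timing.
interface ValueProvider<T> {
    T get();                 // resolves the value; for templates, only safe at run time
    boolean isAccessible();  // true once the runtime value has been supplied
}

// A runtime-style provider: the value does not exist at template construction time.
class RuntimeValueProvider<T> implements ValueProvider<T> {
    private T value;                          // populated when the template is launched
    void setRuntimeValue(T v) { value = v; }
    public boolean isAccessible() { return value != null; }
    public T get() {
        if (!isAccessible()) {
            throw new IllegalStateException("Value only available at run time");
        }
        return value;
    }
}

public class ValueProviderTiming {
    public static void main(String[] args) {
        RuntimeValueProvider<String> broker = new RuntimeValueProvider<>();

        // Construction time: get() fails, which is the class of error reported above.
        try {
            broker.get();
        } catch (IllegalStateException e) {
            System.out.println("construction-time get(): " + e.getMessage());
        }

        // Launch time: the runner supplies the value, and get() succeeds.
        broker.setRuntimeValue("kafka-broker:9092");
        System.out.println("runtime get(): " + broker.get());
    }
}
```

The point is that a transform can safely hold a ValueProvider at construction time, but must not resolve it until the pipeline executes.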
Thanks for the answer, and I see you closed it. Unfortunately the answer doesn't really help me, because the KafkaIO I'm using is the Apache Beam KafkaIO (https://github.com/apache/beam/tree/master/sdks/java/io/kafka). Are you saying that because its API doesn't support ValueProvider, I am unable to overcome this issue? Should I log an issue there?
https://issues.apache.org/jira/browse/BEAM-3925 already exists.
You cannot overcome this issue unless support for ValueProvider is implemented in KafkaIO.
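For what adding that support would involve: the key pattern is that a template-compatible transform stores the ValueProvider at construction and only calls get() when the pipeline actually executes. A rough, self-contained sketch of the idea follows; KafkaReadSketch and the nested ValueProvider interface are illustrative stand-ins, not Beam's real API:

```java
import java.util.ArrayList;
import java.util.List;

public class DeferredConfigSketch {
    // Stand-in for a Beam ValueProvider (hypothetical; the real one is in Beam's SDK).
    interface ValueProvider<T> { T get(); }

    // A transform that accepts the provider at construction but never resolves it there.
    static class KafkaReadSketch {
        private final ValueProvider<String> topic;   // held, not resolved
        KafkaReadSketch(ValueProvider<String> topic) { this.topic = topic; }

        List<String> run() {
            // get() is called only here, at execution time, never in the constructor.
            String resolved = topic.get();
            List<String> out = new ArrayList<>();
            out.add("reading from " + resolved);
            return out;
        }
    }

    public static void main(String[] args) {
        // Construction time: the lambda is not invoked, so no value is needed yet.
        KafkaReadSketch read =
            new KafkaReadSketch(() -> System.getProperty("topic", "orders"));
        // Execution time: only now is the value resolved.
        System.out.println(read.run().get(0));
    }
}
```

A transform written this way can be baked into a template with the configuration still unresolved, which is exactly what KafkaIO's then-current API did not allow.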
Thanks Luke, appreciate the help on this.
Hi,
I am fairly new to Dataflow and tried to upload a streaming job template. I am having problems executing a Dataflow template with RuntimeValueProvider.
NOTE - This all worked with hardcoded values as a test
My code looks as follows:
Options:
Job:
This seems correct according to the Dataflow documentation. I'm also using Kotlin for this. It all ran with hardcoded values before, so I know the execution code works fine.
I then build the project and execute it with Gradle:
build.gradle.kts:
I then use
gradle clean build
followed by:
gradle execute -DmainClass="com.example.orders.OrderKafkaToElasticsearchPipeline" -Dexec.args="--runner=DataflowRunner --templateLocation=gs://<PATH>/OrderKafkaToElasticsearchPipeline --project=<PROJECT> --tempLocation=gs://<PATH>/temp" -Pdataflow-runner --info
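For context on where runtime values would eventually come from: parameters backed by ValueProvider are supplied when the template is launched, not when it is built. With the gcloud CLI that looks roughly like the following (the job name and parameter names here are placeholder assumptions, not taken from this project):

```
gcloud dataflow jobs run order-kafka-to-es \
    --gcs-location=gs://<PATH>/OrderKafkaToElasticsearchPipeline \
    --parameters=bootstrapServers=<BROKER>,topic=<TOPIC>
```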
Before trying to use the RuntimeValueProvider this all worked fine and the template was uploaded. Now that I'm trying to get the Kafka configuration passed in, I get the following error:
Any help or suggestions would be very welcome as to where I am going wrong.