Closed RuPingCen closed 2 months ago
This is a good idea, but there are a lot of small details that need to be handled, such as how to connect the different dataflows together, how to override parameters, and so on.
I think that something like Docker Compose's merge feature would be great: https://docs.docker.com/compose/multiple-compose-files/merge/
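For readers unfamiliar with the Compose feature referenced above: Docker Compose can layer several files on top of each other, with later files overriding or extending earlier ones. A minimal illustration (file names and service names are arbitrary examples, not from this thread):

```yaml
# compose.yaml (base file)
services:
  app:
    image: my-app:latest
    environment:
      LOG_LEVEL: info

# compose.override.yaml (layered on top of the base)
services:
  app:
    environment:
      LOG_LEVEL: debug   # overrides the value from the base file
```

Running `docker compose -f compose.yaml -f compose.override.yaml up` merges the two files, with `LOG_LEVEL` resolving to `debug`. A similar layering mechanism could be one way to combine or override dora dataflow files.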
Happy to receive contributions on this, but as maintainers, I don't think we have the bandwidth to work on this just yet.
1. How do you pass parameters to a dora node?

You can specify arguments through an `args` key in the dataflow YAML file. For example:
```yaml
nodes:
  - id: some-node
    custom:
      source: xxxx
      args: arg1 arg2
```
1. How do you pass parameters to a dora node? When we start a dora node through dataflow.yml, can we pass parameters to the node? For example: we start a C/C++ executable with `./xxxx arg1 arg2`, and then receive the parameters in the `int main(int argc, char* argv[])` function.
2. Is there any way to embed the dataflow_lidar.yml file in the dataflow1.yml file? We hope to start multiple dora nodes through one dataflow.yml file, but currently all nodes have to be written into a single dataflow.yml file. Is there a mechanism similar to ROS launch files, which can include other launch files?
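To make the request concrete: dora has no such include mechanism today (as the maintainer reply above notes), but a ROS-launch-style embedding feature might hypothetically look like the sketch below. The `include` key is invented for illustration and is not supported by dora:

```yaml
# dataflow1.yml -- HYPOTHETICAL syntax, not valid dora YAML
nodes:
  - id: camera-node
    custom:
      source: ./camera

# hypothetical: pull in all nodes defined in another dataflow file
include:
  - path: dataflow_lidar.yml
```

Something along these lines, combined with Compose-style merge/override rules for conflicting node IDs and parameters, is roughly the feature being asked for.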