apla / dataflo.ws

workflow processing for javascript

Clustered running of dataflows #33

Open behrad opened 11 years ago

behrad commented 11 years ago

I am running multiple dataflows processes, e.g.

    dataflows daemon core
    dataflows daemon redis-events
    dataflows daemon listeners2

But I cannot control which initiators (or how many of each) run under each command, since daemons reference initiators by type.
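To give some context, our current definitions look roughly like this. Treat it as a simplified, illustrative sketch rather than the exact dataflo.ws schema; the point is only that a daemon can name an initiator type, not a specific group of workflows:

"initiator": {
    "http": {
        "workflows": [
            {1}, {2}, {3}, {4}, {5}, {6}
        ]
    }
}

"daemon": {
    "test": {
        "initiator": [ "http" ]
    },
    "test2": {
        "initiator": [ "http" ]
    }
}

So both test and test2 bring up the same six workflows; there is no way to split them between the two processes.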

To make this possible:

1) Can we add multiple etc/project files in dataflo.ws? (e.g. dataflows project2 daemon test)
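On disk that could look like this (project2 is just an illustrative name, and passing it as an extra argument is part of this proposal, not something dataflows accepts today):

    etc/project      # current config:    dataflows daemon core
    etc/project2     # additional config: dataflows project2 daemon test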

OR

2) Is it easier to change the project JSON definition so that I can use the same initiator type in multiple daemons? Something like this:


"daemon": {
        "test": {
            "initiator": [ "cluster1" ]
        },
        "test2": {
            "initiator": [ "cluster2"]
        }
}

"initiator":{
    "cluster1": {
        "type": "http",
        "workflows":[
            {1}, {2}, {3}
    ]},
    "cluster2": {
        "type": "http",
        "workflows":[
            {4}, {5}, {6}
    ]}
}
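Each daemon would then be started as its own process with the usual command, and would only bring up its own initiator group:

    dataflows daemon test     # cluster1 → workflows {1}, {2}, {3}
    dataflows daemon test2    # cluster2 → workflows {4}, {5}, {6}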