Open mrcabellom opened 7 years ago
Hello,

Can I have two CNTK data-parallel trainer classes in the same Python process? CNTK throws an error: "Data parallel learning is a singleton class".

Regards

What is your goal? CNTK uses MPI as its inter-process communication channel, so two instances of the distributed learner would conflict with each other.

Thanks for your response.

My goal is to run two different learners with different models in the same Python process: finish one data-parallel learning scenario and then launch a new one.
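For reference, below is a minimal sketch (not taken from the thread) of the one-distributed-learner-per-process pattern with CNTK's Python API. The `model`, `criterion`, `source`, and `input_map` objects are hypothetical placeholders, and the training loop is simplified.

```python
import cntk as C
from cntk.train.distributed import data_parallel_distributed_learner, Communicator

def train_distributed(model, criterion, source, input_map, minibatches=100):
    """Train one model with a single data-parallel distributed learner.

    criterion is a (loss, metric) pair of CNTK functions.
    """
    lr = C.learning_rate_schedule(0.01, C.UnitType.minibatch)
    dist_learner = data_parallel_distributed_learner(
        C.sgd(model.parameters, lr),   # local learner to wrap
        num_quantization_bits=32)      # plain data-parallel SGD, no 1-bit quantization
    trainer = C.Trainer(model, criterion, [dist_learner])

    for _ in range(minibatches):
        # Each MPI worker reads its own partition of the data.
        mb = source.next_minibatch(64, input_map=input_map,
                                   num_data_partitions=Communicator.num_workers(),
                                   partition_index=Communicator.rank())
        trainer.train_minibatch(mb)
    return trainer

# Per the thread, creating a second data_parallel_distributed_learner in the same
# process raises "Data parallel learning is a singleton class": MPI is set up once
# per process, so the usual pattern is one distributed training job per process,
# launched separately (e.g. with mpiexec), with Communicator.finalize() called
# before each process exits.
```

In other words, to run two data-parallel scenarios back to back, the usual workaround is to launch each one as its own process rather than reusing the same Python process.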