Closed: SestoAle closed this issue 2 months ago
Hi. While we designed Pearl with distributed processing in mind, it is true that this is not yet supported, and probably won't be for some time. We are preparing a public roadmap indicating the next steps for Pearl and will address it there.
Hello,
I was wondering if the framework allows for parallel/distributed environments. The paper says Pearl supports distributed training, but I can't find any examples or documentation on how to do that, not even for running multiple environments on the same machine.
What is Pearl's plan for supporting training with distributed/parallelised environments?
Thanks!