isaactpetersen opened 1 year ago
Hi Isaac,
Thanks for getting in touch and for your suggestion!
Yes, implementing a parallel computing option has been on my to-do list for the package for a while. Unfortunately, since I changed jobs 3 years ago, I haven't had much time to work on the package, but I hope to get to it at some point… It shouldn't be much of a problem; it should mostly be a matter of switching from a for loop to a foreach. The burn-in would not be affected, but it could significantly reduce computing time for the later imputations!
Many thanks, Matteo
From: isaactpetersen | Sent: 12 November 2022 13:20 | To: Matteo21Q/jomo | Subject: [Matteo21Q/jomo] Allow parallelization to speed up imputation (Issue #1)
Thank you very much for your development of the jomo package. Would it be possible to add the ability to fit a jomo model with parallel processing? This would significantly speed up the imputation process. The mice package has this ability with the futuremice() function, and the Amelia package has this ability with the parallel and ncpus arguments.
Thank you for your consideration!
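For reference, the two interfaces mentioned above look roughly like this (the data set `dat` and the core counts are placeholders, not taken from the thread; check each package's documentation for the full argument lists):

```r
# mice: parallel imputation via futuremice()
library(mice)
imp <- futuremice(dat, m = 10, n.core = 4)

# Amelia: parallel imputation via the parallel/ncpus arguments
library(Amelia)
out <- amelia(dat, m = 10, parallel = "snow", ncpus = 4)
```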
Sounds great, thanks Matteo. Much appreciated!
Dear Matteo,
I'm also interested in fitting my jomo model with parallel processing to speed up the process, and I just wanted to ask whether there is any news in this regard.
Any help is much appreciated! Sophie
Dear Sophie,
Thanks for getting in touch! Unfortunately, I haven't had time to do this yet, and I can't foresee it being done in the near future. In the meantime, one thing I can advise is that you can run the imputation in parallel yourself using the .MCMCchain functions, which generate a single imputation and can take the output of the burn-in phase as input. If you wrap them in a foreach loop with the %dorng% operator, you can run them in parallel. This is broadly what I am planning to do in the package, but as I said, I currently just don't have the time. Sorry about that, and I hope you can manage it this way! Let me know if you need any other info.
Matteo
From: sophieschneemelcher | Sent: 10 May 2023 15:07 | To: Matteo21Q/jomo | Subject: Re: [Matteo21Q/jomo] Allow parallelization to speed up imputation (Issue #1)