huilinye / OpenFSI

A highly efficient and portable fluid-structure simulation package based on the immersed-boundary method
GNU General Public License v3.0

Problem with MPI #4

Open · josepedro opened this issue 2 years ago

josepedro commented 2 years ago

Hello,

I would like to know if there is a workaround or comments that could help concerning this problem:

```
./palammps3D in.adsphere > log.file
The MPI_Finalize() function was called after MPI_FINALIZE was invoked.
This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):193940] Local abort after MPI_FINALIZE completed successfully; not able
to aggregate error messages, and not able to guarantee that all other processes
were killed!
```

Thank you in advance!

JLheu commented 1 year ago


Hello, I have the same problem. Have you solved it? Can you give me any suggestions? Thank you!

liuyangggg commented 1 year ago


Have you figured it out? Do you know what the problem is?

JLheu commented 1 year ago

Hello, I have tried many methods, but in the end they all failed. Perhaps this answer can give you some inspiration: https://github.com/TJFord/palabos-lammps/issues/1


liuyangggg commented 1 year ago


OK, thank you for the tip.

josepedro commented 10 months ago

`./embolism in.adsphere` worked well; if anyone is interested, just send me a message.

josepedro commented 10 months ago

It is important to use the LAMMPS version lammps-30Jul2016.

logagnosia commented 1 month ago

I have been using the version lammps-30Jul2016, and the case simulations ran well under `mpirun`. However, I still cannot run the OpenFSI-related cases: when I launch the executable with `mpirun`, it occupies the CPU cores but never makes progress on the calculation. I would greatly appreciate any suggestions you could provide.