DelNov / T-Flows

Program for Simulation of Turbulent Flows

Multiple domains #194

Closed Niceno closed 4 years ago

Niceno commented 4 years ago

With this pull request, I am trying to incorporate changes pertinent to the handling of multiple domains (which also implies multiple materials). The proposed changes did not conflict at all with the latest development_branch, which is great news, but there are a few downsides.

  1. All *.neu.gz files were missing from the development_branch and I had to reintroduce them. I am not sure how they went missing; perhaps a push was made while they were absent from their test directories.

  2. While running the tests, I saw the following warnings:

    
    ./test_build.sh: line 1180: [: -eq: unary operator expected                     
    ./test_build.sh: line 1183: [: -eq: unary operator expected                        
    ./test_build.sh: line 1186: [: -eq: unary operator expected                        
    ./test_build.sh: line 1191: [: -eq: unary operator expected                        
    ./test_build.sh: line 1192: [: -eq: unary operator expected                        
    ./test_build.sh: line 1198: [: -eq: unary operator expected                        
    ./test_build.sh: line 1204: [: -eq: unary operator expected                        
    ./test_build.sh: line 1210: [: -eq: unary operator expected                        
    ./test_build.sh: line 1213: [: -eq: unary operator expected                        
    ./test_build.sh: line 1223: [: -eq: unary operator expected
but I couldn't fix them because I didn't write these lines in the first place :-(
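For what it is worth, this particular warning usually appears when the variable on the left of `-eq` expands to nothing, so `[` sees only `-eq 0`. A minimal sketch of the problem and one common fix, using a hypothetical variable name (the actual variables live around lines 1180-1223 of `test_build.sh`):

```shell
#!/bin/bash
# Hypothetical reproduction: when 'exit_code' is unset or empty, the unquoted
# expansion leaves '[' with nothing on the left of -eq, which produces
# "[: -eq: unary operator expected".
exit_code=""
if [ $exit_code -eq 0 ] 2>/dev/null; then
  msg="passed"
fi

# One common fix: quote the expansion and supply a default, so '[' always
# sees exactly one operand on each side of -eq.
if [ "${exit_code:-1}" -eq 0 ]; then
  msg="passed"
else
  msg="failed or not set"
fi
echo "$msg"
```

Quoting alone (`[ "$exit_code" -eq 0 ]`) is not enough when the variable can be empty, since `[` then complains about a non-numeric operand instead; the `:-` default covers both cases.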

  3. Most troublesome of all, test 5 (Processor backup tests) fails with the message:
     *** An error occurred in MPI_Comm_create_keyval
     *** reported by process [1672806401,0]
     *** on communicator MPI_COMM_WORLD
     *** MPI_ERR_ARG: invalid argument of some other kind
     *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
     ***    and potentially your MPI job)

when trying to create a CGNS file with results at the beginning of the simulation.

Any help with resolving these issues would be highly appreciated.

P.S. The most important changes for handling multiple materials (domains) took place in Main_Pro.f90, where all occurrences of `grid`, `flow`, `turb`, etc. are replaced with arrays `grid(MD)`, `flow(MD)`, `turb(MD)`, where `MD` is a constant standing for the **M**ax number of **D**omains.  In addition, a new module called `Interface_Mod` has been created to handle interfaces between different domains.  To allow different turbulence models in different domains, variables in `Turb_Mod` such as `turbulence_model` and `turbulent_heat_flux` have been replaced with `turb % model`, `turb % heat_flux`, etc., so that they can differ in each domain.  Demonstrations of multiple-domain simulations can be found in `Test\Laminar\Heat_Exchanger` and `Test\Membrane`.
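The restructuring described above can be sketched roughly as follows; the type and variable names below (`Turb_Type`, `n_dom`, the model strings) are illustrative placeholders, not the actual T-Flows declarations:

```fortran
! Hypothetical sketch of the array-of-domains idea: instead of one global
! 'turbulence_model', each domain carries its own turbulence state.
program multi_domain_sketch

  implicit none

  integer, parameter :: MD = 4      ! Max number of Domains, as in the PR

  type Turb_Type
    character(len=80) :: model      ! replaces the global 'turbulence_model'
  end type

  type(Turb_Type) :: turb(MD)       ! one turbulence state per domain
  integer         :: d, n_dom

  n_dom = 2                         ! number of domains actually in use

  ! Each domain can now be assigned a different turbulence model:
  turb(1) % model = 'K_EPS'
  turb(2) % model = 'LES_SMAGORINSKY'

  do d = 1, n_dom
    print *, 'Domain', d, 'runs model: ', trim(turb(d) % model)
  end do

end program
```

The main loop in Main_Pro.f90 would then iterate over `d = 1, n_dom` in the same way, advancing each domain's `flow(d)` and `turb(d)` and exchanging boundary data through `Interface_Mod` between iterations.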