comp-physics / RBC3D

3D Spectral boundary integral solver for cell-scale blood flow
MIT License

readme describes installation on a specific computer #42

Closed: sbryngelson closed this issue 2 months ago

sbryngelson commented 3 months ago

The readme should describe how to install the code on any *nix-based computer, or at least a specific generic one like Ubuntu or macOS. A link can be included to Phoenix or other machine-specific install instructions kept in version control.

suzanmanasreh commented 3 months ago

I'm working on this! I've been putting it off because netcdf-fortran has so many manual install steps and other dependencies (netcdf-c, hdf5), and I don't want to go through a package manager to install things because then it's hard to get the library path for the CMake file on everyone's machine. Also, I can usually only run simulations with 100+ cores, so this project requires a supercomputing cluster anyway.

suzanmanasreh commented 3 months ago

Wait, I think the manual install is necessary anyway because netcdf-fortran was removed from the ICE modules, and now my ICE runner is failing because of an unrelated srun error. I haven't been able to get a manual install of netcdf-fortran to work properly, though.

suzanmanasreh commented 2 months ago

I was able to get all the packages to compile on macOS with brew-installed GCC and MPI compilers, but now the common modules don't compile because of MPI argument issues. I think it's best to leave the repo as cluster-only.

make -j 24
[  4%] Building Fortran object common/CMakeFiles/common.dir/ModDataTypes.F90.o
[  4%] Building Fortran object common/CMakeFiles/common.dir/ModPolyRoots.F90.o
[  8%] Building Fortran object common/CMakeFiles/common.dir/ModFFT.F90.o
[  8%] Building Fortran object common/CMakeFiles/common.dir/ModBasicMath.F90.o
[ 10%] Building Fortran object common/CMakeFiles/common.dir/ModDataStruct.F90.o
[ 12%] Building Fortran object common/CMakeFiles/common.dir/ModSphpk.F90.o
[ 14%] Building Fortran object common/CMakeFiles/common.dir/ModQuadRule.F90.o
[ 16%] Building Fortran object common/CMakeFiles/common.dir/ModConf.F90.o
[ 18%] Building Fortran object common/CMakeFiles/common.dir/ModPolarPatch.F90.o
[ 20%] Building Fortran object common/CMakeFiles/common.dir/ModWall.F90.o
[ 22%] Building Fortran object common/CMakeFiles/common.dir/ModSpline.F90.o
/Users/suzanm/Library/CloudStorage/OneDrive-GeorgiaInstituteofTechnology/cpg/repos/RBC3D/common/ModConf.F90:142:78:

  142 |       call MPI_Send(lenname, 1, MPI_Integer, 0, 1, MPI_Comm_World, stat, ierr)
      |                                                                              1
Error: There is no specific subroutine for the generic 'mpi_send' at (1)
/Users/suzanm/Library/CloudStorage/OneDrive-GeorgiaInstituteofTechnology/cpg/repos/RBC3D/common/ModConf.F90:143:90:

  143 |       call MPI_Send(machinename, lenname, MPI_Character, 0, 1, MPI_Comm_World, stat, ierr)
      |                                                                                          1
Error: There is no specific subroutine for the generic 'mpi_send' at (1)
make[2]: *** [common/CMakeFiles/common.dir/ModConf.F90.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [common/CMakeFiles/common.dir/all] Error 2
make: *** [all] Error 2
sbryngelson commented 2 months ago

I disagree and think support for conventional desktops/laptops (at least *nix-based ones; Windows is... something else entirely) is critical. Not every simulation needs to be large. MFC supports laptops even though some of its simulations require exascale computers. The idea is that most users will run on their laptops before moving to a cluster.

For your issue above, this looks like a compilation problem, perhaps with how you installed your MPI wrapper via brew. It's hard to say for sure, but it doesn't appear to be a complicated bug from this snippet.

sbryngelson commented 2 months ago

I'm fine if you split the PR into two, one for each compartmentalized issue, but this seems important, and you are close to fixing it.

suzanmanasreh commented 2 months ago

I know my brew-installed GCC and Open MPI or MPICH (I'm not sure which one it's actually using) work for C++ MPI code, but I've never used them to build Fortran with MPI. I'll see if I can get help with that.
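As a quick sanity check of the brew toolchain before rebuilding RBC3D, a minimal MPI program along these lines should compile with mpif90 and run with mpirun (the file name hello_mpi.f90 is just a placeholder, not part of the repo):

! hello_mpi.f90 -- minimal check that the brew-installed MPI Fortran wrapper works
! build and run with: mpif90 hello_mpi.f90 -o hello_mpi && mpirun -np 4 ./hello_mpi
program hello_mpi
  use mpi
  implicit none
  integer :: ierr, rank, nprocs

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
  print '(a,i0,a,i0)', 'hello from rank ', rank, ' of ', nprocs
  call MPI_Finalize(ierr)
end program hello_mpi

If that runs across 4 ranks, the wrapper itself is fine and the problem is in how RBC3D calls MPI.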

suzanmanasreh commented 2 months ago

It turns out the stat variable isn't an argument to MPI_Send (only MPI_Recv takes a status). Fixed that, and RBC3D's examples/case is running on my Mac right now with 4 processors, though it's drawing a lot of power; my CPU is at about 95% utilization. @sbryngelson do you think this PR needs a smaller example case to go with it? Also, resolving link options between the clusters and macOS is really hard.
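For reference, a sketch of the fix in ModConf.F90, assuming the only change is dropping the extra status argument (in Fortran, MPI_Send takes buf, count, datatype, dest, tag, comm, ierror):

      ! status argument removed; it belongs to MPI_Recv, not MPI_Send
      call MPI_Send(lenname, 1, MPI_Integer, 0, 1, MPI_Comm_World, ierr)
      call MPI_Send(machinename, lenname, MPI_Character, 0, 1, MPI_Comm_World, ierr)

With the argument counts correct, the calls match the generic MPI_Send interface and gfortran no longer rejects them.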

suzanmanasreh commented 2 months ago

Question: if MFC uses GPUs, how does it run on Macs?

sbryngelson commented 2 months ago

It can use either; it doesn't require a GPU.