uDALES / u-dales

uDALES: large-eddy-simulation software for urban flow, dispersion and microclimate modelling
https://udales.github.io/u-dales
GNU General Public License v3.0

JOSS review: comments on installation and running #138

Closed by wimvanderbauwhede 3 years ago

wimvanderbauwhede commented 3 years ago

I went through the Getting Started guide and managed to install and run everything fine, but here are some comments on installation and running that may help you improve the documentation.

The "Getting Started" guide does not tell me where to get the code.

So I cloned the master branch from

https://github.com/uDALES/u-dales.git
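For anyone following along, the clone step I used was just the standard git workflow; a minimal sketch (branch and layout as at the time of this review):

```sh
# Clone the default (master) branch and enter the repository
git clone https://github.com/uDALES/u-dales.git
cd u-dales
```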

Singularity

Singularity-based installation requires golang and some other packages. The instructions for installing golang are messy; I just used the Ubuntu packages and that worked fine (see the sketch below).
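For reference, "used the Ubuntu packages" means roughly the following; the package name `golang-go` is an assumption and may differ between Ubuntu releases:

```sh
# Install Go from the Ubuntu repositories instead of building it from source
# (package name golang-go is an assumption; check your release)
sudo apt-get update
sudo apt-get install golang-go
```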

"To build and download the Singularity image use:"

singularity build --remote tools/singularity/image.sif tools/singularity/image.def

FATAL:   Unable to submit build job: no authentication token, log in with `singularity remote login`

So I did

singularity remote login

It asks me to create a token and paste it.

I did this, and I got:

Access Token: 

    FATAL:   while verifying token: error response from server: Invalid Credentials

What worked:

singularity remote login --tokenfile ./sylabs-token 
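For clarity, the full sequence that worked for me was along these lines; the token value and the file name `./sylabs-token` are of course placeholders (the token itself is created on the Sylabs Cloud website):

```sh
# Save the access token to a file, then log in with --tokenfile
# instead of pasting the token interactively
echo "PASTE-YOUR-ACCESS-TOKEN-HERE" > ./sylabs-token
singularity remote login --tokenfile ./sylabs-token
```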

So once again I did

singularity build --remote tools/singularity/image.sif tools/singularity/image.def

That takes a while. Some indication of how long it may take would be welcome.

"then, to install uDALES use:"

./tools/singularity/udales_build.sh 2 Release

Ditto

"Finally, to run an example case use:"

./tools/singularity/udales_run.sh 2 Release examples/001 namoptions.001

That works, but an explanation of how to inspect the results would be very useful.

Prerequisites

That should maybe say "Prerequisites when not using Singularity"

The Linux/WSL (Ubuntu) instructions work fine, but they could have come earlier: I had already followed the link to the netCDF website, downloaded netcdf-c and netcdf-fortran, and was about to start building them from source when I thought there must be a better way.
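For other reviewers on Ubuntu/WSL, the netCDF libraries can be pulled in from the package manager rather than built from source; a sketch, assuming the standard Ubuntu package names (check the prerequisites list in the docs):

```sh
# Install the netCDF C and Fortran development libraries from the
# Ubuntu repositories (package names assumed)
sudo apt-get install libnetcdf-dev libnetcdff-dev
```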

dmey commented 3 years ago

Thanks for the feedback! I'd be happy to create a PR that explains how to clone the repository and clarifies the prerequisites section for the case when Singularity is not used. With regards to the installation of Singularity I am not sure: as it is a separate dependency, we currently assume that information on how to install and set up Singularity is sourced directly from Singularity's website (its dependencies and instructions may change at any time). How would you feel if, instead of adding specific documentation about this, we were to link to Singularity's website and ask the user to refer to that for up-to-date information about its installation and use?

With regards to "an explanation of how to inspect the results would be very useful", we have a section at the end of the page (https://udales.github.io/u-dales/udales-getting-started/#whats-next) that takes you to simulate and view the results -- did you find this too hidden away? We wanted to avoid duplication but maybe there is a better place to add this information. @tomgrylls @samoliverowens any thoughts/preferences?

wimvanderbauwhede commented 3 years ago

Re "an explanation of how to inspect the results would be very useful": I found a way. I searched the web for a netcdf viewer and found Paraview and Panoply. Panoply works just fine to visualise the results.

dmey commented 3 years ago

Re "an explanation of how to inspect the results would be very useful": I found a way. I searched the web for a netcdf viewer and found Paraview and Panoply. Panoply works just fine to visualise the results.

Great idea -- I will add them to the docs as well 👍

samoliverowens commented 3 years ago

ncview can also be used to visualise the results quickly.
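For instance, something along these lines; the file name below is only an illustration of the pattern, as the actual output names depend on the case set-up and the enabled output switches:

```sh
# Quick look at a NetCDF output file from the example run with ncview
# (path and file name are hypothetical)
ncview examples/001/fielddump.001.nc
```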

dmey commented 3 years ago

> ncview can also be used to visualise the results quickly.

👍. Added in 0d36037

ashwinvis commented 3 years ago

For the record, I had a similar experience as @wimvanderbauwhede with the singularity build --remote ... instruction. I wonder if the remote option is required. I managed to get the build to go forward by skipping it and executing sudo singularity build ....
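Concretely, the local (privileged) build I ran was along these lines, using the same image definition as in the docs, just without `--remote`:

```sh
# Build the image locally with elevated privileges instead of the remote
# builder (requires sudo / root on the build machine)
sudo singularity build tools/singularity/image.sif tools/singularity/image.def
```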

dmey commented 3 years ago

@ashwinvis thanks for the feedback -- we have updated the docs in https://github.com/uDALES/u-dales/pull/139. With regards to the specifics of Singularity, we ask users to refer to the Singularity docs, as this is not something specific to uDALES. With regards to the remote build option, we wanted something that works on most systems, and that is what the remote option does. As you pointed out, if you can run a program with elevated privileges you don't have to use the remote build, but in most HPC/shared environments this is often not the case.

ashwinvis commented 3 years ago

Fair enough. I also tried the normal cmake and make route today, and it did not go through. I ran it on Arch Linux; here are the dependencies and the build log:

Dependencies:

```
community/netcdf-fortran 4.5.3-1 [installed]
    NetCDF fortran bindings
community/netcdf-openmpi 4.7.4-1 [installed]
    network Common Data Form interface for array-oriented data access and corresponding library with parallel support (openmpi version)
extra/openmpi 4.0.5-2 [installed]
    High performance message passing library (MPI)
local/nco 4.9.2-2 [installed]
    netCDF Operators allow users to manipulate and analyse data stored in NetCDF files
core/gcc-fortran 10.2.0-4 [installed: 10.2.0-6]
    usr/bin/gfortran
extra/cmake 3.19.7-1 [installed]
    A cross-platform open-source make system
extra/python 3.9.2-1 [installed]
    Next generation of the python high-level scripting language
extra/python-pip 20.3.1-1 [installed]
    The PyPA recommended tool for installing Python packages
```
Build log:

```
❯ make
[  1%] Performing update step for 'fishpack-cmake-project'
[  3%] No patch step for 'fishpack-cmake-project'
[  5%] Performing configure step for 'fishpack-cmake-project'
-- Configuring done
-- Generating done
-- Build files have been written to: /home/avmo/src/sandbox/u-dales/build/release/external/fishpack-cmake-project-prefix/src/fishpack-cmake-project-build
[  7%] Performing build step for 'fishpack-cmake-project'
[ 92%] Built target objlib
[ 96%] Built target fishpack_static
[100%] Built target fishpack_shared
[  9%] No install step for 'fishpack-cmake-project'
[ 11%] Completed 'fishpack-cmake-project'
[ 15%] Built target fishpack-cmake-project
[ 17%] Performing update step for 'vfftpack-cmake-project'
[ 19%] No patch step for 'vfftpack-cmake-project'
[ 21%] Performing configure step for 'vfftpack-cmake-project'
-- Configuring done
-- Generating done
-- Build files have been written to: /home/avmo/src/sandbox/u-dales/build/release/external/vfftpack-cmake-project-prefix/src/vfftpack-cmake-project-build
[ 23%] Performing build step for 'vfftpack-cmake-project'
[ 93%] Built target objlib
[ 96%] Built target vfftpack_static
[100%] Built target vfftpack_shared
[ 25%] No install step for 'vfftpack-cmake-project'
[ 26%] Completed 'vfftpack-cmake-project'
[ 30%] Built target vfftpack-cmake-project
[ 32%] Building Fortran object CMakeFiles/u-dales.dir/src/initfac.f90.o
/home/avmo/src/sandbox/u-dales/src/initfac.f90:212:23:

  195 | call MPI_BCAST(nwalltypes, 1, MPI_Integer, 0, comm3d, mpierr)
      |                       2
......
  212 | call MPI_BCAST(walltypes, nwallprops*nwalltypes, MY_REAL, 0, comm3d, mpierr)
      |                       1
Error: Type mismatch between actual argument at (1) and actual argument at (2) (REAL(8)/INTEGER(4)).

[... the same pattern of errors is reported for the MPI_BCAST calls at initfac.f90 lines
353, 355-367, 370-371, 375-383, 387-390 and 394: mostly "Type mismatch between actual
argument at (1) and actual argument at (2) (REAL(8)/INTEGER(4))", plus a
LOGICAL(4)/INTEGER(4) type mismatch at line 355 and rank mismatches (scalar vs rank-1/
rank-2) for the integer arrays at lines 353, 366 and 367 ...]

make[2]: *** [CMakeFiles/u-dales.dir/build.make:134: CMakeFiles/u-dales.dir/src/initfac.f90.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:118: CMakeFiles/u-dales.dir/all] Error 2
make: *** [Makefile:103: all] Error 2
```

Do you understand this?

dmey commented 3 years ago

Yes, we only support GNU compiler versions 9 and below -- you seem to be running 10.2.
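If a GCC 9 toolchain is installed alongside the newer one, pointing CMake at it when configuring a fresh build directory should be enough. A rough sketch (the gfortran-9/gcc-9 names are assumptions and depend on the distribution):

```sh
# Configure a fresh build directory against an older GNU toolchain
# (compiler names are assumptions; use whichever GCC <= 9 is available)
mkdir -p build && cd build
FC=gfortran-9 CC=gcc-9 cmake ..
make
```

For what it's worth, GCC 10 can demote these particular argument-mismatch errors back to warnings with -fallow-argument-mismatch, but sticking to a GCC <= 9 toolchain is the supported route.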

ashwinvis commented 3 years ago

@dmey That's probably it. I managed to compile in a HPC cluster. Thanks.