svalinn / Cubit-plugin

Plugins and command extensions for Coreform Cubit
BSD 3-Clause "New" or "Revised" License

Cubit 2020.2 compatibility -- Linux action and scripts #84

Closed · bam241 closed this 3 years ago

bam241 commented 3 years ago

This aims to:

  1. Modify the plugin implementation to build against Cubit 2020.2; see files:
    • CMakeLists.txt
    • CubitVersionCompatibility.hpp
    • SvalinnPlugin.cpp
    • export_dagmc_cmd/DAGMCExportCommand.cpp
    • iGeom/iGeom.cpp
    • iGeom/tests/iGeom_test.cpp
    • import_mcnp_cmd/MCNPImp.cpp
  2. Update the build scripts to allow building the plugin for Cubit 2020.2: scripts/*.sh
  3. Set up a GitHub Action to build both versions of the plugin: .github/workflows/linux.yml
bam241 commented 3 years ago

So the plugin built with Ubuntu 18.04 will work on 20.04?

@pshriwise, to answer your question: I think so. I'm not sure, though; I haven't actually tried it...

gonuke commented 3 years ago

Once the required PR is in for Windows, should we do a micro-release of DAGMC and then use that tag for your build process?

bam241 commented 3 years ago

Probably a good idea. I think we'll probably need a micro-release for PyNE as well, so we can use it for the DAGMC amalgamation...

pshriwise commented 3 years ago

Going to reiterate my comment from #76 here:

We have scripts for building the plugin for Linux, and most of the lines here are copied from there. Can those scripts be refactored so that we can source the linux_build_share.sh file and call the functions from there for each step in this file? It would be significantly easier to maintain one set of scripts that can be used in the GHAs as well.
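A minimal sketch of what that could look like; the step-function names are hypothetical placeholders, not the actual contents of scripts/linux_build_share.sh:

#!/bin/bash
# CI build step that sources the shared functions instead of duplicating them.
set -euo pipefail

source "$(dirname "$0")/linux_build_share.sh"

install_prerequisites   # one reusable function per build step (names assumed)
build_moab
build_plugin
package_plugin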

bam241 commented 3 years ago

Running an export command with Cubit 2020.2 and the new plugin, I got:

Warning! ***HDF5 library version mismatched error***
The HDF5 header files used to compile this application do not match
the version used by the HDF5 library to which this application is linked.
Data corruption or segmentation faults may occur if the application continues.
This can happen when an application was compiled by one version of HDF5 but
linked with a different version of static or shared HDF5 library.
You should recompile the application or check your shared library related
settings such as 'LD_LIBRARY_PATH'.
You can, at your own risk, disable this warning by setting the environment
variable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.
Setting it to 2 or higher will suppress the warning messages totally.
Headers are 1.10.4, library is 1.12.0

@gonuke @pshriwise shall we worry about it?
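One quick check for this kind of mismatch is to inspect which HDF5 shared object the bundled MOAB actually resolves at runtime; a sketch, with the install path taken from this thread's setup, plus the warning's own at-your-own-risk escape hatch:

# Which HDF5 shared library does the plugin's MOAB link against at runtime?
ldd /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5 | grep -i hdf5

# Escape hatch named in the warning itself -- use at your own risk:
export HDF5_DISABLE_VERSION_CHECK=1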

bam241 commented 3 years ago

Trying to export, I also got a segfault. Here is the trace:

#0  0x00007ffff5ed318b in raise () from /lib/x86_64-linux-gnu/libc.so.6
#1  0x00007ffff5eb2859 in abort () from /lib/x86_64-linux-gnu/libc.so.6
#2  0x00007fffe4f6152b in H5check_version (majnum=1, minnum=10, relnum=4) at H5.c:782
#3  0x00007fffa64e3b7a in mhdf_createFile () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#4  0x00007fffa64b5519 in moab::WriteHDF5::serial_create_file(char const*, bool, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&, moab::TagInfo* const*, int, int) () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#5  0x00007fffa64a7c8a in moab::WriteHDF5::write_file_impl(char const*, bool, moab::FileOptions const&, unsigned long const*, int, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&, moab::TagInfo* const*, int, int) () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#6  0x00007fffa64a7207 in moab::WriteHDF5::write_file(char const*, bool, moab::FileOptions const&, unsigned long const*, int, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&, moab::TagInfo* const*, int, int) () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#7  0x00007fffa6190a8f in moab::Core::write_file(char const*, char const*, char const*, moab::Range const&, moab::TagInfo* const*, int) () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#8  0x00007fffa6190606 in moab::Core::write_file(char const*, char const*, char const*, unsigned long const*, int, moab::TagInfo* const*, int) () from /opt/Coreform-Cubit-2020.2/bin/plugins/svalinn/libMOAB.so.5
#9  0x00007ffff00a85b1 in DAGMCExportCommand::execute(CubitCommandData&) () from /opt/Coreform-Cubit-2020.2/bin/plugins/libsvalinn_plugin.so
#10 0x00007fffe92b3e79 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#11 0x00007fffe96b2a96 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#12 0x00007fffe92b6778 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#13 0x00007fffe9275c3a in ?? () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#14 0x00007fffe92781f5 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#15 0x00007fffe9168b8e in CubitInterface::cmd(char const*) () from /opt/Coreform-Cubit-2020.2/bin/libcubiti19.so
#16 0x00007fffa5792fb1 in ?? () from /opt/Coreform-Cubit-2020.2/bin/_cubit2.so
#17 0x00007ffff126f32f in call_function (oparg=<optimized out>, pp_stack=0x7fffffffce20) at Python/ceval.c:4376
#18 PyEval_EvalFrameEx (f=f@entry=0x7ffff0f53210, throwflag=throwflag@entry=0) at Python/ceval.c:3013
#19 0x00007ffff12709c5 in fast_function (nk=<optimized out>, na=<optimized out>, n=1, pp_stack=0x7fffffffcf30, func=0x7fffa5dad9d0) at Python/ceval.c:4461
#20 call_function (oparg=<optimized out>, pp_stack=0x7fffffffcf30) at Python/ceval.c:4396
#21 PyEval_EvalFrameEx (f=f@entry=0x7ffff101c8c0, throwflag=throwflag@entry=0) at Python/ceval.c:3013
#22 0x00007ffff1271208 in PyEval_EvalCodeEx (co=co@entry=0x7ffff0f48230, globals=globals@entry=0x7ffff102d050, locals=locals@entry=0x7ffff102d050, args=args@entry=0x0, argcount=argcount@entry=0, kws=kws@entry=0x0, kwcount=0,
    defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:3608
#23 0x00007ffff12713bd in PyEval_EvalCode (co=co@entry=0x7ffff0f48230, globals=globals@entry=0x7ffff102d050, locals=locals@entry=0x7ffff102d050) at Python/ceval.c:669
#24 0x00007ffff1296a8a in run_mod (arena=0x55555614aee0, flags=<optimized out>, locals=0x7ffff102d050, globals=0x7ffff102d050, filename=0x7ffff12c7175 "<string>", mod=<optimized out>) at Python/pythonrun.c:1385
#25 PyRun_StringFlags (str=0x7ffff1088f74 "cubit.cmd('export dagmc \"/home/mouginot/work/FRENSIE-tests/neutron_photon/bare_sphere/sphere.h5m\" faceting_tolerance 1.e-4')\n", start=start@entry=257,
    globals=globals@entry=0x7ffff102d050, locals=locals@entry=0x7ffff102d050, flags=<optimized out>) at Python/pythonrun.c:1348
#26 0x00007ffff126ad13 in exec_statement (locals=0x7ffff102d050, globals=0x7ffff102d050, prog=<optimized out>, f=0x7ffff0ff29b0) at Python/ceval.c:5126
#27 PyEval_EvalFrameEx (f=f@entry=0x7ffff0ff29b0, throwflag=throwflag@entry=0) at Python/ceval.c:2122
#28 0x00007ffff1271208 in PyEval_EvalCodeEx (co=<optimized out>, globals=<optimized out>, locals=locals@entry=0x0, args=<optimized out>, argcount=<optimized out>, kws=0x7ffff0ff33a8, kwcount=0, defs=0x0, defcount=0,
    closure=0x0) at Python/ceval.c:3608
#29 0x00007ffff126ddce in fast_function (nk=<optimized out>, na=<optimized out>, n=<optimized out>, pp_stack=0x7fffffffd320, func=0x7ffff101eed0) at Python/ceval.c:4471
#30 call_function (oparg=<optimized out>, pp_stack=0x7fffffffd320) at Python/ceval.c:4396
#31 PyEval_EvalFrameEx (f=f@entry=0x7ffff0ff3210, throwflag=throwflag@entry=0) at Python/ceval.c:3013
#32 0x00007ffff12709c5 in fast_function (nk=<optimized out>, na=<optimized out>, n=2, pp_stack=0x7fffffffd430, func=0x7ffff0f472d0) at Python/ceval.c:4461
#33 call_function (oparg=<optimized out>, pp_stack=0x7fffffffd430) at Python/ceval.c:4396
#34 PyEval_EvalFrameEx (f=f@entry=0x7ffff101c710, throwflag=throwflag@entry=0) at Python/ceval.c:3013
#35 0x00007ffff1271208 in PyEval_EvalCodeEx (co=co@entry=0x7ffff0ff47b0, globals=globals@entry=0x7ffff10f1170, locals=locals@entry=0x7ffff10f1170, args=args@entry=0x0, argcount=argcount@entry=0, kws=kws@entry=0x0, kwcount=0,
    defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:3608
#36 0x00007ffff12713bd in PyEval_EvalCode (co=co@entry=0x7ffff0ff47b0, globals=globals@entry=0x7ffff10f1170, locals=locals@entry=0x7ffff10f1170) at Python/ceval.c:669
#37 0x00007ffff1296a8a in run_mod (arena=0x55555614e0c0, flags=0x7ffff10f1170, locals=0x7ffff10f1170, globals=0x7ffff10f1170, filename=0x7ffff12c7175 "<string>", mod=<optimized out>) at Python/pythonrun.c:1385
#38 PyRun_StringFlags (str=str@entry=0x555555fe6b28 "bhelper.instance().run_code('cubit.cmd(\\'export dagmc \"/home/mouginot/work/FRENSIE-tests/neutron_photon/bare_sphere/sphere.h5m\" faceting_tolerance 1.e-4\\')\\n')",
    start=start@entry=257, globals=0x7ffff10f1170, locals=0x7ffff10f1170, flags=flags@entry=0x0) at Python/pythonrun.c:1348
#39 0x00007ffff129868f in PyRun_SimpleStringFlags (
    command=0x555555fe6b28 "bhelper.instance().run_code('cubit.cmd(\\'export dagmc \"/home/mouginot/work/FRENSIE-tests/neutron_photon/bare_sphere/sphere.h5m\" faceting_tolerance 1.e-4\\')\\n')", flags=0x0)
    at Python/pythonrun.c:983
#40 0x00007ffff7e8594f in ?? () from /opt/Coreform-Cubit-2020.2/bin/libclarofw.so
#41 0x00007ffff7e8ca81 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libclarofw.so
#42 0x00007ffff7e8b8d4 in ?? () from /opt/Coreform-Cubit-2020.2/bin/libclarofw.so
#43 0x00007ffff7e899e2 in Framework::run_app() () from /opt/Coreform-Cubit-2020.2/bin/libclarofw.so
#44 0x0000555555560c3b in ?? ()
#45 0x00007ffff5eb40b3 in __libc_start_main () from /lib/x86_64-linux-gnu/libc.so.6
#46 0x0000555555560f8e in ?? ()
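For reference, a trace like the one above can be captured by launching Cubit under gdb and reproducing the export; a sketch, with the launcher binary name assumed rather than confirmed:

# Launcher name below is an assumption; adjust to your Cubit install.
gdb /opt/Coreform-Cubit-2020.2/bin/coreform_cubit
(gdb) run
# ... trigger the export command until it aborts, then:
(gdb) bt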
bam241 commented 3 years ago

Wondering if the two are related.

pshriwise commented 3 years ago

shall we worry about it?

Yes. We should worry about it. The first step is to figure out whether this is a clash between the HDF5 libraries shipped with Cubit and the one used to build DAGMC, I think.

Running an export command with Cubit 2020.2 and the new plugin, I got:

What Ubuntu version?

bam241 commented 3 years ago

What Ubuntu version?

20.04

bam241 commented 3 years ago

Yes. We should worry about it. The first step is to figure out whether this is a clash between the HDF5 libraries shipped with Cubit and the one used to build DAGMC, I think.

I'll try to uninstall my HDF5 package and see what happens.

bam241 commented 3 years ago

After removing all HDF5 apt packages from my computer, I still get the warning and the segfault.

pshriwise commented 3 years ago

We include the HDF5 libraries with the plugin, right? So it's still going to find those.

pshriwise commented 3 years ago

Good to see all the work going into relying on our bash scripts for GHA. I'm going to unsubscribe from updates here until the work stabilizes a bit. Feel free to ping me when this is ready for another look @bam241!

bam241 commented 3 years ago

@pshriwise @gonuke this seems to be working properly (I have only tested the Cubit 2020.2 build from the GitHub Action on my local desktop). What I did so far:

I didn't find a way to make this work without the tweak to the MOAB CMake file for HDF5. I didn't investigate further, but it seems to me that the FindHDF5_MOAB.cmake shipped with the MOAB source code has problems in it...
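For comparison, one way to steer MOAB's HDF5 detection from the outside, without patching FindHDF5_MOAB.cmake, is to pass an explicit root at configure time; a sketch, assuming a MOAB build directory and a placeholder HDF5 install prefix (the PR's actual tweak may differ):

# Point MOAB's configure step at the HDF5 meant for the plugin build;
# the install prefix below is illustrative only.
cmake ../moab \
      -DENABLE_HDF5=ON \
      -DHDF5_ROOT=/path/to/plugin/hdf5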

bam241 commented 3 years ago

I just tested the Trelis 17.1 plugin; it seems to be working as well.

Both have been tested on: https://github.com/bam241/FRENSIE-tests/blob/neutron_photon/neutron_photon/bare_cube/trelis_script_pb

pshriwise commented 3 years ago

Getting a Docker error about the CMake version when building HDF5 with Ubuntu 18.04.

+ cmake ../hdf5 -DBUILD_SHARED_LIBS:BOOL=ON
CMake Error at CMakeLists.txt:1 (cmake_minimum_required):
  CMake 3.12 or higher is required.  You are running version 3.10.2
bam241 commented 3 years ago

Getting a Docker error about the CMake version when building HDF5 with Ubuntu 18.04.

+ cmake ../hdf5 -DBUILD_SHARED_LIBS:BOOL=ON
CMake Error at CMakeLists.txt:1 (cmake_minimum_required):
  CMake 3.12 or higher is required.  You are running version 3.10.2

only tried with 20.04...

pshriwise commented 3 years ago

From what I can see, Coreform has updated its package names for Cubit 2020.2 from Coreform-Cubit-2020.2-Lin64.deb to Coreform-Cubit-2020.2-Ubuntu20.deb. I'm wondering if it would be cleaner to require the Trelis Debian package and SDK filepaths to avoid problems like this in the future.

bam241 commented 3 years ago

I am not sure I understand what you mean...

pshriwise commented 3 years ago

The name of the downloaded Debian package file has changed on their site:

[Screenshot from 2021-03-10 22-20-39: the Coreform download page]

bam241 commented 3 years ago

The name of the downloaded Debian package file has changed on their site:

[Screenshot from 2021-03-10 22-20-39: the Coreform download page]

OK, I got that part; I was unsure about what you suggested we do about it.

My understanding is that you're suggesting we rename the package name we have in the script.

pshriwise commented 3 years ago

My understanding is that you're suggesting we rename the package name we have in the script.

My thought was that we could have two arguments, one that points to the Trelis/Cubit package file and another that points to the SDK file. We wouldn't have any hardcoded names in the scripts, making it easier to adjust to new names for these files going forward.
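A sketch of that two-argument interface; the variable names and the SDK install step are illustrative, not what the scripts actually do:

#!/bin/bash
# No hardcoded package names -- both files are passed in by the caller.
CUBIT_PKG="$1"   # e.g. Coreform-Cubit-2020.2-Ubuntu20.deb
CUBIT_SDK="$2"   # path to the matching SDK archive

sudo dpkg -i "$CUBIT_PKG"
# SDK layout assumed here: a tarball unpacked over the Cubit install tree.
sudo tar -xzf "$CUBIT_SDK" --directory /opt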

bam241 commented 3 years ago

This would mean 4 arguments for the scripts (a package file and an SDK path for each of the two versions).

I honestly don't mind implementing it, but I'm wondering whether that isn't a lot of arguments?

pshriwise commented 3 years ago

I honestly don't mind implementing it, but I'm wondering whether that isn't a lot of arguments?

I don't find that to be too bad personally. The tradeoff in flexibility vs. verbosity seems worth it here, especially as new versions of Trelis/Cubit-Learn are released. It would be nice not to have to add blocks of code to these scripts every time there's a new version released.

bam241 commented 3 years ago

@pshriwise this should be good now.

Sorry for the delay.

pshriwise commented 3 years ago

I'm still getting that CMake error when building HDF5 1.12.0 on Ubuntu 18.04. If we can rely on the system HDF5 install when building the plugin on that OS, maybe we should stick to that?

bam241 commented 3 years ago

@pshriwise the last commit should have fixed the HDF5 issue.

Basically, it checks the Ubuntu version and installs HDF5 from apt-get when ubuntu_version <= 18; see the sketch below.
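A minimal sketch of that guard, with illustrative variable and helper names rather than the script's actual ones:

#!/bin/bash
ubuntu_version=$(lsb_release -rs | cut -d. -f1)
if [ "$ubuntu_version" -le 18 ]; then
    # 18.04's CMake (3.10) is too old to configure HDF5 1.12 from source,
    # so fall back to the distro package.
    sudo apt-get install -y libhdf5-dev
else
    build_hdf5_from_source   # hypothetical helper that builds HDF5 1.12.0
fi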

bam241 commented 3 years ago

Since your approval @pshriwise, I added the build folder to .gitignore for convenience...