amusecode / amuse

Astrophysical Multipurpose Software Environment. This is the main repository for AMUSE
http://www.amusecode.org
Apache License 2.0

Installing AMUSE on a Mac where Anaconda Python Distribution Installed #235

Open OrbitalMechanic opened 6 years ago

OrbitalMechanic commented 6 years ago

I'm interested in installing AMUSE on a Mac Pro (Late 2013) running macOS Sierra (10.12.6) with the Anaconda distribution of Python 3.6.5 installed. Is there a procedure for installing AMUSE under macOS with the Anaconda distribution of Python already installed that does not require the use of MacPorts?

Please advise.

ipelupessy commented 6 years ago

I'm not too familiar with Mac installs (or Anaconda) myself. @rieder: have you tried this?

rieder commented 6 years ago

I’m in the process of trying :). I’ll hopefully have something more to say later this week.

rieder commented 6 years ago

I tried this by installing the prerequisites through conda. Unfortunately, all the Fortran-based community codes fail to build (and the others build, but don't work properly). This is because the openmpi I installed (from conda-forge) doesn't have Fortran support, probably because it's built for the Xcode compilers, which lack a Fortran compiler (maybe this shouldn't be a blocking issue for installing any part of AMUSE, but that's a separate issue). If there is an openmpi (or mpich) with Fortran support that can be installed through conda, I guess it could work. I'll keep looking for now, but if anyone knows of such a package, please let us know.
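A quick way to probe this, sketched below: an MPI build with Fortran support normally ships Fortran compiler wrappers. This is only a heuristic check, and the wrapper names (mpifort/mpif90/mpif77) vary between OpenMPI and MPICH builds:

```shell
# Check whether the MPI in the active environment ships Fortran compiler
# wrappers -- a reasonable proxy for Fortran support in that MPI build.
for wrapper in mpifort mpif90 mpif77; do
    if command -v "$wrapper" >/dev/null 2>&1; then
        echo "Fortran wrapper found: $wrapper"
    fi
done
echo "wrapper check done"
```

If no wrapper turns up, the MPI was almost certainly built without Fortran bindings and the Fortran community codes cannot link against it.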

egpbos commented 6 years ago

Hm, I deduced from this that the conda-forge openmpi would be fortran enabled. Indeed the fortran codes build on my machine (High Sierra 10.13.5, Xcode 9.4.1, clean conda env with python2.7). However, I also have Macports gfortran installed, so I'm not 100% sure which fortran gets used.

egpbos commented 6 years ago

I also installed gfortran_osx-64, which seems to ensure that the conda Fortran compiler is used instead of the MacPorts one.
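One way to double-check which compiler a build would pick up, as a sketch: conda's compiler packages export FC into the activated environment, and otherwise the build falls back to whatever gfortran is first on PATH (the exact variable names depend on the compiler package version):

```shell
# Report which Fortran compiler a build would likely use:
# FC (if exported by a conda compiler package) takes precedence,
# otherwise the first gfortran on PATH is picked up.
echo "FC=${FC:-unset}"
if command -v gfortran >/dev/null 2>&1; then
    echo "gfortran on PATH: $(command -v gfortran)"
else
    echo "no gfortran on PATH"
fi
```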

rieder commented 6 years ago

If that works, great! Did you install gfortran_osx-64 through Anaconda as well?

egpbos commented 6 years ago

Yes, through conda-forge.

Not everything is working for me yet, though. I have some issues at run time with the dynamic linker not finding libraries, complaining with messages like:

dyld: Library not loaded: @rpath/libmpi.40.dylib
  Referenced from: /Users/pbos/src/git/amuse/src/amuse/community/athena/build/./a.out
  Reason: image not found

The library is there, and the linker doesn't complain during compilation, but some issues apparently pop up at execution time. I think one way to fix this is to add -Wl,-rpath,$CONDA_PREFIX/lib to the build commands for the failing executables (-rpath embeds the runtime library search path into the executable, if I understand correctly), but this is probably too invasive. Another "obvious" way would be to add the path to DYLD_LIBRARY_PATH, but this should be frowned upon (a "black screen" answer in QI ;) ). Possibly it is an issue with pkg-config... lemme try that.
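The rpath approach described above can be sketched as setting LDFLAGS once before building, so every link command embeds the conda env's lib dir and no DYLD_LIBRARY_PATH is needed. The fallback path below is a placeholder for when no env is active:

```shell
# Embed the conda env's lib dir as a runtime search path at link time.
# CONDA_PREFIX is set by `conda activate`; the fallback is illustrative only.
CONDA_PREFIX="${CONDA_PREFIX:-$HOME/miniconda3/envs/amuse}"
export LDFLAGS="-Wl,-rpath,${CONDA_PREFIX}/lib ${LDFLAGS:-}"
echo "LDFLAGS=${LDFLAGS}"
```

Any build run in the same shell after this would pass the rpath through to the linker.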

egpbos commented 6 years ago

Ok, just installing pkg-config does not magically fix everything.

I did find a way to add the necessary rpath to a complaining executable after the fact. One example is hermite0_worker:

cd src/amuse/community/hermite0
install_name_tool -add_rpath ${CONDA_PREFIX}/lib hermite0_worker

After this it works without complaining (well, it complains about MPI not being able to connect through my firewall, but that's a different issue, #243). So indeed, supplying the rpath at build time using -Wl,-rpath,$CONDA_PREFIX/lib in ~LD_FLAGS~ LDFLAGS should do the trick... let's try.

rieder commented 6 years ago

Issue #182 is related to (or possibly the same as) this one.

egpbos commented 6 years ago

Ok, so that indeed seems to have done the trick! I can now compile 43/46 codes and 5/6 libraries. Is there a list of non-built libraries somewhere?

And importantly: the built codes also seem to run without rpath problems.

egpbos commented 6 years ago

First testing results:

Core tests

nosetests -e 'grid_implementation' test/core_tests

The grid_implementation test is excluded, see #258.

Results:

...........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................F................................................................................./Users/pbos/src/git/amuse/src/amuse/test/amusetest.py:119: RuntimeWarning: overflow encountered in ulong_scalars
  is_one_zero = second_num * first_num == 0.0
.............................................................................
======================================================================
FAIL: test10 (test.core_tests.test_rotation.TestRotations)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/core_tests/test_rotation.py", line 162, in test10
    self.assertAlmostEquals(dot1,dot2,14)
  File "/Users/pbos/src/git/amuse/src/amuse/test/amusetest.py", line 99, in failUnlessAlmostEqual
    self._raise_exceptions_if_any(failures, first, second, '{0} != {1} within {2} places', msg, places)
  File "/Users/pbos/src/git/amuse/src/amuse/test/amusetest.py", line 64, in _raise_exceptions_if_any
    raise self.failureException(msg or err_fmt_string.format(first, second, *args))
AssertionError: 24.0 != 24.0 within 14 places
-------------------- >> begin captured stdout << ---------------------
test conservation of dot, transformation of cross

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
Ran 922 tests in 25.819s

FAILED (failures=1)

Report tests

nosetests -e 'test_speed' test/reports

The test_speed test fails as well, see again #258.

Results

----------------------------------------------------------------------
Ran 0 tests in 0.000s

OK

Seems a bit odd to have an almost empty testing directory.

egpbos commented 6 years ago

Ext tests

nosetests test/ext_tests

Results

......................................................................S....SE.EWriting output to: /Users/pbos/src/git/amuse/data/gaslactics/output/model7603316886328939275

 running dbh

dyld: Library not loaded: @rpath/libgfortran.3.dylib
  Referenced from: /Users/pbos/src/git/amuse/src/amuse/community/galactics/gas_src/bin/dbh
  Reason: image not found
Exception occurred in commit_parameters: dbh fail
-1 OrderedDictionary({})
E.....SSSSSSS......Checking parameters, calculating halo properties and initialising grid in r...
Done in 0.151459 seconds.
Initialising grid for distribution function...
Warning!
x = 6.9077553e+00 was above range of array! Index i = 2000 / n = 2001.
Array between -1.3815511e+01 and 6.9077553e+00.
Relative error (= 1.28577e-16) at upper boundary was within tolerance of 1.000000e-10
Index set to i = n-2 = 1999
Done in 2.2651 seconds.
Setting particle positions...
Done in 0.000552 seconds.
Setting particle velocities...
Done in 0.000489 seconds.
Setting remaining particle attributes...
Done in 8e-06 seconds
Calculating a few things, doing mass scaling and correct center of mass position and velocity...
Warning!
x = 5.0000000e-04 was below range of array! Index i = -1 / n = 2001.
Array between 1.0000000e+00 and 1.0010000e+00.
Done in 0.000203 seconds
Total time needed was 2.41781 seconds
...............E....SSS................................................................E.E........E............................S/Users/pbos/src/git/amuse/src/amuse/ext/static_potentials.py:184: RuntimeWarning: divide by zero encountered in log
  log_m_over_mb = numpy.log(mass_above/mass_below) * numpy.log(r/radius_below) / numpy.log(radius_above/radius_below)
/Users/pbos/src/git/amuse/src/amuse/ext/static_potentials.py:184: RuntimeWarning: invalid value encountered in double_scalars
  log_m_over_mb = numpy.log(mass_above/mass_below) * numpy.log(r/radius_below) / numpy.log(radius_above/radius_below)
.........S.SSSS.............................
======================================================================
ERROR: test1 (test.ext_tests.test_galactics_model.NewGalactICsModelTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_galactics_model.py", line 17, in test1
    do_scale = True)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 80, in __call__
    return _new_galactics_model(*args, code=code,**kwargs)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 18, in _new_galactics_model
    instance.generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/community/galactics/interface.py", line 883, in generate_particles
    result = self.overridden().generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 107, in __call__
    result = self.method(*list_arguments, **keyword_arguments)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 109, in __call__
    result = self.convert_result(result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 145, in convert_result
    return self.definition.convert_result(self, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 628, in convert_result
    return self.handle_return_value(method, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 571, in handle_as_unit
    unit.append_result_value(method, self, value, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 60, in append_result_value
    self.convert_result_value(method, definition, value)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 70, in convert_result_value
    definition.handle_errorcode(errorcode)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 548, in handle_errorcode
    raise exceptions.AmuseException("Error when calling '{0}' of a '{1}', errorcode is {2}".format(self.name, type(self.wrapped_object).__name__, errorcode), errorcode)
AmuseException: Error when calling 'generate_particles' of a 'GalactICs', errorcode is -4

======================================================================
ERROR: test5 (test.ext_tests.test_galactics_model.NewGalactICsModelTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_galactics_model.py", line 73, in test5
    do_scale = False)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 80, in __call__
    return _new_galactics_model(*args, code=code,**kwargs)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 18, in _new_galactics_model
    instance.generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/community/galactics/interface.py", line 883, in generate_particles
    result = self.overridden().generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 107, in __call__
    result = self.method(*list_arguments, **keyword_arguments)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 109, in __call__
    result = self.convert_result(result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 145, in convert_result
    return self.definition.convert_result(self, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 628, in convert_result
    return self.handle_return_value(method, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 571, in handle_as_unit
    unit.append_result_value(method, self, value, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 60, in append_result_value
    self.convert_result_value(method, definition, value)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 70, in convert_result_value
    definition.handle_errorcode(errorcode)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 548, in handle_errorcode
    raise exceptions.AmuseException("Error when calling '{0}' of a '{1}', errorcode is {2}".format(self.name, type(self.wrapped_object).__name__, errorcode), errorcode)
AmuseException: Error when calling 'generate_particles' of a 'GalactICs', errorcode is -4

======================================================================
ERROR: test6 (test.ext_tests.test_galactics_model.NewGalactICsModelTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_galactics_model.py", line 113, in test6
    do_scale = False, reuse_cached_model=False, verbose=True)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 80, in __call__
    return _new_galactics_model(*args, code=code,**kwargs)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/galactics_model.py", line 18, in _new_galactics_model
    instance.generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/community/galactics/gas_interface.py", line 753, in generate_particles
    result = self.overridden().generate_particles()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 105, in __call__
    object = self.precall()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 148, in precall
    return self.definition.precall(self)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 355, in precall
    transition.do()
  File "/Users/pbos/src/git/amuse/src/amuse/support/state.py", line 124, in do
    self.method.new_method()()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 107, in __call__
    result = self.method(*list_arguments, **keyword_arguments)
  File "/Users/pbos/src/git/amuse/src/amuse/community/galactics/gas_interface.py", line 745, in commit_parameters
    self.overridden().commit_parameters()
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 107, in __call__
    result = self.method(*list_arguments, **keyword_arguments)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 109, in __call__
    result = self.convert_result(result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/methods.py", line 145, in convert_result
    return self.definition.convert_result(self, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 628, in convert_result
    return self.handle_return_value(method, result)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 564, in handle_as_unit
    return self.return_units.convert_result_value(method, self, return_value)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 70, in convert_result_value
    definition.handle_errorcode(errorcode)
  File "/Users/pbos/src/git/amuse/src/amuse/support/interface.py", line 548, in handle_errorcode
    raise exceptions.AmuseException("Error when calling '{0}' of a '{1}', errorcode is {2}".format(self.name, type(self.wrapped_object).__name__, errorcode), errorcode)
AmuseException: Error when calling 'commit_parameters' of a 'GaslactICs', errorcode is -1
-------------------- >> begin captured stdout << ---------------------
adopted galaxy model parameters:
bulge_cutoff_potential: -20.0 length**2 / (time**2)
bulge_cutoff_radius: 7.0 length
bulge_density_parameter: 5.5 mass / (length**3)
bulge_do_center_flag: False
bulge_number_of_particles: 1000
bulge_random_seed: 12345678
bulge_scale_radius: 0.8 length
bulge_streaming_fraction: 0.5
bulge_type_parameter: 0
bulge_velocity_dispersion: 1.0 length / time
disk_central_radial_vdisp_over_z_vdisp: 1.0
disk_do_center_flag: False
disk_mass: 50.0
disk_number_of_particles: 1000
disk_outer_radius: 13.0 length
disk_random_seed: 98765432
disk_scale_height: 0.6 length
disk_scale_length: 3.0 length
disk_scale_length_of_radial_vdisp: 3.0 length
disk_truncation_dr: 1.0 length
disk_type_parameter: 0
gas_disk_gamma: 1.0
gas_disk_mass: 5.0
gas_disk_max_z: 5.0 length
gas_disk_number_of_particles: 1000
gas_disk_number_of_radial_bins: 50
gas_disk_outer_radius: 26.0 length
gas_disk_random_seed: 543212345
gas_disk_scale_length: 3.0 length
gas_disk_sound_speed: 0.12 length / time
gas_disk_truncation_dr: 1.0 length
gas_disk_velocity_dispersion: 0.0
halo_Ra: 4.0 length
halo_central_potential: -28.0 length**2 / (time**2)
halo_coreparam: 0.0
halo_density_parameter: 0.01 mass / (length**3)
halo_do_center_flag: False
halo_einasto_mass: 400.0
halo_einasto_nindex: 5.0
halo_number_of_particles: 1000
halo_q: 1.0
halo_random_seed: -1
halo_scale_radius: 13.7 length
halo_streaming_fraction: 0.5
halo_type_parameter: 3
halo_v0: 2.5 length / time
halo_virial_radius: 40.0 length
number_of_grid_intervals: 20000
number_of_iterations: 12
number_of_radial_steps_correction_fns: 200
order_of_multipole_expansion: 0
output_directory: /Users/pbos/src/git/amuse/data/gaslactics/output
radial_grid_delta_r: 0.025 length
reuse_cached_model: False

generating galaxy model, this may take a while...

--------------------- >> end captured stdout << ----------------------
-------------------- >> begin captured logging << --------------------
amuse.rfi.channel: DEBUG: got 1 strings of size [48]
amuse.rfi.channel: DEBUG: got 1 strings of size [48], data = [u'/Users/pbos/src/git/amuse/data/gaslactics/output']
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test4 (test.ext_tests.test_jobserver.TestRemoteCode)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_jobserver.py", line 83, in test4
    remote=RemoteCodeInterface(channel_type="sockets")
  File "/Users/pbos/src/git/amuse/src/amuse/ext/job_server.py", line 94, in __init__
    PythonCodeInterface.__init__(self, RemoteCodeImplementation, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 1070, in __init__
    CodeInterface.__init__(self, name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 711, in __init__
    self._start(name_of_the_worker = name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 1086, in _start
    CodeInterface._start(self, name_of_the_worker = name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 739, in _start
    self.channel.start()
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 2290, in start
    self.socket, address = self.accept_worker_connection(server_socket, self.process)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 2218, in accept_worker_connection
    raise exceptions.CodeException('could not connect to worker, worker process terminated')
CodeException: could not connect to worker, worker process terminated
-------------------- >> begin captured logging << --------------------
amuse.rfi.channel: DEBUG: initializing SocketChannel with options {'channel_type': 'sockets', 'dynamic_python_code': True}
amuse.rfi.channel: DEBUG: full name of worker is /Users/pbos/src/git/amuse/job_server
amuse.rfi.channel: DEBUG: starting socket worker process, listening for worker connection on ('0.0.0.0', 59369)
amuse.rfi.channel: DEBUG: mpi_enabled: True
amuse.rfi.channel: DEBUG: starting process with command `/Users/pbos/sw/miniconda3/envs/amuse/bin/mpiexec`, arguments `['/Users/pbos/sw/miniconda3/envs/amuse/bin/mpiexec', '-n', '1', '/Users/pbos/sw/miniconda3/envs/amuse/bin/python', '/Users/pbos/src/git/amuse/src/amuse/rfi/run_command_redirected.pyc', '/dev/null', '/dev/null', '/Users/pbos/src/git/amuse/job_server', '59369', 'ESLT0072', 'true']` and environment '{'_': '/Users/pbos/sw/miniconda3/envs/amuse/bin/nosetests', 'CONDA_PYTHON_EXE': '/Users/pbos/sw/miniconda3/bin/python', 'LESS': '-F -g -i -M -R -S -w -X -z-4', 'FC': '/Users/pbos/sw/miniconda3/envs/amuse/bin/x86_64-apple-darwin13.4.0-gfortran', 'OMPI_MCA_rmaps_base_oversubscribe': 'yes', 'TERM_PROGRAM_VERSION': '3.1.6', 'PKG_CONFIG_PATH': '/Users/pbos/sw/lib/pkgconfig', 'FORTRANFLAGS': '-march=nocona -mtune=core2 -ftree-vectorize -fPIC -fstack-protector -O2 -pipe', 'LOGNAME': 'pbos', 'USER': 'pbos', 'PATH': '/Users/pbos/sw/miniconda3/envs/amuse/bin:/Users/pbos/sw/miniconda3/bin:/opt/local/bin:/opt/local/sbin:/usr/local/bin:/usr/local/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/Library/TeX/texbin:/usr/local/MacGPG2/bin:/opt/X11/bin', 'HOME': '/Users/pbos', 'DISPLAY': '/private/tmp/com.apple.launchd.AAnKNWQE98/org.macosforge.xquartz:0', 'TERM_PROGRAM': 'iTerm.app', 'LANG': 'en_US.UTF-8', 'LESS_TERMCAP_se': '\x1b[0m', 'TERM': 'xterm-256color', 'Apple_PubSub_Socket_Render': '/private/tmp/com.apple.launchd.h55hb6of6R/Render', 'COLORFGBG': '12;8', 'SHLVL': '1', 'LESS_TERMCAP_me': '\x1b[0m', 'LESS_TERMCAP_md': '\x1b[01;31m', 'CONDA_PREFIX': '/Users/pbos/sw/miniconda3/envs/amuse', 'LESS_TERMCAP_mb': '\x1b[01;31m', 'XPC_FLAGS': '0x0', 'ITERM_SESSION_ID': 'w0t1p0:3984955F-3D6A-40FA-A4B8-6005D882148A', 'EDITOR': 'vim', 'CONDA_DEFAULT_ENV': 'amuse', 'FFLAGS': '-march=nocona -mtune=core2 -ftree-vectorize -fPIC -fstack-protector -O2 -pipe', 'GREP_COLOR': '37;45', 'JAVA_HOME': '/Library/Internet Plug-Ins/JavaAppletPlugin.plugin/Contents/Home', 'TERM_SESSION_ID': 
'w0t1p0:3984955F-3D6A-40FA-A4B8-6005D882148A', 'XPC_SERVICE_NAME': '0', 'GREP_COLORS': 'mt=37;45', 'PYTHONPATH': '/Users/pbos/code/python_modules', 'SSH_AUTH_SOCK': '/private/tmp/com.apple.launchd.x3PBS16MJa/Listeners', 'POWERLEVEL9K_MODE': 'nerdfont-complete', 'CONDA_PROMPT_MODIFIER': '', 'VISUAL': 'vim', 'SHELL': '/bin/zsh', 'GFORTRAN': '/Users/pbos/sw/miniconda3/envs/amuse/bin/x86_64-apple-darwin13.4.0-gfortran', 'ITERM_PROFILE': 'Default', 'CONDA_SHLVL': '1', 'TMPDIR': '/var/folders/_k/2ysbjq6j6cz6ybqxl777795r0000gn/T/', 'BROWSER': 'open', 'F95': '/Users/pbos/sw/miniconda3/envs/amuse/bin/x86_64-apple-darwin13.4.0-gfortran', 'LESS_TERMCAP_ue': '\x1b[0m', 'LSCOLORS': 'exfxcxdxbxGxDxabagacad', 'OLDPWD': '/Users/pbos/mpi_spawn_test', 'HOST': 'x86_64-apple-darwin13.4.0', 'CONDA_BACKUP_HOST': 'ESLT0072', '__CF_USER_TEXT_ENCODING': '0x0:0:5', 'PWD': '/Users/pbos/src/git/amuse', 'DEBUG_FFLAGS': '-march=nocona -mtune=core2 -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -march=nocona -mtune=core2 -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -Og -g -Wall -Wextra -fcheck=all -fbacktrace -fimplicit-none -fvar-tracking-assignments', 'LESS_TERMCAP_us': '\x1b[01;32m', 'COLORTERM': 'truecolor', 'LESS_TERMCAP_so': '\x1b[00;47;30m', 'LS_COLORS': 'di=34:ln=35:so=32:pi=33:ex=31:bd=36;01:cd=33;01:su=31;40;07:sg=36;40;07:tw=32;40;07:ow=33;40;07:', 'PAGER': 'less', 'CONDA_EXE': '/Users/pbos/sw/miniconda3/bin/conda', 'DEFAULT_USER': 'pbos'}'
amuse.rfi.channel: DEBUG: waiting for connection from worker
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test2 (test.ext_tests.test_sph_to_grid.TestSPH2Grid)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_sph_to_grid.py", line 71, in test2
    sph_code = self.setup_sph_code(Fi, number_of_particles, L, rho, u)
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_sph_to_grid.py", line 16, in setup_sph_code
    sph_code = sph_code(converter, mode = 'periodic')#, redirection = 'none')
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 1679, in __init__
    legacy_interface = FiInterface(mode = mode, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 53, in __init__
    CodeInterface.__init__(self, name_of_the_worker = self.name_of_the_worker(mode), **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 711, in __init__
    self._start(name_of_the_worker = name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 730, in _start
    self.channel = self.channel_factory(name_of_the_worker, type(self), interpreter_executable = interpreter_executable, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1364, in __init__
    self.full_name_of_the_worker = self.get_full_name_of_the_worker(legacy_interface_type)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1167, in get_full_name_of_the_worker
    raise exceptions.CodeException("The worker application does not exist, it should be at: \n{0}".format('\n'.join(tried_workers)))
CodeException: The worker application does not exist, it should be at:
/Users/pbos/src/git/amuse/src/amuse/community/fi/fi_worker_periodic
/Users/pbos/src/git/amuse/src/amuse/rfi/fi_worker_periodic
-------------------- >> begin captured stdout << ---------------------
Testing convert_SPH_to_grid with Fi

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test4 (test.ext_tests.test_sph_to_grid.TestSPH2Grid)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_sph_to_grid.py", line 121, in test4
    sph_code = self.setup_sph_code(Fi, number_of_particles, L, rho, u)
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_sph_to_grid.py", line 16, in setup_sph_code
    sph_code = sph_code(converter, mode = 'periodic')#, redirection = 'none')
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 1679, in __init__
    legacy_interface = FiInterface(mode = mode, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 53, in __init__
    CodeInterface.__init__(self, name_of_the_worker = self.name_of_the_worker(mode), **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 711, in __init__
    self._start(name_of_the_worker = name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 730, in _start
    self.channel = self.channel_factory(name_of_the_worker, type(self), interpreter_executable = interpreter_executable, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1364, in __init__
    self.full_name_of_the_worker = self.get_full_name_of_the_worker(legacy_interface_type)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1167, in get_full_name_of_the_worker
    raise exceptions.CodeException("The worker application does not exist, it should be at: \n{0}".format('\n'.join(tried_workers)))
CodeException: The worker application does not exist, it should be at:
/Users/pbos/src/git/amuse/src/amuse/community/fi/fi_worker_periodic
/Users/pbos/src/git/amuse/src/amuse/rfi/fi_worker_periodic
-------------------- >> begin captured stdout << ---------------------
Testing convert_SPH_to_grid with Fi and do_scale

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test10 (test.ext_tests.test_spherical_model.TestUniformSphericalDistribution)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/pbos/src/git/amuse/test/ext_tests/test_spherical_model.py", line 118, in test10
    type="glass", target_rms=0.3)
  File "/Users/pbos/src/git/amuse/src/amuse/ext/spherical_model.py", line 278, in new_uniform_spherical_particle_distribution
    x, y, z = UniformSphericalDistribution(number_of_particles, **keyword_arguments).result
  File "/Users/pbos/src/git/amuse/src/amuse/ext/spherical_model.py", line 246, in result
    return getattr(self, self.type)()
  File "/Users/pbos/src/git/amuse/src/amuse/ext/spherical_model.py", line 191, in glass
    sph = Fi(mode = 'periodic', redirection = 'none')
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 1679, in __init__
    legacy_interface = FiInterface(mode = mode, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/community/fi/interface.py", line 53, in __init__
    CodeInterface.__init__(self, name_of_the_worker = self.name_of_the_worker(mode), **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 711, in __init__
    self._start(name_of_the_worker = name_of_the_worker, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/core.py", line 730, in _start
    self.channel = self.channel_factory(name_of_the_worker, type(self), interpreter_executable = interpreter_executable, **options)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1364, in __init__
    self.full_name_of_the_worker = self.get_full_name_of_the_worker(legacy_interface_type)
  File "/Users/pbos/src/git/amuse/src/amuse/rfi/channel.py", line 1167, in get_full_name_of_the_worker
    raise exceptions.CodeException("The worker application does not exist, it should be at: \n{0}".format('\n'.join(tried_workers)))
CodeException: The worker application does not exist, it should be at:
/Users/pbos/src/git/amuse/src/amuse/community/fi/fi_worker_periodic
/Users/pbos/src/git/amuse/src/amuse/rfi/fi_worker_periodic
-------------------- >> begin captured stdout << ---------------------
Test new_uniform_spherical_particle_distribution, glass

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
Ran 270 tests in 206.600s

FAILED (SKIP=18, errors=7)

Not bad, but not perfect either. There are again some @rpath issues here. I discussed this with @ipelupessy; we'll fix these dynamically built test workers by adding LDFLAGS to the from-Python compilation scripting as well.

egpbos commented 6 years ago

Btw, I think this should be done in a Travis build to keep things as clean as possible; otherwise it's nearly impossible to verify that no local toolchains are interfering with the conda toolchain. Given the long test times, we should then run either some representative selection or perhaps a random selection of tests. Alternatively, we could do just a build test plus a separate (longer) actual test run.

egpbos commented 6 years ago

Regarding the LDFLAGS issues in the ext_tests: #264

nluetzge commented 5 years ago

Hey, I guess this is the right thread for my problems. I'm also using a conda environment, but since I couldn't get it to work with the conda packages (even the ones from conda-forge), I decided to go with the MacPorts packages while staying in the conda env for the Python side... urgh. It installs 45/47 codes without error messages, but when I try to run the code I get this error:

----> 1 ph4()

/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/community/ph4/interface.pyc in __init__(self, convert_nbody, **keyword_arguments)
    467 
    468     def __init__(self, convert_nbody = None, **keyword_arguments):
--> 469         legacy_interface = ph4Interface(**keyword_arguments)
    470 
    471         self.stopping_conditions = StoppingConditions(self)

/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/community/ph4/interface.pyc in __init__(self, mode, **options)
     31             self,
     32             name_of_the_worker=self.name_of_the_muse_worker(mode),
---> 33             **options
     34         )
     35 

/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/rfi/core.pyc in __init__(self, name_of_the_worker, **options)
    750 
    751         if self.must_start_worker:
--> 752             self._start(name_of_the_worker = name_of_the_worker, **options)
    753 
    754     def __del__(self):

/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/rfi/core.pyc in _start(self, name_of_the_worker, interpreter_executable, **options)
    778         self.channel.initialize_mpi = self.initialize_mpi
    779 
--> 780         self.channel.start()
    781 
    782         # must register stop interfaces after channel start

/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/rfi/channel.pyc in start(self)
   1807 
   1808         logger.debug("starting process with command `%s`, arguments `%s` and environment '%s'", command, arguments, os.environ)
-> 1809         self.process = Popen(arguments, executable=command, stdin=PIPE, stdout=None, stderr=None, close_fds=True)
   1810         logger.debug("waiting for connection from worker")
   1811 

/anaconda3/envs/amuse/lib/python2.7/subprocess.pyc in __init__(self, args, bufsize, executable, stdin, stdout, stderr, preexec_fn, close_fds, shell, cwd, env, universal_newlines, startupinfo, creationflags)
    392                                 p2cread, p2cwrite,
    393                                 c2pread, c2pwrite,
--> 394                                 errread, errwrite)
    395         except Exception:
    396             # Preserve original exception in case os.close raises.

/anaconda3/envs/amuse/lib/python2.7/subprocess.pyc in _execute_child(self, args, executable, preexec_fn, close_fds, cwd, env, universal_newlines, startupinfo, creationflags, shell, to_close, p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite)
   1045                         raise
   1046                 child_exception = pickle.loads(data)
-> 1047                 raise child_exception
   1048 
   1049 

OSError: [Errno 2] No such file or directory

I looked up which file it was looking for, and it seems it is trying to run /anaconda3/envs/amuse/bin/mpiexec, which does not exist, sigh. It should look in the MacPorts directory for this, but I don't know where I tell AMUSE to use that one. It obviously used those directories for the installation. Even weirder, which mpiexec yields /opt/local/bin/mpiexec, so the right directory. Where is AMUSE getting its information from? All I could find is that it is an attribute of channel.SocketChannel, but that is also where my Python knowledge ends...
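As a generic first check (plain standard-library Python, nothing AMUSE-specific), you can see which mpiexec a PATH lookup from inside Python resolves to; if this disagrees with `which mpiexec` in your shell, the Python process sees a different PATH:

```python
import shutil

# shutil.which searches the directories on $PATH, just like the shell's
# `which`. Inside a conda env the env's bin directory usually comes first,
# which is how a conda-installed mpiexec can shadow the MacPorts one.
print(shutil.which("mpiexec"))  # e.g. /opt/local/bin/mpiexec, or None
```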

ipelupessy commented 5 years ago

Did you rerun configure? You can make sure by specifying MPIEXEC when running configure, but you can also change config.mk and config.py by hand now. I see there is a slight oddity with the config: the MPICC etc. in the config files don't have the full path, whereas MPIEXEC points to a full path...

nluetzge commented 5 years ago

Actually I did not run configure, but I did edit the config.mk by hand and then just ran python setup.py install. I know you are not supposed to do this, but I find it much easier than having to guess how to call configure with the right inputs. So config.mk has the right path:

config.mk.txt

I also checked that the config.mk in the anaconda3/envs/amuse/lib/python2.7/site-packages/amuse has the right paths and it does.

ipelupessy commented 5 years ago

did you also change the config.py? (in src/amuse)

nluetzge commented 5 years ago

Urgh, I just found it... will try again now. To do it correctly I would have to set all of this in the ./configure command, right? I'm just always having a hard time getting the syntax right for this.

nluetzge commented 5 years ago

It would be nice to have a config file rather than the command :)

ipelupessy commented 5 years ago

well, you should not have to change anything in the configure script itself (if that's what you mean); there are options you can set (see ./configure --help)

ipelupessy commented 5 years ago

(the configure script is auto-generated and does some detection and checks)

nluetzge commented 5 years ago

I didn't change anything in the configure script. What I mean is: in my case I want to make sure that it is using the right packages. Conda sometimes installs MPI for some of its packages, and I don't want AMUSE to use that MPI but the one from MacPorts. However, if I'm in the conda environment it will by design take the conda packages if I don't specify otherwise. And I guess the way to do it would be to run ./configure MPI=use_my_macports_mpi or something like that. But I would much rather have a text file where I can see which flags/paths I have to change, so that I'm sure it is using all the right packages. With the configure method I might miss some... so that's the reason why I started editing the config.mk directly.

nluetzge commented 5 years ago

Arrgh it still doesn't work :(

config.py.txt

ipelupessy commented 5 years ago

is there an amuserc somewhere?

nluetzge commented 5 years ago

There is. The one in my local folder is empty. The one in the site-packages folder says this:

[channel]
must_check_if_worker_is_up_to_date=0
use_python_interpreter=1
[data]
output_data_root_directory=_amuse_output_data

ipelupessy commented 5 years ago

hmm, that does not explain it (mpiexec can be overridden in the amuserc, but it isn't here). You can try adding it to the channel section, but first, can you try:

from amuse import config
print(config.mpi.mpiexec)

??

nluetzge commented 5 years ago

This gives me the correct path:

In [5]: print(config.mpi.mpiexec)
/opt/local/mpiexec

However, when I go into the debugger for the error I'm getting loading ph4 or any other code I get this:

-> self.process = Popen(arguments, executable=command, stdin=PIPE, stdout=None, stderr=None, close_fds=True)
(Pdb) arguments
['/anaconda3/envs/amuse/bin/mpiexec', '-n', '1', '/anaconda3/envs/amuse/bin/python', '/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/rfi/run_command_redirected.pyc', '/dev/null', '/dev/null', '/anaconda3/envs/amuse/lib/python2.7/site-packages/amuse/_workers/ph4_worker', '56047', 'Linus.local', 'true']
(Pdb) self.mpiexec
'/anaconda3/envs/amuse/bin/mpiexec'

ipelupessy commented 5 years ago

btw, if you are using the latest amuse, and an installed amuse, there is:

config.mk in /share/amuse and in .../site-packages/amuse
config.py in ../site-packages/amuse

nluetzge commented 5 years ago

OK found that but looks ok to me too.

ipelupessy commented 5 years ago

hmm, it is also strange that it is using the socket channel... can you run the test with

import logging
logging.basicConfig(level=logging.DEBUG)

somewhere at the top. Did you try adding

mpiexec=/opt/local/mpiexec

to the amuserc (maybe make a copy to your working dir - it should find it there too)

can you check that it is using the correct config (by printing config.file)??

it's either using config.mpi.mpiexec or something that comes from an amuserc file...
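For completeness, the override suggested above would look roughly like this in an amuserc (a sketch; the path is the MacPorts location from earlier in the thread, adjust to your install):

```ini
[channel]
# force AMUSE's channel to launch workers with the MacPorts mpiexec
mpiexec=/opt/local/bin/mpiexec
```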

ipelupessy commented 5 years ago

also, from amuserc.example:

This is an example configuration file for AMUSE. AMUSE will look for the amuserc file in the following locations:

1) amuse python path (src/amuse, used for binary installs)
2) amuse root directory (set by the AMUSE_DIR environment variable)
3) your home directory (under the name .amuserc, so a hidden file)
4) a file set in the AMUSERC environment variable
5) the current working directory

AMUSE will combine the files it finds in these paths. The file in the path with the higher number will overwrite the values in the file(s) in the path with the lower number. For example, the values in the file in the current working directory will override the values in the file in the amuse root directory. AMUSE will use the 'amuserc' filename, except for the home directory, where it will look for .amuserc (${HOME}/.amuserc).

so... do you have an old amuserc somewhere??
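The "higher number wins" merging described above is the usual later-file-overrides-earlier pattern; here is a small standard-library sketch (with made-up file contents, not AMUSE's actual config code) showing the same behaviour:

```python
import configparser
import os
import tempfile

# Emulate two amuserc-style files in AMUSE's search order: a "root" file
# read first and a "working directory" file read last.
root = tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False)
root.write("[channel]\nmpiexec = /usr/bin/mpiexec\n")
root.close()
local = tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False)
local.write("[channel]\nmpiexec = /opt/local/bin/mpiexec\n")
local.close()

cfg = configparser.ConfigParser()
# Files later in the list override values from earlier ones.
cfg.read([root.name, local.name])
print(cfg["channel"]["mpiexec"])  # -> /opt/local/bin/mpiexec

os.unlink(root.name)
os.unlink(local.name)
```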

nluetzge commented 5 years ago

Ok, found it. I think somewhere along the way I even managed to mess up the MacPorts path, and my config.py had /opt/local/mpiexec instead of /opt/local/bin/mpiexec. It's running now, yay! Thanks Inti!

So, in order to do this right the next time: what is the best way? It can't be right that I'm editing config.mk and config.py by hand...

ipelupessy commented 5 years ago

ideally you should have your environment set up so that configure finds the correct mpi, and then a plain ./configure should do; otherwise ./configure MPIEXEC=/opt/local/bin/mpiexec should do it...
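A sketch of the second approach, pinning the MacPorts toolchain explicitly (the paths assume a default MacPorts install; apart from MPIEXEC, the variable names here are assumptions based on the MPICC entries mentioned earlier in the thread, so check ./configure --help for the exact ones):

```shell
# Hypothetical: point configure at the MacPorts MPI wrappers explicitly,
# so the conda environment's compilers cannot shadow them.
./configure MPIEXEC=/opt/local/bin/mpiexec \
            MPICC=/opt/local/bin/mpicc \
            MPICXX=/opt/local/bin/mpicxx
```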

ipelupessy commented 5 years ago

@nluetzge I have changed config.py so that there is a single configuration file (config.mk), so you can change this by hand more easily...