NanoComp / meep

free finite-difference time-domain (FDTD) software for electromagnetic simulations
GNU General Public License v2.0

improved Apple silicon support #1853

stevengj opened this issue 2 years ago

stevengj commented 2 years ago

I now have an Apple M1 laptop, and I've verified that Meep master and all of its dependencies build from source. I've also updated the macOS compilation instructions in the documentation. Some to-do's:

The C++ tests pass for me, but I'm currently getting some Python test failures that I'm tracking down.

In general, it's nice to have code running on multiple architectures, because it helps expose bugs.

stevengj commented 2 years ago

Most of the Python tests are passing.

Several tests are failing simply because I don't have h5py installed — pip3 install h5py is failing, but presumably I should be able to get it to work if I can tell pip where to find the homebrew hdf5 library. Update: HDF5_DIR="$(brew --prefix hdf5)" pip3 install h5py did the trick.
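For anyone hitting the same thing: h5py's build reads the HDF5_DIR environment variable to locate the HDF5 headers and libraries, which is why the one-shot prefix form works. A small stdlib-only sketch of that mechanism (the path below is illustrative, not necessarily your homebrew prefix):

```python
import os
import subprocess
import sys

# h5py's build consults HDF5_DIR to find the HDF5 install; setting it for a
# single command, as in `HDF5_DIR="$(brew --prefix hdf5)" pip3 install h5py`,
# is equivalent to passing a one-off environment to the child process:
env = dict(os.environ, HDF5_DIR="/opt/homebrew/opt/hdf5")  # illustrative path
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['HDF5_DIR'])"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())
```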

The only really concerning case is that tests/test_source is segfaulting.

stevengj commented 2 years ago

The JAX test is failing:

FAIL: tests/test_adjoint_jax
============================

/Users/stevenj/Library/Python/3.8/lib/python/site-packages/jax/_src/lib/__init__.py:32: UserWarning: JAX on Mac ARM machines is experimental and minimally tested. Please see https://github.com/google/jax/issues/5501 in the event of problems.
  warnings.warn("JAX on Mac ARM machines is experimental and minimally tested. "
Traceback (most recent call last):
  File "/Users/stevenj/Library/Python/3.8/lib/python/site-packages/jax/_src/lib/__init__.py", line 37, in <module>
    import jaxlib as jaxlib
ModuleNotFoundError: No module named 'jaxlib'

(I did pip3 install jax)

pip3 install jaxlib says "Could not find a version that satisfies the requirement jaxlib", so I guess it's not available for ARM yet?

stevengj commented 2 years ago

The MPB test is failing, possibly just due to the tolerances being too low:

FAIL: test_compute_field_energy (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./tests/test_mpb.py", line 439, in test_compute_field_energy
    self.assertTrue(expected_fp.close(field_pt))
AssertionError: False is not true

======================================================================
FAIL: test_hole_slab (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./tests/test_mpb.py", line 755, in test_hole_slab
    self.compare_h5_files(ref_path, res_path)
  File "./tests/test_mpb.py", line 242, in compare_h5_files
    compare_arrays(self, ref[k][()], res[k][()], tol=tol)
  File "/Users/stevenj/Documents/Code/meep/python/tests/utils.py", line 15, in compare_arrays
    test_instance.assertLess(diff, tol)
AssertionError: 0.0010791135682827247 not less than 0.001

======================================================================
FAIL: test_output_efield_z (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./tests/test_mpb.py", line 486, in test_output_efield_z
    self.compare_h5_files(ref_path, res_path)
  File "./tests/test_mpb.py", line 242, in compare_h5_files
    compare_arrays(self, ref[k][()], res[k][()], tol=tol)
  File "/Users/stevenj/Documents/Code/meep/python/tests/utils.py", line 15, in compare_arrays
    test_instance.assertLess(diff, tol)
AssertionError: 0.0022693433759486347 not less than 0.001

======================================================================
FAIL: test_run_te_no_geometry (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./tests/test_mpb.py", line 295, in test_run_te_no_geometry
    self.check_band_range_data(expected_brd, ms.band_range_data)
  File "./tests/test_mpb.py", line 201, in check_band_range_data
    self.assertAlmostEqual(exp[0][0], res[0][0], places=places)
AssertionError: 0.5000000000350678 != 0.509901952544687 within 3 places (0.009901952509619116 difference)

----------------------------------------------------------------------
Ran 46 tests in 101.831s

FAILED (failures=4)
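For context, these MPB failures are tolerance comparisons on arrays. A minimal sketch of the kind of relative-difference check involved (a hypothetical helper, not the actual code in python/tests/utils.py):

```python
import math

# Hypothetical relative-difference check in the spirit of compare_arrays:
# the relative L2 distance between reference and result must stay below tol.
def rel_diff(ref, res):
    num = math.sqrt(sum((a - b) ** 2 for a, b in zip(ref, res)))
    den = math.sqrt(sum(a * a for a in ref))
    return num / den if den else num

assert rel_diff([1.0, 2.0], [1.0, 2.0]) == 0.0
print(rel_diff([0.5000000000350678], [0.509901952544687]))  # ~0.0198, well above 0.001
```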
ahoenselaar commented 2 years ago

See #1785 for a potentially similar MPB failure.

oradwastaken commented 2 years ago

I just got my own Apple Silicon laptop. I'm used to installing meep using conda and conda-forge. How do I go about it now?

stevengj commented 2 years ago

You currently have to compile from source.

oradwastaken commented 2 years ago

I've never compiled from source but I think I did it properly. When I run "meep" in terminal I get a command line.

However, my python install in my conda environment doesn't seem to be aware of the meep package. Any advice on what step could have gone wrong?

I get: ModuleNotFoundError: No module named 'meep' or ModuleNotFoundError: No module named 'mpb'

oradwastaken commented 2 years ago

I tried pip install-ing meep and mpb from the meep directory. Now I'm able to find the modules, but they're effectively empty, e.g.:

AttributeError: module 'meep' has no attribute 'Medium'

oradwastaken commented 2 years ago

I figured it out. pip install-ing is the wrong approach, as explained here. You need to add meep to your PYTHONPATH.

For me, I added the following to .zshrc:

export PYTHONPATH=$PYTHONPATH:/usr/local/lib/python3.9/site-packages
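PYTHONPATH entries are prepended to sys.path when the interpreter starts, which is why exporting it in .zshrc makes the import resolve. A quick stdlib check (the directory is the one exported above; adjust to your Python version):

```python
import os
import sys

def on_path(directory):
    """True if `directory` is visible via sys.path or the current PYTHONPATH."""
    entries = os.environ.get("PYTHONPATH", "").split(os.pathsep)
    return directory in sys.path or directory in entries

# The directory exported above (adjust to your Python version):
print(on_path("/usr/local/lib/python3.9/site-packages"))
```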

oradwastaken commented 2 years ago

This doesn't entirely solve the problem. I'm moving these questions to a discussion:

#1988

oradwastaken commented 2 years ago

@stevengj Is there any progress on a conda-forge build for Apple silicon? Currently I've gotten meep to work by symlinking the installation built from source into my conda environment... but this feels very precarious, and I'm worried any changes to my environment will affect my meep installation.

In fact I came here looking for an update precisely because this is what recently went wrong for me.

stevengj commented 2 years ago

I don't know if anyone is working on updating the conda build right now…

You shouldn't have to symlink anything. Just do make install into some directory of your choice (not in the conda directory) and then add this to your PYTHONPATH.

lucasgrjn commented 2 years ago

Hi !

I rented an Apple Silicon M1 to play with Meep and do some test comparisons. I was able to install Meep in both the simple and MPI versions. (Thanks @oradwastaken for #1988!) I put the script here if needed.

Here is my result for the MPB test. What surprises me the most is the difference for test_diamond.

======================================================================
FAIL: test_diamond (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 728, in test_diamond
    self.compare_h5_files(ref_path, res_path)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 240, in compare_h5_files
    compare_arrays(self, ref[k][()], res[k][()], tol=tol)
  File "/Users/m1/install_meep/meep-1.23.0/python/tests/utils.py", line 15, in compare_arrays
    test_instance.assertLess(diff, tol)
AssertionError: 0.6768967148310044 not less than 0.001

======================================================================
FAIL: test_epsilon_input_file (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 1248, in test_epsilon_input_file
    self.check_gap_list(expected_gap_list, ms.gap_list)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 216, in check_gap_list
    self.check_freqs(expected_gap_list, result)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 213, in check_freqs
    self.assertAlmostEqual(r, e, places=3)
AssertionError: 1.4651880980146579 != 1.469257717113039 within 3 places (0.004069619098381105 difference)

======================================================================
FAIL: test_hole_slab (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 756, in test_hole_slab
    self.compare_h5_files(ref_path, res_path)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 238, in compare_h5_files
    self.assertEqual(ref[k][()], res[k][()])
AssertionError: b'h field, kpoint 1, band 9, freq=0.592619' != b'h field, kpoint 1, band 9, freq=0.59253'

======================================================================
FAIL: test_output_efield_z (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 485, in test_output_efield_z
    self.compare_h5_files(ref_path, res_path)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 240, in compare_h5_files
    compare_arrays(self, ref[k][()], res[k][()], tol=tol)
  File "/Users/m1/install_meep/meep-1.23.0/python/tests/utils.py", line 15, in compare_arrays
    test_instance.assertLess(diff, tol)
AssertionError: 0.0022693433759486347 not less than 0.001

======================================================================
FAIL: test_run_te_no_geometry (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 293, in test_run_te_no_geometry
    self.check_band_range_data(expected_brd, ms.band_range_data)
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 199, in check_band_range_data
    self.assertAlmostEqual(exp[0][0], res[0][0], places=places)
AssertionError: 0.5000000000350678 != 0.509901952544687 within 3 places (0.009901952509619116 difference)

======================================================================
FAIL: test_strip (__main__.TestModeSolver)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/m1/install_meep/meep-1.23.0/python/./tests/test_mpb.py", line 894, in test_strip
    self.assertAlmostEqual(e, r, places=3)
AssertionError: -0.9945407488966614 != -0.9950905165812965 within 3 places (0.0005497676846351052 difference)

----------------------------------------------------------------------
Ran 47 tests in 90.214s

FAILED (failures=6)
oradwastaken commented 2 years ago

Good morning Lucas @Dj1312! I'm glad to hear someone else has been looking into this.

I ended up doing my installation using my own script that is very similar to yours. I discovered you can make your life a tiny bit easier by exporting these flags once at the top instead of including them in every ./configure invocation:

export CC=mpicc
export CPPFLAGS="-I$(brew --prefix)/include"
export LDFLAGS="-L$(brew --prefix)/lib"

One thing I tried to do is offload as many of the dependencies as possible from brew and pip to conda. So instead of

brew install hdf5 guile fftw gsl libpng autoconf automake libtool swig
HDF5_DIR="$(brew --prefix hdf5)" pip3 install numpy matplotlib scipy autograd jax parameterized h5py jaxlib mpi4py

I tried:

brew install guile
export HDF5_DIR="$(brew --prefix hdf5)"
conda install -c conda-forge hdf5 fftw gsl libpng autoconf automake libtool swig numpy matplotlib scipy autograd jax parameterized h5py jaxlib mpi4py

for a simple installation, and for MPI:

brew install guile
export HDF5_DIR="$(brew --prefix hdf5)"
conda install -c conda-forge "hdf5=*=mpi_mpich_*" "fftw=*=mpi_mpich_*" gsl libpng autoconf automake libtool swig numpy matplotlib scipy autograd jax parameterized h5py jaxlib mpi4py

With those installed by conda, I then need to also add the conda prefix to the meep installation, so:

export CC=mpicc
export CPPFLAGS="-I$CONDA_PREFIX/include -I$(brew --prefix)/include"
export LDFLAGS="-L$CONDA_PREFIX/lib -L$(brew --prefix)/lib"

This... doesn't seem to work, and I'm not sure why. I was never able to get the build to succeed after replacing the homebrew packages with conda ones; something kept failing during compilation. I tried various things, like conda-installing mpicc or cpp-compiler, but it never solved the problem.

I was able to move all of the pip dependencies to conda, but then the MPI features don't work in meep. I'd appreciate any insight you might have on this!

oradwastaken commented 2 years ago

Another realization: I never used configure with --enable-shared. So perhaps that's my issue? Maybe. I'll look into that myself.

oradwastaken commented 2 years ago

I tried it out. Even when using --enable-shared everywhere, I still get configure: error: could not find mpi library for --with-mpi

a-coffee commented 1 year ago

Hi all, I've tried to follow the build-from-source instructions on an M1 Mac, particularly @Dj1312's script, but I'm getting odd errors with Guile. Despite having Guile v3.0.9 installed from homebrew, libctl, MPB, and Meep all fail configuration with error: ("/opt/homebrew/opt/pkg-config/bin/pkg-config" "--libs" "guile-3.0") exited with non-zero error code 127 (and similarly for --cflags).

I eventually installed without Guile, but the Meep configure required the additional flag --without-scheme, which is not mentioned in the readthedocs. All tests passed except for test_get_epsilon_grid.py and test_mpb.py.

mcrobbins commented 1 year ago

Hi everyone, I'm getting the same configuration error as @a-coffee with Guile (Guile-config is more specifically where the error is triggered). Has there been any progress figuring that out? Thanks!

stevengj commented 1 year ago

Unless you need the Scheme interface, I would just configure libctl --without-guile, configure MPB --without-libctl, and configure Meep --without-scheme (installed in that order!). (This should probably be in the docs.)
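Spelled out as a build sequence (illustrative only; the prefix and any other flags, such as MPI options, are whatever your setup needs):

```shell
# Illustrative configure order, not the exact commands from the docs:
# libctl without Guile, then MPB without libctl, then Meep without Scheme.
cd libctl && ./configure --without-guile  --prefix="$PREFIX" && make && make install && cd ..
cd mpb    && ./configure --without-libctl --prefix="$PREFIX" && make && make install && cd ..
cd meep   && ./configure --without-scheme --prefix="$PREFIX" && make && make install && cd ..
```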

mcrobbins commented 1 year ago

Thanks for the help, I ran things in this manner, but I'm running into a couple of issues (I'm still learning Python, btw, so I may be making stupid errors). When running make check, I pass all of the "Making check in tests" stages, but error out after that with: meep-python.cxx:3023:10: fatal error: 'numpy/arrayobject.h' file not found

Additionally, if I try to import meep in python I get the same error I got when trying to (mistakenly) install with mamba forge:

File "/Users/matthewrobbins/miniforge3/envs/env_meep2/lib/python3.9/site-packages/meep/__init__.py", line 13, in <module>
    from . import _meep
ImportError: dlopen(/Users/matthewrobbins/miniforge3/envs/env_meep2/lib/python3.9/site-packages/meep/_meep.so, 0x0002): tried: '/usr/local/lib/_meep.so' (no such file), '/opt/homebrew/lib/_meep.so' (no such file), '/usr/local/lib/_meep.so' (no such file), '/opt/homebrew/lib/_meep.so' (no such file), '/_meep.so' (no such file), '/Users/matthewrobbins/miniforge3/envs/env_meep2/lib/python3.9/site-packages/meep/_meep.so' (mach-o file, but is an incompatible architecture (have (arm64), need (x86_64))), '/usr/local/lib/_meep.28.so' (no such file), '/opt/homebrew/lib/_meep.28.so' (no such file), '/usr/local/lib/_meep.28.so' (no such file), '/opt/homebrew/lib/_meep.28.so' (no such file), '/_meep.28.so' (no such file), '/usr/local/lib/python3.9/site-packages/meep/_meep.28.so' (mach-o file, but is an incompatible architecture (have (arm64), need (x86_64)))

Any thoughts or ideas here? If I can clarify the question at all please let me know, still getting used to github as well.
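On the "incompatible architecture (have (arm64), need (x86_64))" part of that dlopen error: it usually means the Python interpreter itself is an Intel build running under Rosetta, while _meep.so was compiled natively. A quick way to check which architecture your interpreter actually is:

```python
import platform

# 'arm64' means a native Apple-silicon interpreter; 'x86_64' means an
# Intel/Rosetta build, which cannot dlopen an arm64 _meep.so.
print(platform.machine())
```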

lucasgrjn commented 1 year ago

Sorry guys for the lack of support. I don't actually have a Silicon Mac. I use a virtual instance. Tomorrow I'll rent one and do some testing to see if I can help!

mcrobbins commented 1 year ago

I appreciate the help! FYI, I'm now at the point where the make check passes all of the C++ tests but fails all of the Python tests. But still getting the same error I mentioned previously when importing in Python.

marcus-o commented 10 months ago

Hi all, in case someone is struggling to install Meep on M1: I got a working version of parallel Meep on an M1 MacBook using brew and conda and the following commands. It uses the current directory to build and install everything, so it does not require sudo. I need to use Python 3.11 due to this issue: https://github.com/cython/cython/issues/5238.

It fails these tests:

- symmetry, due to a numeric error (Testing nonlinear in 3D... real part = -4.32937 differs by 0.00400953 from -4.33338)
- python adjoint test_periodic_design, due to a numeric error (AssertionError: 5.015462736938494e-16 != 0 within 15 places (5.015462736938494e-16 difference))
- python test_chunk_layout, due to a numeric error (7.999999999999998 != 8)
- python test_dump_load, for a reason I do not understand (meep: inconsistent data size for sigma_cd in structure::load)
- python test_mpb, due to a numeric error (0.5000000000350678 != 0.5099019525446858 within 3 places (0.009901952509618006 difference))
- python test_multilevel_atom, due to a large error (-0.007963856763454085 != -2.7110969214986387 within 7 places (2.7031330647351846 difference))
- python test_ring, due to a numeric error (0.0034951461179838616 != 0.00341267634436 within 4 places (8.246977362386153e-05 difference))
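Most of those are unittest assertAlmostEqual(places=N) checks, which round the difference to N decimal places. A standalone illustration (not the actual test code) of why the test_mpb pair fails at places=3:

```python
import unittest

tc = unittest.TestCase()

# assertAlmostEqual fails when round(a - b, places) is nonzero:
# round(0.5099019525446858 - 0.5000000000350678, 3) == 0.01, so places=3 fails,
# while the same pair passes at places=1.
try:
    tc.assertAlmostEqual(0.5000000000350678, 0.5099019525446858, places=3)
    outcome = "passed at places=3"
except AssertionError:
    outcome = "failed at places=3"
print(outcome)

tc.assertAlmostEqual(0.5000000000350678, 0.5099019525446858, places=1)
print("passed at places=1")
```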

# %%
# clean up (should normally not be necessary)
# conda activate base
# conda env remove --name pmp
# conda clean --all -y
# pip cache purge
# rm -r product
# rm -r install

# in a new folder:
mkdir product
mkdir install
cd install

# prepare an environment that for sure uses open mpi and parallel hdf5
xcode-select --install
brew unlink hdf5
brew unlink mpich
# install prerequisites from homebrew
brew install hdf5-mpi fftw gsl libpng autoconf automake libtool swig wget openblas open-mpi
# prepare conda environment
conda create -n pmp -y
conda activate pmp
# install prerequisites from conda
conda install python=3.11 numpy matplotlib scipy autograd jax parameterized ffmpeg nlopt -y

# include dirs for open mpi
INCLUDEADD=$(mpicc --showme:incdirs)
LINKADD=$(mpicc --showme:link)
# must be an absolute path
CURDIR=$(pwd)
PREFIX=$CURDIR/product/
CPPFLAGS="-O3 -I$PREFIX/include -I$(brew --prefix openblas)/include -I$(brew --prefix)/include -I$INCLUDEADD"
LDFLAGS="-L$PREFIX/lib -L$(brew --prefix openblas)/lib -L$(brew --prefix)/lib $LINKADD"
CC="$(brew --prefix)/bin/mpicc"
CXX="$(brew --prefix)/bin/mpic++"

# install mpi4py from pip
CC=$CC CXX=$CXX CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" python -m pip install --no-cache-dir --no-binary=mpi4py mpi4py

# install h5py with open mpi from github
git clone https://github.com/h5py/h5py.git
cd h5py
HDF5_MPI="ON" HDF5_DIR="$(brew --prefix)" CC=$CC CXX=$CXX CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" pip install --no-cache-dir --no-binary=h5py,mpi4py .
cd ..

# check that mpi4py works and h5py can use openmpi
echo "from mpi4py import MPI" > mpitest.py
echo "print('Hello World (from process %d)' % MPI.COMM_WORLD.Get_rank())" >> mpitest.py
echo "import h5py" >> mpitest.py
echo "rank = MPI.COMM_WORLD.rank" >> mpitest.py
echo "f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=MPI.COMM_WORLD)" >> mpitest.py
echo "dset = f.create_dataset('test', (4,), dtype='i')" >> mpitest.py
echo "dset[rank] = rank" >> mpitest.py
echo "f.close()" >> mpitest.py
mpirun -np 4 python -m mpi4py mpitest.py

# install current version of harminv from github
git clone https://github.com/NanoComp/harminv.git
cd harminv
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --prefix=$PREFIX
make -j 6 && make install
cd ..

# install version of libctl from github that allows the without guile option
wget https://github.com/NanoComp/libctl/releases/download/v4.5.0/libctl-4.5.0.tar.gz
tar -xzf libctl-4.5.0.tar.gz
cd libctl-4.5.0
./configure CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --without-guile --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of mpb from github
git clone https://github.com/NanoComp/mpb.git
cd mpb
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC PYTHON=python --enable-shared --enable-maintainer-mode --without-libctl --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of h5utils from github
git clone https://github.com/NanoComp/h5utils.git
cd h5utils
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC PYTHON=python --enable-maintainer-mode --enable-parallel --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of libgdsii from github
git clone https://github.com/HomerReid/libGDSII.git
cd libGDSII
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of meep from github
git clone https://github.com/NanoComp/meep.git
cd meep
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC CXX=$CXX PYTHON=python --enable-shared --enable-maintainer-mode --with-libctl --without-scheme --with-mpi --prefix=$PREFIX
make -j 6 && make install
make RUNCODE="mpirun -np 6" check
cd python
make RUNCODE="mpirun -np 6" check
cd ..
cd ..
oradwastaken commented 5 months ago

I've recently updated to MacOs Sonoma 14.4.1 and now none of the build scripts are working for me anymore. I'm wondering if anyone else has gone through this and had any success? Thanks

marcus-o commented 5 months ago

I've recently updated to MacOs Sonoma 14.4.1 and now none of the build scripts are working for me anymore. I'm wondering if anyone else has gone through this and had any success? Thanks

To me, the following works on Sonoma 14.4.1 (needed to add the mpifort compiler) and only uses conda packages:

# %%

# some way to get the effects of conda init in a shell script
source ~/.zshrc

# clean up (should normally not be necessary)
conda activate base
conda env remove --name pmp
conda clean --all -y
pip cache purge
rm -r install

# in a new folder:
mkdir install
cd install
mkdir product

# prepare an environment that for sure uses open mpi and parallel hdf5
xcode-select --install
# prepare conda environment
conda create -n pmp -y
conda activate pmp
# install prerequisites from conda
conda install python=3.11 numpy matplotlib scipy autograd jax parameterized ffmpeg nlopt mpi4py "h5py>=2.9=mpi*" openmpi openmpi-mpicc openmpi-mpicxx openmpi-mpifort fftw gsl libpng autoconf automake libtool swig wget coreutils -y

# include dirs for open mpi
INCLUDEADD=$(mpicc --showme:incdirs)
LINKADD=$(mpicc --showme:link)
# must be an absolute path
CURDIR=$(pwd)
PREFIX=$CURDIR/product/
CPPFLAGS="-O3 -I$PREFIX/include -I$CONDA_PREFIX/include -I$INCLUDEADD"
LDFLAGS="-L$PREFIX/lib -L$CONDA_PREFIX/lib $LINKADD"
CC="mpicc"
CXX="mpic++"

# check that mpi4py works and h5py can use openmpi
echo "from mpi4py import MPI" > mpitest.py
echo "print('Hello World (from process %d)' % MPI.COMM_WORLD.Get_rank())" >> mpitest.py
echo "import h5py" >> mpitest.py
echo "rank = MPI.COMM_WORLD.rank" >> mpitest.py
echo "f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=MPI.COMM_WORLD)" >> mpitest.py
echo "dset = f.create_dataset('test', (4,), dtype='i')" >> mpitest.py
echo "dset[rank] = rank" >> mpitest.py
echo "f.close()" >> mpitest.py
mpirun -np 4 python -m mpi4py mpitest.py

# install current version of harminv from github
git clone https://github.com/NanoComp/harminv.git
cd harminv
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --prefix=$PREFIX
make -j 6 && make install
cd ..

# install version of libctl from github that allows the without guile option
wget https://github.com/NanoComp/libctl/releases/download/v4.5.0/libctl-4.5.0.tar.gz
tar -xzf libctl-4.5.0.tar.gz
cd libctl-4.5.0
wget -O config.sub https://git.savannah.gnu.org/cgit/config.git/plain/config.sub    
wget -O config.guess https://git.savannah.gnu.org/cgit/config.git/plain/config.guess
./configure CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --without-guile --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of mpb from github
git clone https://github.com/NanoComp/mpb.git
cd mpb
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC PYTHON=python --enable-shared --enable-maintainer-mode --without-libctl --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of h5utils from github
git clone https://github.com/NanoComp/h5utils.git
cd h5utils
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC PYTHON=python --enable-maintainer-mode --enable-parallel --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of libgdsii from github
git clone https://github.com/HomerReid/libGDSII.git
cd libGDSII
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" PYTHON=python --enable-shared --enable-maintainer-mode --prefix=$PREFIX
make -j 6 && make install
cd ..

# install current version of meep from github to conda environment
git clone https://github.com/NanoComp/meep.git
cd meep
sh autogen.sh CPPFLAGS="$CPPFLAGS" LDFLAGS="$LDFLAGS" CC=$CC CXX=$CXX PYTHON=python --enable-shared --enable-maintainer-mode --with-libctl --without-scheme --with-mpi --prefix=$CONDA_PREFIX
make -j 6 && make install
cd python
make install
cd ..

# test
make RUNCODE="mpirun -np 6" check
cd python
make install
make RUNCODE="mpirun -np 6" check
cd ..
cd ..
bernwo commented 3 months ago

(quoting the question above and marcus-o's full Sonoma 14.4.1 conda build script from the previous comment)

Tried this script on Sonoma 14.5, and unfortunately it didn't work. The build failed at the Meep step, complaining about sphere_quad and sphere_pt.

Edit: I was able to make it work. The fix is to update the following lines:

CPPFLAGS="-O3 -I$PREFIX/include -I$CONDA_PREFIX/include -I$INCLUDEADD"
LDFLAGS="-L$PREFIX/lib -L$CONDA_PREFIX/lib $LINKADD"

to (assuming you have installed openblas via brew):

CPPFLAGS="-O3 -I$PREFIX/include -I$(brew --prefix openblas)/include -I$(brew --prefix)/include -I$INCLUDEADD"
LDFLAGS="-L$PREFIX/lib -L$(brew --prefix openblas)/lib -L$(brew --prefix)/lib $LINKADD"
akshay-8 commented 3 months ago

(quoting the question above and marcus-o's full Sonoma 14.4.1 conda build script from the earlier comment)

I am using macOS 14.5, but this script is getting stuck at the checks, right after cylindrical:

/Library/Developer/CommandLineTools/usr/bin/make check-TESTS
FAIL: aniso_disp
PASS: bench
PASS: bragg_transmission
PASS: convergence_cyl_waveguide
PASS: cylindrical

(here it gets stuck; I waited at least 30 minutes)

Any help will be appreciated!