Semantic Segmentation for Aerial / Satellite Images with Convolutional Neural Networks, including an unofficial implementation of Volodymyr Mnih's methods
I'm not having any luck running the build.sh script in utils. I built an Ubuntu VM from scratch and followed the install instructions in README.md. Anaconda3, Boost, and Boost.Numpy installed successfully. I've tried a number of potential fixes, but keep getting the following error with boost/boost.numpy when running build.sh:
-- Build files have been written to: /home/usr/ssai-cnn-master/utils
Scanning dependencies of target patches
[ 16%] Building CXX object CMakeFiles/patches.dir/src/devide_to_patches.cpp.o
In file included from /usr/local/include/boost/python/detail/prefix.hpp:13:0,
from /usr/local/include/boost/python/args.hpp:8,
from /usr/local/include/boost/python.hpp:11,
from /home/usr/ssai-cnn-master/utils/src/devide_to_patches.cpp:4:
/usr/local/include/boost/python/detail/wrap_python.hpp:50:23: fatal error: pyconfig.h: No such file or directory
compilation terminated.
Any idea how to point the program to the pyconfig.h file?
Thanks.
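For reference, this error typically means the Boost.Python headers are being compiled against a Python installation whose `pyconfig.h` is not on the compiler's include path (common when Boost was built against the system Python but Anaconda3 is active). A minimal sketch of one way to locate the active interpreter's `pyconfig.h` and expose it to the build — the `CPLUS_INCLUDE_PATH` approach is an assumption on my part, not something this repo's build.sh does:

```shell
# Ask the active Python (here, presumably Anaconda3) where its own
# C headers live; pyconfig.h sits in this directory.
PY_INC=$(python3 -c "import sysconfig; print(sysconfig.get_paths()['include'])")
echo "pyconfig.h should be in: $PY_INC"

# Hypothetical fix: put that directory on the C++ include path before
# re-running build.sh, so wrap_python.hpp can find pyconfig.h.
export CPLUS_INCLUDE_PATH="$PY_INC${CPLUS_INCLUDE_PATH:+:$CPLUS_INCLUDE_PATH}"
```

Alternatively, if build.sh drives CMake, the include directory could be passed explicitly (e.g. via a `-DPYTHON_INCLUDE_DIR=$PY_INC` cache entry, assuming the project's CMakeLists consults that variable).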