apaszke opened this issue 3 years ago
I don't have much experience in this area, but I suppose that this section could be helpful to you.
Thanks for the link! I've already seen it, and I don't think it's the same issue. Though from what I understand, some people on SO did have success with modifying the "C compiler flags" entry in the settings file to always include `-fPIC`. It just feels like a bit of a hack to me, but maybe that's the way to go about it.
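For context, the Stack-level variant of that workaround can be sketched as a `ghc-options` entry in `stack.yaml`. This is a sketch, assuming Stack's per-package `ghc-options` map syntax; note that it only affects packages Stack itself compiles, not GHC's bundled boot libraries:

```yaml
# stack.yaml — a sketch using Stack's ghc-options map.
# "$everything" applies the flag to every package Stack compiles
# (it does NOT rebuild GHC's own boot libraries, e.g. base).
ghc-options:
  "$everything": -fPIC
```

The settings-file hack mentioned above instead edits the `("C compiler flags", ...)` entry in GHC's installed `settings` file, which, as I understand it, changes how GHC invokes the C compiler globally for that GHC installation.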
Hi! I have a Haskell project that I'd like to redistribute as an `.so` library with a C interface, without assuming that Haskell is installed on the user's machine. To get the `.so` I use a `foreign-library` declaration in my Cabal file. As long as I'm OK with dynamically linking all of my dependencies, Stack seems to correctly recognize the library and build it. (One annoyance is that it doesn't install the library by default when running `stack install`, but I've worked around that.)

However, that doesn't give me a self-contained library, because
`ldd` prints a ton of Haskell shared libraries when I query the artifact. To help with that I tried adding `ghc-options: -static -optc-static -optl-static` to my `foreign-library` target, but this results in a huge log of errors from the linker (a selection of messages below; they're all quite similar):

This seems to imply that all the static libraries Stack produces are built without
`-fPIC`, which makes sense for the default use case, but it completely breaks down for me. Is there any way to override this decision and have Stack build both shared and static libraries with `-fPIC` (and have it apply to all GHC libraries, all of my dependencies, and my own project)?
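A minimal `foreign-library` stanza of the kind described above might look like this — a sketch only, where the library, module, and file names are illustrative placeholders, not taken from the actual project:

```cabal
-- Sketch of a foreign-library stanza; all names are placeholders.
foreign-library mylib
  type:                native-shared
  other-modules:       MyLib.FFI
  build-depends:       base >=4.14 && <5
  hs-source-dirs:      src
  -- optional C shim providing the exported C-facing entry points
  c-sources:           csrc/wrapper.c
  default-language:    Haskell2010
```

The C-callable symbols come from `foreign export ccall` declarations in the Haskell modules; the resulting `.so` still links dynamically against the Haskell RTS and package libraries unless static linking succeeds, which is exactly the problem described above.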