Perl5-Alien / Alien-Base

Base classes for Alien:: modules (deprecated, see Alien-Build)

Consumers of modules based on Alien::Base may have trouble loading shared libraries #10

Closed: run4flat closed this issue 12 years ago

run4flat commented 12 years ago

Synopsis: Alien::Base ok, Alien::FFCall ok, Perl-FFI fails: can't load shared object file

On Ubuntu, I can successfully build and install Alien::Base and Alien::FFCall (the latter can be found here: http://github.com/run4flat/Alien-FFCall). I revised perl-FFI based upon the documentation and examples in Alien::Base; you can find that work here: http://github.com/run4flat/perl-FFI.

Although I can build perl-FFI, I am unable to run any tests; the suite bails out with the following error:

$ ./Build test
t/00-load.t .. 1/2 Bailout called.  Further testing stopped:  Unable to load FFI!

#   Failed test 'use FFI;'
#   at t/00-load.t line 8.
#     Tried to use 'FFI'.
#     Error:  Can't load '/home/dcmertens/packages/perl-FFI/blib/arch/auto/FFI/FFI.so' for module FFI: libavcall.so.0: cannot open shared object file: No such file or directory at /home/dcmertens/perl5/perlbrew/perls/perl-5.14.2/lib/5.14.2/i686-linux/DynaLoader.pm line 190.

The weird part of this: /home/dcmertens/packages/perl-FFI/blib/arch/auto/FFI/FFI.so is the location of the module as built, not as installed. Stranger still, the build found the proper Alien-FFCall location (/home/dcmertens/perl5/perlbrew/perls/perl-5.14.2/lib/site_perl/5.14.2/auto/share/dist/Alien-FFCall...) at build time:

./Build 
Building FFI
cc -I/home/dcmertens/perl5/perlbrew/perls/perl-5.14.2/lib/5.14.2/i686-linux/CORE -DXS_VERSION="1.04" -DVERSION="1.04" -fPIC -I/home/dcmertens/perl5/perlbrew/perls/perl-5.14.2/lib/site_perl/5.14.2/auto/share/dist/Alien-FFCall/include -c -fno-strict-aliasing -pipe -fstack-protector -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -O2 -o lib/FFI.o lib/FFI.c
ExtUtils::Mkbootstrap::Mkbootstrap('blib/arch/auto/FFI/FFI.bs')
cc -shared -O2 -L/usr/local/lib -fstack-protector -o blib/arch/auto/FFI/FFI.so lib/FFI.o -L/home/dcmertens/perl5/perlbrew/perls/perl-5.14.2/lib/site_perl/5.14.2/auto/share/dist/Alien-FFCall/lib -lavcall -lcallback

Here I'm using perlbrew, obviously, but I can reproduce this behavior on various Ubuntu platforms with various Perl versions. It might be an error in my use of Alien::Base, but I thought I followed the examples correctly. Is this a bug in Alien::Base, or a bug in my code?

jtpalmer commented 12 years ago

I'm having the same problem. See Alien-Chipmunk and Chipmunk for my code.

My Alien module finds the shared libraries during the tests:

$ ldd t/src/test
    linux-vdso.so.1 =>  (0x00007ffff2994000)
    libchipmunk.so.6.0.3 => /home/jtpalmer/src/personal/perl/Alien-Chipmunk/blib/lib/auto/share/dist/Alien-Chipmunk/lib/libchipmunk.so.6.0.3 (0x00007fe1201d1000)
    libm.so.6 => /lib/libm.so.6 (0x00007fe11ff46000)
    libc.so.6 => /lib/libc.so.6 (0x00007fe11fbc2000)
    /lib64/ld-linux-x86-64.so.2 (0x00007fe1203f6000)

but not in the module using the Alien module:

$ ldd blib/arch/auto/Chipmunk/Chipmunk.so 
    linux-vdso.so.1 =>  (0x00007fff9e5ff000)
    libchipmunk.so.6.0.3 => not found
    libm.so.6 => /lib/libm.so.6 (0x00007f3a58935000)
    libc.so.6 => /lib/libc.so.6 (0x00007f3a585b1000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f3a58dc5000)

Similar results with @run4flat's perl-FFI:

$ ldd blib/arch/auto/FFI/FFI.so 
    linux-vdso.so.1 =>  (0x00007fff4ae65000)
    libavcall.so.0 => not found
    libcallback.so.0 => not found
    libc.so.6 => /lib/libc.so.6 (0x00007f0030a93000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f0031028000)

Manually setting LD_LIBRARY_PATH will work, but that shouldn't be necessary.

run4flat commented 12 years ago

Joel and I discussed the library linking/loading a few weeks ago, but neither of us is a C shared-library guru. Alien::Base manually sets LD_RUN_PATH when you say "use Alien::MyModule" in your consuming module (see https://github.com/jberger/Alien-Base/blob/master/lib/Alien/Base.pm#L28). Some investigation ensued and he added LD_LIBRARY_PATH in a special branch: https://github.com/jberger/Alien-Base/blob/mertens/lib/Alien/Base.pm#L28. Does your module work if you use that branch? (I'm testing my own code atm.)
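
For reference, a minimal sketch of the kind of import-time path injection being described (this is not the actual Alien::Base code; the module name and share-dir layout are illustrative):

    package Alien::MyModule;
    use strict;
    use warnings;
    use File::ShareDir ();

    sub import {
        my $class = shift;

        # Assume the shared libraries were installed under this dist's share dir.
        my $lib_dir = File::ShareDir::dist_dir('Alien-MyModule') . '/lib';

        # LD_RUN_PATH influences the consumer's *build* (ld records it as an rpath);
        # LD_LIBRARY_PATH is what the runtime loader consults, but only if it was
        # set before the process started, which is the crux of this issue.
        for my $var (qw( LD_RUN_PATH LD_LIBRARY_PATH )) {
            $ENV{$var} = join ':', $lib_dir,
                grep { defined && length } $ENV{$var};
        }
    }

    1;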

run4flat commented 12 years ago

@jtpalmer, where do you "manually set LD_LIBRARY_PATH" such that it works? During the library build stage? Just before DynaLoader looks for the library?

jtpalmer commented 12 years ago

The mertens branch didn't work for me:

$ ./Build
Name "Win32::Locale::Lexicon" used only once: possible typo at /home/jtpalmer/.perlbrew/perls/perl-5.12.3/lib/5.12.3/I18N/LangTags/Detect.pm line 140.
Building Alien-Chipmunk
Internal Exception at /home/jtpalmer/.perlbrew/perls/perl-5.12.3/lib/site_perl/5.12.3/Alien/Base/ModuleBuild.pm line 237.
Could not find any matching files at /home/jtpalmer/.perlbrew/perls/perl-5.12.3/lib/site_perl/5.12.3/Alien/Base/ModuleBuild.pm line 237.
Can't call method "version" on an undefined value at /home/jtpalmer/.perlbrew/perls/perl-5.12.3/lib/site_perl/5.12.3/Alien/Base/ModuleBuild.pm line 246.

I didn't set LD_LIBRARY_PATH until running the tests. e.g.:

$ LD_LIBRARY_PATH=/home/jtpalmer/.perlbrew/perls/perl-5.12.3/lib/site_perl/5.12.3/auto/share/dist/Alien-FFCall/lib ./Build test

run4flat commented 12 years ago

I wonder if using DynaLoader, and its @dl_library_path, might be the proper way to handle this. It seems like XSLoader is not paying attention to the environment variable changes.
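
For the record, pushing onto DynaLoader's search path would look roughly like this (a sketch only; as noted below, @dl_library_path only governs where Perl looks for loadable objects, not where the system loader resolves their dependencies):

    use DynaLoader ();

    # Hypothetical Alien share directory; adjust to taste.
    my $alien_lib = '/path/to/auto/share/dist/Alien-FFCall/lib';

    # Make DynaLoader search this directory when locating loadable objects.
    unshift @DynaLoader::dl_library_path, $alien_lib;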

run4flat commented 12 years ago

I've taken a whack at using DynaLoader directly, but I've had no luck. Another possible avenue: p3rl.org/DynaLoader::Functions

jberger commented 12 years ago

You know, you almost got me there @run4flat! The reason you can't use those is that that's where perl looks for its own loadable libraries; we need to help the compiler find the C libraries, not perl. AFAICT DynaLoader is no help.

run4flat commented 12 years ago

What you say is true, but my last idea is that we load the Alien libraries ourselves, rather than using the system's dynamic loader to load them for us. :-)

jberger commented 12 years ago

I'm all ears; how do you propose that we do it? Using FFI or TCC? Or would we parse out the symbols and pass them to DynaLoader manually? @jtpalmer made the point that this gets easier if we use static libraries; I can't remember why I was set against them before, but I know that I was.

run4flat commented 12 years ago

If I understand things correctly, we simply need to call the dynamic loader with the file to load. Later, when XSLoader says, "I need libsomething.so.0", the dynamic loader will say, "Oh, I've already loaded it. Now, what symbols do you need?" At least, that's how I hope things work for shared libraries.

I think this may solve our problem, though it seems rather hackish. :-(
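
In sketch form, that pre-loading idea could use DynaLoader's low-level interface directly (this is illustrative rather than the eventual implementation; the library path is made up):

    use DynaLoader ();

    # Pre-load the Alien-provided library before XSLoader loads FFI.so.
    # The 0x01 flag asks for the symbols to be made globally visible
    # (RTLD_GLOBAL-style), so the dynamic loader already has libavcall
    # resolved when FFI.so asks for it.
    my $libfile = '/path/to/auto/share/dist/Alien-FFCall/lib/libavcall.so.0';
    DynaLoader::dl_load_file($libfile, 0x01)
        or die "Could not pre-load $libfile: ", DynaLoader::dl_error();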

jberger commented 12 years ago

Sounds fine to me, where is that loader? Is it part of DynaLoader?

jtpalmer commented 12 years ago

I've had some success by adding more options to the linker flags. This works for me on Linux, but not Mac OS X: https://github.com/jtpalmer/Alien-Base/commit/7281494485a19aaee3ba7a475eecfa6d776ea8cd
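
For anyone following along, the general shape of that approach in a consumer's Build.PL would be something like the following (a sketch, not the linked commit; the Alien path is made up):

    use Module::Build;

    # Hypothetical location of the Alien-provided libraries.
    my $alien_lib = '/path/to/auto/share/dist/Alien-Chipmunk/lib';

    my $builder = Module::Build->new(
        module_name        => 'Chipmunk',
        # Bake the search path into the shared object via an rpath, so the
        # runtime loader can find libchipmunk without LD_LIBRARY_PATH.
        extra_linker_flags => [ "-L$alien_lib", "-Wl,-rpath,$alien_lib", '-lchipmunk' ],
    );
    $builder->create_build_script;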

run4flat commented 12 years ago

@jtpalmer, this is tentatively fixed by the dlopen branch. Check it out and see if it works for you. It worked on my system for FFI/Alien::FFCall.

jtpalmer commented 12 years ago

The dlopen branch fixes the problem for the consuming module (on Linux; I haven't tested on OS X yet, but I may have other problems there). But now a test in my Alien:: module fails. I borrowed a test from Alien::ODE that uses ExtUtils::CBuilder to create an object file, then an executable using the library built by the module. This now fails with "error while loading shared libraries"; ldd shows "not found" for the .so file linked into the test executable, presumably since LD_RUN_PATH is no longer being set.
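
For reference, that kind of compile-and-link test looks roughly like this (a simplified sketch, assuming an Alien::Chipmunk built on Alien::Base that exposes cflags()/libs()):

    use strict;
    use warnings;
    use Test::More;
    use ExtUtils::CBuilder;
    use Alien::Chipmunk;

    plan skip_all => 'no C compiler available'
        unless ExtUtils::CBuilder->new->have_compiler;

    my $b = ExtUtils::CBuilder->new( quiet => 1 );

    # Compile a tiny C program against the Alien-provided headers...
    my $obj = $b->compile(
        source               => 't/src/test.c',
        extra_compiler_flags => Alien::Chipmunk->cflags,
    );

    # ...and link it against the Alien-provided library. The resulting
    # executable is what fails to start when no rpath/LD_RUN_PATH was recorded.
    my $exe = $b->link_executable(
        objects            => [$obj],
        extra_linker_flags => Alien::Chipmunk->libs,
    );

    ok( -e $exe, 'built a test executable against the Alien library' );
    is( system($exe), 0, 'test executable runs' );

    done_testing;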

run4flat commented 12 years ago

I cannot speak to the technical details here, but I have two items worth considering that may solve the problem for you:

1) I hacked Devel::CheckLib to do similar sorts of testing for FFCall. See https://github.com/run4flat/Alien-FFCall/blob/master/t/10-compile-and-call.t

2) Once Alien::FFCall and a revised FFI.pm hit CPAN, you will be able to load a library and call its functions directly from Perl, without needing to link or compile anything. (It uses assembler-based code to dynamically manipulate the C stack and call functions.) FFCall isn't as robust as libffi, but it will hopefully provide a basic framework until Ctypes (which is Reini Urban's Perl wrapper for libffi) comes out.

jberger commented 12 years ago

@jtpalmer that is the problem: I don't set that variable anymore. Even so, I wouldn't recommend that test; it's not portable and wouldn't work without a compiler. One of the end goals is to allow a repository to contain precompiled binary libraries for certain platforms (notably Windows); in that case Alien::Foo can still be used even if there is no C compiler, but your test would then fail even though the library would load.

To test that the library is findable, you might investigate @DynaLoader::dl_resolve_using, which I do populate with paths to the library files. I am also working on injecting some of the information into the Alien::Foo namespace, but I'm still working out how I want to do that.
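
A minimal sketch of such a check (assuming the Alien module populates @DynaLoader::dl_resolve_using when it is loaded, as described above):

    use strict;
    use warnings;
    use Test::More;
    use DynaLoader ();
    use Alien::Chipmunk;   # loading the Alien module should register its library files

    ok( scalar @DynaLoader::dl_resolve_using, 'Alien module registered library files' );
    ok( -e $_, "library file exists: $_" ) for @DynaLoader::dl_resolve_using;

    done_testing;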

jtpalmer commented 12 years ago

@run4flat Thanks for the ideas, I'll take a look at those.

@jberger the tests are skipped if ExtUtils::CBuilder isn't installed, but I see your point. My concern was that the information being supplied by the Alien::Foo module isn't sufficient to build an executable, but I think I've moved outside the intended scope of Alien::Base.

Either way, thank you both for your help.

jberger commented 12 years ago

It is beyond the scope for now, but with some planning there's no reason we shouldn't be able to provide executables eventually too. Just a question: what other information would you need? The reason I say it's beyond the scope now is locating the library afterwards, not during the build.

jtpalmer commented 12 years ago

I shouldn't have stated that you can't build executables, because you can. But the executable doesn't find the library unless you set LD_RUN_PATH when you compile it or LD_LIBRARY_PATH when you run it. Adding an rpath option (-Wl,-rpath,...) to libs works on some architectures (it does the same thing as setting LD_RUN_PATH). Instinctively, it seems like the extra work being done by dlopen at runtime shouldn't be necessary; everything should be settled at compile time. So there really isn't any other information needed; I'm probably just overthinking things.

jberger commented 12 years ago

Ok, I see where you're coming from. Perhaps there can be an import option which will also set LD_RUN_PATH, but this isn't really a great use case. As you can see, this is a problem of installing libraries to a non-standard location; the perl interpreter must be running to get the locations set up.

jberger commented 12 years ago

Perhaps a better way to say it is: shared libraries installed via Alien::Base are really only intended to support Perl modules/scripts/extensions etc. Moving beyond that isn't going to be easy and therefore isn't a goal of the project (at least yet).

run4flat commented 12 years ago

Quick thought for the moment: wrap the binary in a Perl script that sets the path and calls exec, just as you would use a shell script to achieve the same thing.

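Something along those lines, for example (a sketch; the Alien dist name, share-dir lookup, and wrapped binary are all hypothetical):

    #!/usr/bin/env perl
    use strict;
    use warnings;
    use File::ShareDir ();

    # Wrapper around a bundled binary: prepend the Alien dist's lib directory
    # to the loader's search path, then hand control to the real executable.
    my $dist_dir = File::ShareDir::dist_dir('Alien-FFCall');
    my $lib_dir  = "$dist_dir/lib";

    $ENV{LD_LIBRARY_PATH} = join ':', $lib_dir,
        grep { defined && length } $ENV{LD_LIBRARY_PATH};

    # exec replaces this process; the child inherits the adjusted environment,
    # so its dynamic loader sees the new LD_LIBRARY_PATH at startup.
    exec "$dist_dir/bin/some-tool", @ARGV
        or die "Could not exec some-tool: $!";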

jberger commented 12 years ago

@run4flat that's a good thought. Perhaps that process can be made simple, easy enough for one-liners.

run4flat commented 12 years ago

@jtpalmer, I was wrong. My tests for Alien::FFCall fail now, probably for the same reasons that yours fail. I may look at your CBuilder approach, though, as that's likely to be better for this sort of thing than my approach.

@jberger, my initial reaction is "No, Alien modules should only be responsible for providing libraries for Perl module consumption." However, even this is tricky when it comes to writing tests for Alien-derived modules without the utilities of FFCall or libffi.

If you think that Alien::Base can provide a means for setting LD_LIBRARY_PATH, LD_RUN_PATH, and/or PATH, depending on the operating system, and do that reliably, I think that is generally a good idea. If you want to simply say, "Can't guarantee that this works on all systems; use Perl's libffi bindings for testing" or something like that, I would not fault you. The only issue with the latter is that it introduces yet another build tool that has to be installed.

I believe that either approach is acceptable and it's just a matter of choosing which one to take. I would vote for encouraging the use of FFCall for now, simply because it's easier on you and may help you get your module's first version finished and out there that much faster.

mnunberg commented 12 years ago

So I've been following your progress in general and this thread specifically.

A good way of knowing how to do this is to follow whatever ExtUtils::MakeMaker does in these situations. For Unix-like systems it's generally reliable at establishing the proper flags.

Win32 is yet another story (I think it really only has %PATH%).

It's good to abstract this issue to the realm of the linker. The linker is the component which locates extra dependencies, and ELF provides the DT_RPATH header for this. OS X does not use ELF (it uses Mach-O instead); see also http://www.dribin.org/dave/blog/archives/2009/11/15/rpath/

I am not a huge fan of environment variables, and suggest that these things be done at link-time when possible (EU::MM is fairly good at determining the proper environmental and linker incantations).

run4flat commented 12 years ago

Wait, I've got it. Consider how GSL solves this: it provides a binary called gsl-config which outputs all manner of compiler and linker flags that one might need for linking and compiling programs that use those libraries. These flags can be used directly in the compile/link commands for systems that know how to use backticks.

So, what about a script provided by Alien::Base called something like perl-alien-config (or some such) which would take the name of the alien library and then command line arguments for libs, cflags, prefix, etc? This gives potential consumers of the library everything they need without directly mucking with environment variables, and it's much more cross-platform than using rpath. It could even allow for Alien authors to subclass the config behavior if they so desire.

Example usage:

$ gcc my-foo.c -o foo `perl-alien-config Alien::FFCall --cflags`
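
A rough sketch of what such a script could look like (entirely hypothetical; it assumes the Alien module exposes the cflags()/libs() accessors that Alien::Base is meant to provide):

    #!/usr/bin/env perl
    # Hypothetical perl-alien-config: print compiler/linker flags for an Alien module.
    use strict;
    use warnings;
    use Getopt::Long qw(GetOptions);
    use Module::Load;

    GetOptions( 'cflags' => \my $want_cflags, 'libs' => \my $want_libs )
        or die "usage: perl-alien-config Module::Name [--cflags] [--libs]\n";

    my $module = shift @ARGV
        or die "usage: perl-alien-config Module::Name [--cflags] [--libs]\n";

    load $module;

    my @out;
    push @out, $module->cflags if $want_cflags;
    push @out, $module->libs   if $want_libs;
    print join( ' ', @out ), "\n";
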
jberger commented 12 years ago

@run4flat I see what you mean, but again, I think this is beyond the scope; in this case you really should have installed the library system-wide via your system's package manager. Alien::Base really is supposed to support Perl modules. It's not impossible that eventually I will need to build these libraries with a less Perl-centric view.

mnunberg commented 12 years ago

I actually do like the idea of using a pkg-config style configuration system for perl libraries. In most cases a file would look pretty simple.

So the key issue here is handling the case when one Perl Alien package depends on another Perl Alien package (perhaps with multiple chains of dependencies). pkg-config handles that quite well.

But having authors write their own .pc files is just moving the problem:

1) Many people won't really know what rpath is (but that won't stop them from thinking they know what they're doing, and releasing modules with bad build systems)

2) Even those who do know about rpath will need some way to generate the .pc file. Now, autoconf provides a natural means of doing so (and has built-in pkg-config-generation macros), but not all the world is autoconf. This means essentially placing the onus of the problem we're trying to solve on the potential users, which is never a good idea... unless...

3) We decouple the process of building from the process of discovering. Remember that what we really need is just the various locations of shared dependencies, plugins, and perhaps prefix directives which would be passed to --prefix (for this, see my Couchbase::Client build system). Once you are truly sure you have compiled this information reliably, the environment variables and linker flags stop being so voodoo-like.

ExtUtils::MakeMaker tackles this problem quite nicely (or it might be one of its helper modules that does).

jberger commented 12 years ago

Ok a few responses

mnunberg commented 12 years ago

I was going along with run4flat's idea of the *-config format, which is really pkg-config, which is really just about writing a .pc file, which lets the package declare in a portable format what its linker flags should be.

pkg-config will not be doing any loading (the loading is done either at link-time by ld, or at runtime by dlopen or whatever other method we come up with. More about this later).

The fact of the matter is that our problem is not the very naive case of having one simple dynamic library which needs to be found by a Perl module. That case is actually what DynaLoader handles, because DynaLoader expects existing XS modules to be simple binary objects found in very predictable locations, with their own dependencies hand-configured by their authors (this, by the way, is the problem Alien::Base is trying to solve: making a sane way for XS module authors to manage the non-Perl aspects of dependencies).

LD_RUN_PATH and rpath are virtually synonymous in functionality, though there are various issues with each (essentially, LD_RUN_PATH is supposed to be the environment variable which ld uses in addition to whatever was supplied to it via rpath).

It might be beneficial here if I provide a detailed summary of all the steps the 'linker' goes through; this might demystify some things:

I will be using LD and DL as distinct terms. Specifically, LD is a component of the build process, and DL is a component of the runtime process. This is a fairly accurate and detailed description of what happens for ELF-based systems (other binary formats do things slightly differently) found on Linux, Solaris, and other Unix-like systems.

Build Process

Or, what really happens when you build a shared object.

Basic Build Dependencies

  1. LD is passed an object file (e.g. a .o file)
  2. LD examines the object file for unresolved symbols (i.e. library calls)
  3. LD searches the command line for possible candidates supplying those symbols (via the -l flags)
  4. LD traverses the library search paths (e.g. /usr/lib, /usr/local/lib, and anything else specified via the -L flags, AND LD_LIBRARY_PATH) to find the library mentioned in the previous step
  5. LD maps the symbols in your current library (i.e. the one that is being built) to memory addresses within the located dependency (i.e. the one determined in steps 3-4) and stores this in the library as an external symbol with its location within the currently-being-built object's symbol table

    blah_create_handle => libblah.so:0xb00b5

  6. LD Stores information about the actual dependency in the library as well, inside the currently-being-built object's dependency table

    NEEDED: libblah.so

  7. If LD does not find a library which contains the unresolved symbol, the behavior is dependent on what is being built (is it an executable, or a shared library) and the various flags passed to the linker. By default LD will not abort the build process for unresolved dependencies within libraries (it will rely on lazy-loading during runtime, explained later on), and will abort for executables (this is called strict loading or binding)

Extended Build Dependencies

  1. LD sees RPATH directives in its command line (and LD_RUN_PATH in the environment). It appends this information to the library as well (note, this has nothing to do with steps 1-6), e.g.

    EXTRA SEARCH PATHS: /mydir/lib:/other/dir/h4x0rz

Link/Load Process

Or, what happens when you 'run a program'

Determining Symbols

  1. The binary is invoked. At one of its main entry points (the binary in this case is an object which has shared dependencies) it requests that external libraries be loaded, via the NEEDED directive within the object (remember that executables and shared libraries are basically the same in this respect [well, there are some subtle differences which aren't important for this overview]).
  2. For each object (this means, for each binary which has a NEEDED section), this process is performed, in order:
  3. For each shared object declared as NEEDED, DL will scan the search directories (the default search paths, plus those specified in the environment variable LD_LIBRARY_PATH (on ELF systems such as Linux and Solaris) or DYLD_LIBRARY_PATH (Darwin/OS X)); it will also scan the directories specified in RPATH, and will use the first library found matching the name requested (i.e. /mydir/libblah.so).
  4. For each of the found objects, it will recurse to steps 2 and 3.
  5. dlopen and LoadLibrary are merely runtime versions which can declare additional dependencies. This simulates the process as initiated at step 1, except that this happens during runtime (i.e. when the program has already started) as opposed to load time (when the C library and DL load the executable entry point for the first time).
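
As an aside, the directory search described in step 3 can be loosely approximated from Perl with DynaLoader's dl_findfile, which walks @dl_library_path plus any -L directories you hand it; this is a handy way to check whether a library is findable at all (the path below is made up):

    use DynaLoader ();

    # Ask DynaLoader to locate libchipmunk: search the given -L directory,
    # then @dl_library_path and the built-in system directories.
    my ($found) = DynaLoader::dl_findfile(
        '-L/path/to/auto/share/dist/Alien-Chipmunk/lib',
        '-lchipmunk',
    );
    print defined $found ? "found: $found\n" : "not found\n";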

Problems with runtime dependency loading

Because runtime dependency loading is lazy, it is impossible to inspect the binary and determine at load time which dependencies need to be loaded. It is also impossible to determine during load and build time whether the prospective runtime dependency is of the right version and contains the correct symbols. For this reason, such dependencies are placed in non-standard locations (i.e. a place which is controlled by the parent package) and not in global system directories.

An example of a 'shared library' would be something like /usr/lib/libglib-2.0.so. This is a versioned, publicly visible, load-time dependency.

An example of a 'runtime library' is a plugin, which contains some basic routines that invoke an API the parent expects. Note the path: /usr/lib/gtk-2.0/2.10.0/engines/libnimbus.so

In this case, the path is known only to GTK (other libraries cannot really link against it, unless they know where GTK placed it). GTK knows where this library is located due to hard-coded configuration directives set at build time (i.e. passing --with-engine-dir=/usr/lib/gtk-2.0/2.10.0/engines). GTK itself expects this directory to never change (and would break if it did).

In other words, using dlopen generally means implementing your own search paths and conventions.

This is what Perl does with its XS modules as well. It places them in well known directories, and DynaLoader searches for them in those well known directories. This is due to versioning constraints and semantics.

The real problem with dlopen and all the linker directives we have been discussing (and this goes for any configurable option I described in the LD stages) is that they can only ever determine what is specified as a load-time dependency (that is, a well-known dependency which can even be inspected using ELF readers).

Therefore, we need an extra hack. We also need to insert possible plugin search paths within the RPATH and search-path directives (though this won't always work either), which is not usual in 'normal' build processes, so that dlopen()s will just work (in cases where the paths are unqualified, e.g. the plugins exist in /usr/lib directly), and also be able to provide a proper --prefix (and other configure-time options) for things like GTK (GTK is not a good example of an Alien candidate, but there are far simpler projects which use the same system).

Anyway, the goal of this long post was to provide a detailed overview of the various stages and the problems Alien faces with each. I hope it has served its aim.

jberger commented 12 years ago

@mnunberg, that may be the best git comment I have ever seen. Thanks so much for putting all of that together. You might think about blog posting that so that more people get the benefit of reading it!

After my first of what I am sure will be several readings, I think that I am ok with the dlopen concept. Yes, I understand that it means a little extra work on resolving paths etc., but truthfully I needed to do most of that to provide LD_RUN_PATH in the previous iteration. I don't think that my envisioned workflow allows LD_LIBRARY_PATH, as I want to be able to provide these libraries when you use Alien::Foo, which means the interpreter is already running. Further, I don't want these to be publicly visible libraries (beyond Perl), and they are already going to be in a path known to the Alien::Foo module. I think all of this points to my current implementation.

For dependency chains, I think the concept will be that it is up to Alien:: authors to be sure to dlopen a dependency before dlopen-ing their own library for consumers. Since this is built into the import/use directive it shouldn't feel foreign, and in fact hopefully will feel natural.
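
In sketch form, that chaining convention might look like this (hypothetical modules, and it assumes the import inherited from Alien::Base is what performs the dlopen, as in the dlopen branch):

    package Alien::Bar;
    use strict;
    use warnings;
    use parent 'Alien::Base';

    # Pulling in the dependency's Alien module here loads (dlopens) libfoo
    # before any consumer's "use Alien::Bar" loads libbar, so libbar's
    # references to libfoo can be resolved by the dynamic loader.
    use Alien::Foo;

    1;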

Thanks again for writing such a detailed description!

jberger commented 12 years ago

Also, I should mention that Alien::Base DOES attempt to provide all of the information that a .pc file would provide, just as pkg-config would. Using these facilities, one should be able to build an XS module that depends on multiple Alien:: modules. Building a library provided by an Alien:: which itself relies on a library provided by another Alien:: might be harder, but I'm not sure that there is an easy way to deal with that case.
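
Concretely, a consumer's Build.PL could combine that information from several Alien modules along these lines (a sketch; Alien::Foo and Alien::Bar are placeholders, and it assumes the cflags()/libs() accessors that Alien::Base aims to provide):

    use Module::Build;
    use Alien::Foo;
    use Alien::Bar;

    my $builder = Module::Build->new(
        module_name          => 'My::XS::Module',
        # Combine the pkg-config-style information from each Alien module.
        extra_compiler_flags => join( ' ', Alien::Foo->cflags, Alien::Bar->cflags ),
        extra_linker_flags   => join( ' ', Alien::Foo->libs,   Alien::Bar->libs ),
    );
    $builder->create_build_script;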

mnunberg commented 12 years ago

That's precisely the type of scenario which becomes hairy, or else the scope of Alien becomes very limited (especially since the kinds of libraries one uses Alien for would generally have significant dependencies of their own; there is at least one example I can think of which exists on CPAN already, not using Alien::Base, but which I believe would be a much better model application).

http://search.cpan.org/dist/Memcached-libmemcached/

Couchbase::Client in fact borrowed from the ease of distribution and installation of this module (note that this module does not ask you to bother your package manager, and I think that is very good design, when it works). Couchbase::Client tries to retain the same distribution model, except that it puts far more effort into making sure it actually works.

I am sure there are some other distributions out there which do the same thing and would have a far larger usage base.

While I have an idea about your background and the reason for you writing this in the first place, mathematical and scientific libraries are particularly easy: their internal structure (in terms of how they interact and operate with the surrounding ecosystem) is very simple, since they mainly do CPU-bound operations and are highly portable.

Most libraries, however, involve complex network operations, versioning, many configuration directives, etc., and these account for a large user base; not to mention they are also more complex than your average scientific library. You have been particularly 'lucky' in stumbling upon one which is relatively complex :)

I think one of the key points that I should emphasize again is that the problem should be abstracted away from Perl and XS-space. If this is to work correctly and be useful where it's needed the most, it will need to implement functionality which operates along the lines of a dynamic linker and is aware of these concepts as they exist in a generic fashion (as I have outlined above). Perl and XS are just the glue being used to ensure the interoperability of those modules with Perl.

Of course, they aren't _just_ the glue (they are likely where most of the application logic resides), but from the perspective of a build and library system they are just 'clients', so to speak, which 'connect' to the linker system.

What I am proposing, essentially, is a bottom-up approach.

For Windows, btw, there is a Win32 API function called AddDllDirectory():

http://msdn.microsoft.com/en-us/library/windows/desktop/hh310513%28v=vs.85%29.aspx

It is actually superior to the environment variables provided by Unix, because it can be called at runtime and does not have to be set in the environment before the program begins executing.

I have just checked the relevant manpages, and indeed the dynamic linker does not re-check the relevant environment variables once the process has started. This means we are confined to an rpath solution for Unix systems, and its runtime equivalent (AddDllDirectory) on Win32.

Perhaps what we might do in the future is write a simple project which demonstrates these issues. I am rather stacked with work these days, but it does not sound very difficult.

mnunberg commented 12 years ago

As a proof of concept, here is a very simple abstraction of a scenario we will need to deal with:

https://github.com/mnunberg/plugin-skeleton

Basically, 'child.c' is the entry point: it loads libparent, and libparent itself loads a dynamic plugin, parent_plugin. If you inspect the Makefile, you will see various variables. The key is to get the Makefile to compile 'child' and its dependencies dynamically, so that the result actually runs.

On Linux, the following incantation seems to work:

  make CHILD_LDFLAGS="-Wl,-rpath=\\$\${ORIGIN}/inst/lib -Wl,-rpath=\\$\${ORIGIN}/inst/plugin -lparent -L$PWD/inst/lib" -B

In reality, 'child' is an abstraction of an entry point for an XS module, but note how this has now become a more generic problem and less of a Perl problem. It only happens to be a Perl problem because we need to impose some sane mechanism which is relatively independent of the environment in which it operates.

jberger commented 12 years ago

Ok, I will try to keep this in mind. Admittedly, I think of Alien::Base mostly as providing the libraries for which some other module provides the bindings. For more involved libraries, though, we will definitely need to be able to provide what you suggest. For now I will push ahead down the path I am travelling, but only for one reason: it is what I proposed in the TPF grant proposal, namely to be able to provide simple libraries. I am all for being more ambitious eventually though. Thanks again!