kambo-1st / llama-cpp-php

The package enables the use of the LLama C++ library in PHP, allowing you to set up and run LLM models on your local machine.
MIT License
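For context, the package drives the native llama.cpp library through PHP's FFI extension. A minimal usage sketch follows; only Context::createWithParameter() and ModelParameters are confirmed by the stack trace in this issue, while the LLamaCPP object, the model path, and the generate() loop are assumptions based on the package's examples and may differ from the current API:

<?php

require __DIR__ . '/vendor/autoload.php';

use Kambo\LLamaCPP\Context;
use Kambo\LLamaCPP\LLamaCPP;
use Kambo\LLamaCPP\Parameters\ModelParameters;

// Build a context around a local model file (path is illustrative).
$context = Context::createWithParameter(new ModelParameters(__DIR__ . '/models/model.bin'));

// Stream tokens from the model (assumed API: generate() yields tokens one by one).
$llama = new LLamaCPP($context);
foreach ($llama->generate('Tell me a short joke.') as $token) {
    echo $token;
}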

[Bug]: running on Docker #1

Open · yandod opened this issue 1 year ago

yandod commented 1 year ago

What happened?

I am trying this library with Docker on a Mac and ran into this error when running the tests. I know this library is only tested on Linux, but I wonder whether it is possible to run it in a Docker image.

% make test
docker compose run php php ./test.php

Fatal error: Uncaught FFI\Exception: Failed loading '/var/www/html/vendor/kambo/llama-cpp-php-linux-lib/src/../libllama.so' in /var/www/html/vendor/kambo/llama-cpp-php/src/Native/LLamaCPPFFI.php:59
Stack trace:
#0 /var/www/html/vendor/kambo/llama-cpp-php/src/Native/LLamaCPPFFI.php(59): FFI::cdef('#define FFI_LIB...', '/var/www/html/v...')
#1 /var/www/html/vendor/kambo/llama-cpp-php/src/Native/LLamaCPPFFI.php(49): Kambo\LLamaCPP\Native\LLamaCPPFFI::createWithLibraryInPath('/var/www/html/v...')
#2 /var/www/html/vendor/kambo/llama-cpp-php/src/Native/LLamaCPPFFI.php(30): Kambo\LLamaCPP\Native\LLamaCPPFFI::create()
#3 /var/www/html/vendor/kambo/llama-cpp-php/src/Context.php(35): Kambo\LLamaCPP\Native\LLamaCPPFFI::getInstance()
#4 /var/www/html/test.php(10): Kambo\LLamaCPP\Context::createWithParameter(Object(Kambo\LLamaCpp\Parameters\ModelParameters))
#5 {main}
  thrown in /var/www/html/vendor/kambo/llama-cpp-php/src/Native/LLamaCPPFFI.php on line 59
make: *** [test] Error 255
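The exception is thrown by FFI::cdef() when it cannot dlopen the bundled shared library. A quick way to isolate the load failure from the rest of the package is to try loading the .so on its own; below is a minimal sketch (the path is taken from the error message above and may differ in your setup):

<?php

// Try to dlopen the bundled libllama.so directly, without the package code.
// Path copied from the error message; adjust for your vendor directory.
$lib = '/var/www/html/vendor/kambo/llama-cpp-php-linux-lib/src/../libllama.so';

try {
    FFI::cdef('', $lib); // no declarations needed; we only test whether the library loads
    echo "OK: loaded {$lib}\n";
} catch (FFI\Exception $e) {
    // The FFI exception message stays generic; running `ldd` on the .so
    // (see the comment further down in this thread) reveals the underlying cause.
    echo 'Failed: ', $e->getMessage(), "\n";
}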

How to reproduce the bug

Here is my Docker environment code: https://github.com/yandod/php-machine-learning

Run:

make install
make download
make test

Package Version

dev-main

PHP Version

8.2

Which operating systems does this happen with?

macOS

Notes

No response

lukemoynihan commented 10 months ago

I get the same issue running Ubuntu/PHP 8.2 under Parallels on Mac. FFI is enabled in PHP and confirmed via php -i.
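(For reference, FFI availability can also be confirmed from PHP itself; a minimal sketch, checking only that the extension is loaded and what php.ini allows, not that any particular library can be opened:)

<?php

// Is the FFI extension compiled in and loaded?
var_dump(extension_loaded('ffi'));

// What does php.ini allow? "true"/"1" enables the FFI API at runtime;
// the default "preload" is more restrictive.
var_dump(ini_get('ffi.enable'));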

lukemoynihan commented 10 months ago

@yandod I found the solution to the issue: libllama.so requires a more recent version of libc than is included in the OS you're using. Running ldd libllama.so showed output like:

libllama.so: /lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by libllama.so)
libllama.so: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by libllama.so)
libllama.so: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by libllama.so)
        linux-vdso.so.1 (0x00007ffefa5d2000)
        libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f2316dd4000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f2316c85000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f2316c6a000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f2316a78000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f23170c2000)

Then running apt policy libc6 showed the currently installed version:

libc6:
  Installed: 2.31-0ubuntu9.9
  Candidate: 2.31-0ubuntu9.9
  Version table:
 *** 2.31-0ubuntu9.9 500
        500 http://us.archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages
        100 /var/lib/dpkg/status
     2.31-0ubuntu9.7 500
        500 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages
     2.31-0ubuntu9 500
        500 http://us.archive.ubuntu.com/ubuntu focal/main amd64 Packages

Once I updated my VM from Ubuntu 20 to 22, I had the needed version and could get past this error.
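To double-check which glibc the PHP process actually sees (before or after such an upgrade), here is a minimal sketch using FFI and glibc's gnu_get_libc_version(); GNU/Linux only, and it assumes libc.so.6 can be loaded by name:

<?php

// Ask the running glibc for its version. According to the ldd output above,
// the bundled libllama.so needs at least GLIBC_2.34.
$libc = FFI::cdef(
    'const char *gnu_get_libc_version(void);',
    'libc.so.6'
);

echo 'glibc ', FFI::string($libc->gnu_get_libc_version()), PHP_EOL;
// e.g. "2.31" on Ubuntu 20.04 (focal), "2.35" on Ubuntu 22.04 (jammy)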