nRF24 / pyRF24

A python package that wraps the RF24, RF24Network, and RF24Mesh C++ libraries.
http://pyrf24.rtfd.io
GNU General Public License v2.0

RPi Zero - ImportError: libcpp_rf24.so: cannot open shared object file: No such file or directory #61

Closed DJManas closed 5 months ago

DJManas commented 5 months ago

Hello,

I am struggling with NRF24 libraries for python on RPi zero using Raspbian GNU/Linux 11 (bullseye).

I had managed to git clone this repo using: git clone --recurse-submodules https://github.com/nRF24/pyRF24.git

Then build and install using: python -m pip install . -v

Which completed successfully. Now I want to try examples/getting_started.py, but I am getting this error:

Traceback (most recent call last):
  File "/home/djmanas/git/pyRF24/examples/getting_started.py", line 9, in <module>
    from pyrf24 import RF24, RF24_PA_LOW, RF24_DRIVER
  File "/usr/local/lib/python3.9/dist-packages/pyrf24/__init__.py", line 1, in <module>
    from .rf24 import (
ImportError: libcpp_rf24.so: cannot open shared object file: No such file or directory

So I tried to build and install all the libraries included in the git repo (RF24, RF24Mesh, RF24Network, pyrf24). Yes, pyrf24 installed the libcpp_rf24*.so files into /usr/local/, so I moved them to /usr/local/lib, but I am still having this issue.

So I wanted to take a look into the library directory /usr/local/lib/python3.9/dist-packages/pyrf24, and I am confused: I can see the libraries there, but Python tells me they are not present.

ls -al
total 896
drwxr-xr-x 3 root root   4096 25. kvě 22.28 .
drwxr-xr-x 4 root root   4096 25. kvě 22.28 ..
-rw-r--r-- 1 root root  34950 25. kvě 22.28 fake_ble.py
-rw-r--r-- 1 root root   2158 25. kvě 22.28 __init__.py
-rwxr-xr-x 1 root root  20188 25. kvě 22.28 libcpp_rf24_mesh.so
-rwxr-xr-x 1 root root  25152 25. kvě 22.28 libcpp_rf24_network.so
-rwxr-xr-x 1 root root  61344 25. kvě 22.28 libcpp_rf24.so
drwxr-xr-x 2 root root   4096 25. kvě 22.28 __pycache__
-rw-r--r-- 1 root root      0 25. kvě 22.28 py.typed
-rwxr-xr-x 1 root root 313932 25. kvě 22.28 rf24.cpython-39-arm-linux-gnueabihf.so
-rwxr-xr-x 1 root root 235908 25. kvě 22.28 rf24_mesh.cpython-39-arm-linux-gnueabihf.so
-rw-r--r-- 1 root root   2619 25. kvě 22.28 rf24_mesh.pyi
-rwxr-xr-x 1 root root 174388 25. kvě 22.28 rf24_network.cpython-39-arm-linux-gnueabihf.so
-rw-r--r-- 1 root root   4697 25. kvě 22.28 rf24_network.pyi
-rw-r--r-- 1 root root   9119 25. kvě 22.28 rf24.pyi

Launching Python with sudo doesn't make any difference, but I am not sure whether it is needed or not. What am I doing wrong?

Thanks, Regards, Petr Sourek

2bndy5 commented 5 months ago

Then build and install using: python -m pip install . -v

If you're not using a venv (created via python3 -m venv <env-name>), then this command actually invokes (or tries to invoke) Python v2 on Linux.

[!TIP] Using a python venv is practically required on Debian-based Linux with recent updates to pip (when pip is installed via apt-get install python3-pip).

Using the repo's main branch (& after updating it with git pull), you should see the lines:

-- Using driver: SPIDEV
-- Supplementing SPIDEV driver with linux/gpio.h

This means that sudo is not needed because the SPIDEV driver doesn't require root permission.
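
As a quick sanity check, the installed package exports an RF24_DRIVER constant (it already appears in the example's imports), so you can print which driver your build was compiled with:

```python
# Should print the configured driver name, e.g. "SPIDEV" for default builds.
from pyrf24 import RF24_DRIVER

print(RF24_DRIVER)
```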

Using RPi OS 64-bit (bullseye), I cannot reproduce the error you reported (about ImportError: libcpp_rf24.so). From the ls output, I see the shared libs libcpp_rf24*.so are being built and located in the proper place. So, this seems like a problem specific to your setup/system. Did you set some C/C++ compiler-specific env variables (like CXX maybe)?

I don't have any RPi setup using a 32-bit OS anymore. I doubt that would be the problem here.


Lastly, you should not need to build the package yourself on RPi OS unless you want to enable debug prompts or use a different RF24_DRIVER (SPIDEV is highly recommended). We have uploaded source and binary (64-bit) distributions to PyPI, and the piwheels project provides binary distributions for 32-bit builds; FYI, the piwheels index is automatically added as the first index that pip looks in when using pip install on RPi OS. So, have you tried installing with

pip install pyrf24
# or if not using a venv
python3 -m pip install pyrf24

The prompts from pip will tell you whether the piwheels index is being used:

Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Collecting pyrf24
  Downloading pyrf24-0.3.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (13 kB)
  Downloading pyrf24-0.3.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (488 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 488.2/488.2 kB 1.9 MB/s eta 0:00:00
Installing collected packages: pyrf24
Successfully installed pyrf24-0.3.0

2bndy5 commented 5 months ago

So I tried to build and install all the libraries included in the git repo (RF24, RF24Mesh, RF24Network, pyrf24). Yes, pyrf24 installed the libcpp_rf24*.so files into /usr/local/, so I moved them to /usr/local/lib, but I am still having this issue.

  1. Building the pyrf24 package should automatically build RF24 as libcpp_rf24.so, RF24Network as libcpp_rf24_network.so, and RF24Mesh as libcpp_rf24_mesh.so.
  2. Moving the built binaries into /usr/local/lib will not help. In fact, this will break the installation of pyrf24. All binaries are supposed to work in isolation, meaning you should be able to use the pyrf24 package without installing the C++ libs in /usr/local/lib or /usr/lib. This was the major improvement over the older individual python wrappers (included with each RF24* lib's source).
  3. If installing pyrf24 resulted in any built binaries being copied to /usr/**, then you definitely have some C/C++ env var set that tells CMake to alter the install location of the binaries (possibly even a CMake variable like CMAKE_INSTALL_PREFIX).

Please check your system's env vars (printenv should help) and make sure everything is as you expected. If you have altered some env variables to aid in building a different project, then you may have to undo some of that; starting again with a fresh shell might help.
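
As a quick diagnostic (just a minimal sketch, not part of the package), you can locate the installed package directory without importing it and list the bundled shared libs; they should sit right next to pyrf24/__init__.py:

```python
from importlib.util import find_spec
from pathlib import Path

# find_spec() locates the package without executing it,
# which is useful here because the import itself is what fails.
spec = find_spec("pyrf24")
pkg_dir = Path(spec.origin).parent
print(pkg_dir)
print(sorted(p.name for p in pkg_dir.glob("libcpp_rf24*.so")))
```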

DJManas commented 5 months ago

Hello @2bndy5,

thanks for the effort. I am using python because all of my Linux machines use version 3 as the default; I guess if it were Python 2 it wouldn't even install. But it's a good point: recently I was deploying a simple script to production and it turned out the machine still had Python 2 installed (it was Oracle Linux, so I had to upgrade it).

Assuming it's my fault, after I read your post I completely wiped the SD card and started over, and I still end up with the same ImportError, even though the files exist:

(venv) djmanas:~/.virtualenv/venv/lib/python3.11/site-packages/pyrf24 $ ls -al
total 912
drwxr-xr-x  3 djmanas djmanas   4096 26. kvě 13.09 .
drwxr-xr-x 10 djmanas djmanas   4096 26. kvě 13.09 ..
-rw-r--r--  1 djmanas djmanas  34950 26. kvě 13.09 fake_ble.py
-rw-r--r--  1 djmanas djmanas   2158 26. kvě 13.09 __init__.py
-rwxr-xr-x  1 djmanas djmanas  24580 26. kvě 13.09 libcpp_rf24_mesh.so
-rwxr-xr-x  1 djmanas djmanas  25432 26. kvě 13.09 libcpp_rf24_network.so
-rwxr-xr-x  1 djmanas djmanas  61144 26. kvě 13.09 libcpp_rf24.so
drwxr-xr-x  2 djmanas djmanas   4096 26. kvě 13.09 __pycache__
-rw-r--r--  1 djmanas djmanas      0 26. kvě 13.09 py.typed
-rwxr-xr-x  1 djmanas djmanas 318036 26. kvě 13.09 rf24.cpython-311-arm-linux-gnueabihf.so
-rwxr-xr-x  1 djmanas djmanas 235912 26. kvě 13.09 rf24_mesh.cpython-311-arm-linux-gnueabihf.so
-rw-r--r--  1 djmanas djmanas   2619 26. kvě 13.09 rf24_mesh.pyi
-rwxr-xr-x  1 djmanas djmanas 178488 26. kvě 13.09 rf24_network.cpython-311-arm-linux-gnueabihf.so
-rw-r--r--  1 djmanas djmanas   4697 26. kvě 13.09 rf24_network.pyi
-rw-r--r--  1 djmanas djmanas   9119 26. kvě 13.09 rf24.pyi

So even when I am doing it like this, should I have the environment variable RF24_DRIVER set to SPIDEV? Because I am not seeing it in the printenv output:

SHELL=/bin/bash
LC_MONETARY=cs_CZ.UTF-8
NO_AT_BRIDGE=1
PWD=/home/djmanas/.virtualenv/venv/lib/python3.11/site-packages/pyrf24
LOGNAME=djmanas
XDG_SESSION_TYPE=tty
MOTD_SHOWN=pam
HOME=/home/djmanas
LC_PAPER=cs_CZ.UTF-8
LANG=en_US.UTF-8
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=00:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.avif=01;35:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:*~=00;90:*#=00;90:*.bak=00;90:*.old=00;90:*.orig=00;90:*.part=00;90:*.rej=00;90:*.swp=00;90:*.tmp=00;90:*.dpkg-dist=00;90:*.dpkg-old=00;90:*.ucf-dist=00;90:*.ucf-new=00;90:*.ucf-old=00;90:*.rpmnew=00;90:*.rpmorig=00;90:*.rpmsave=00;90:
VIRTUAL_ENV=/home/djmanas/.virtualenv/venv
XDG_SESSION_CLASS=user
DPKG_DEB_THREADS_MAX=1
TERM=xterm-256color
USER=djmanas
SHLVL=1
LC_MEASUREMENT=cs_CZ.UTF-8
XDG_SESSION_ID=1
VIRTUAL_ENV_PROMPT=(venv) 
XDG_RUNTIME_DIR=/run/user/1000
SSH_CLIENT=192.168.31.20 49174 22
LC_TIME=cs_CZ.UTF-8
PATH=/home/djmanas/.virtualenv/pianoBell/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games
DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus
SSH_TTY=/dev/pts/0
LC_NUMERIC=cs_CZ.UTF-8
TEXTDOMAIN=Linux-PAM
_=/usr/bin/printenv
OLDPWD=/home/djmanas

I guess the reason I wanted to build from source in the first place was that I suspected I might be missing something by going through a venv. I am an IT guy, but still only human, and I make mistakes like everyone does.

Thanks for help and patience, Regards, Petr Sourek

2bndy5 commented 5 months ago

You don't need to set an env var RF24_DRIVER=SPIDEV because the pyrf24 build defaults to SPIDEV if the RF24_DRIVER var is not set.

I'm sorry you went through all that and still got the same error. The only notable differences between your setup and mine are the 32-bit OS and the RPi0 hardware.

Since you're using the 32-bit OS, you're downloading the binary distribution from the piwheels index. I'm not sure if this is the culprit; I'll have to dig out my old RPi0 to reproduce. It looks like all the necessary files are present in the binary dist downloaded from the piwheels index.

You could try building from source (or pip install --no-binary pyrf24 pyrf24), but that will require CMake and the python3 headers to be installed (apt install cmake python3-dev).

2bndy5 commented 5 months ago

ok, I finally got my old RPi0 set up with the latest RPi OS Lite 32-bit...

I was able to reproduce the ImportError: libcpp_rf24.so problem using the binary install from piwheels index.

So, I tried building directly from pyRF24 repo source (pip install -v .) and the package installed fine (took almost 20 minutes). Then, I tried running examples/getting_started.py and it didn't have a problem finding the libcpp_rf24.so.

This looks like a problem with the piwheels binary distribution, so I'll have to investigate their build process in more depth. I suspect the auditwheel tool is compromising the static links between the libcpp_rf24.so and the python bindings in rf24.cpython-3**-*-linux-gnueabi.so. What's troubling me is that our 64-bit binary dists (uploaded to pypi) do not have this problem.

Technically, you could pin to v0.2.5, but that may prove problematic since the GPIO interface has changed (via Linux kernel updates) since RPi5 was released (nRF24/RF24#932).

DJManas commented 5 months ago

Hello, Thanks for your effort!

You don't need to set a env var RF24_DRIVER=SPIDEV because the pyrf24 build will default to SPIDEV if RF24_DRIVER var is not set.

Thank you.

I'm sorry you went through all that and still got the same error. The only notable differences between your setup and mine are the 32-bit OS and the RPi0 hardware.

It's ok; it's better to double-check and go through it, so that if someone else has this problem they can find that it has already been worked on.

So, I tried building directly from pyRF24 repo source (pip install -v .) and the package installed fine (took almost 20 minutes). Then, I tried running examples/getting_started.py and it didn't have a problem finding the libcpp_rf24.so

Ok, I will try it in the evening; I still have the system as it was after installation. I will uninstall the pyrf24 library first, and I will let you know if it helped. Yes, it takes a long time to build, but the Pi Zero is weak. It would be great if the version from piwheels worked, but at the moment I am only building a prototype, so building from source is fine as a workaround.

This looks like a problem with the piwheels binary distribution, so I'll have to investigate their build process in more depth. I suspect the auditwheel tool is compromising the static links between the libcpp_rf24.so and the python bindings in rf24.cpython-3**-*-linux-gnueabi.so.

Can I, somehow as a user, help you with that?

What's troubling me is that our 64-bit binary dists (uploaded to pypi) do not have this problem.

Yep, I can confirm: on a Pi 4 with Armbian everything works fine, but for the final application I was hoping to use something smaller, like the Zero. It seems the Zero 2 is 64-bit, so that board should be fine. I am not sure whether this library can be used on any other 32-bit boards, if any still exist. I went with the RPi because GPIOD is not usable on anything else (or at least a quick Google search suggested that), but the pyrf24 documentation says that if you don't need the IRQ pin, it should work without it (and I can confirm receiving works without it).

So what I am trying to say with this long paragraph: I am wondering whether it is worth repairing the 32-bit builds (if they are still a big part of the market) or simply dropping 32-bit binary support and sticking to 64-bit.

2bndy5 commented 5 months ago

I'm not sure what help I would ask from an end-user. You did everything I'd expect from an end-user: report a problem and define steps to reproduce.

GPIOD should work on any Linux device as long as python3-dev is installed. The distributions uploaded to pypi are only source distributions, so pip install gpiod requires compiling from source (thus the need for python3 headers). We recently switched to gpiod in our examples/interrupt_configure.py since RPi.GPIO (& many other alternative GPIO libs) no longer work on RPi5.
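
For reference, a minimal edge-detection sketch using the gpiod v2.x Python API (this is not our interrupt_configure.py example; the chip path and line offset below are placeholders you would adjust for your board):

```python
from datetime import timedelta

import gpiod
from gpiod.line import Edge

# Watch a single GPIO line for falling edges (e.g. an nRF24's IRQ pin).
with gpiod.request_lines(
    "/dev/gpiochip0",  # placeholder chip
    consumer="nrf24-irq",
    config={24: gpiod.LineSettings(edge_detection=Edge.FALLING)},  # placeholder offset
) as request:
    if request.wait_edge_events(timedelta(seconds=5)):
        for event in request.read_edge_events():
            print("IRQ asserted on line", event.line_offset)
    else:
        print("no edge event within 5 seconds")
```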

It is worth repairing the 32-bit builds. We can't disregard a long-standing use case. We're kinda lucky that this only affects 32-bit RPi OS. I'll update this thread when I can research piwheels' build process. I'm pretty sure they aren't using cibuildwheel to create the 32-bit binary dists; we use cibuildwheel in our CI to create the 64-bit binary dists (with the help of a qemu VM for aarch64 builds).

2bndy5 commented 5 months ago

Found the function in piwheels API that builds the wheels. It is piwheels.slave.builder.build_wheel(). Basically it invokes

pip3 wheel \
<pkg-name>==<pkg-version> \
--wheel-dir=<piwheels-bdist-dir> \
--log=<log-file> \
--no-deps \
--no-cache-dir \
--no-binary=<pkg-name> \
--prefer-binary \
--exists-action=w \
--no-python-version-warning \
--disable-pip-version-check \
--index-url=<piwheels-index-url>

This command is basically what I expected.

However, piwheels does some post-build processing:

1. It opens the built wheel as a zip file.
2. It extracts any .so files whose first 4 bytes match `b'\x7FELF'` and passes each extracted .so file's path to `ldd`.
3. Examining the `ldd` output, if the linker reports a dependency (in the form of a stdout line `<lib-name> => <resolved-path> (0x<address>)`), then the resolved library is checked against third-party lib sources (`apt-cache`, `pip`, and others).
4. It adds the list of third-party sources for the pkg's dependencies to piwheels' internal metadata. This doesn't actually alter any of the wheel's metadata; rather, this internal metadata about dependencies is just output to the build's log file.
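
To make that concrete, here is a rough sketch of that kind of scan (my own simplification, not piwheels' actual code; the wheel filename is a placeholder):

```python
import subprocess
import tempfile
import zipfile
from pathlib import Path

WHEEL = "pyrf24-0.3.0-cp311-cp311-linux_armv7l.whl"  # placeholder

with tempfile.TemporaryDirectory() as tmp, zipfile.ZipFile(WHEEL) as whl:
    for name in whl.namelist():
        if not name.endswith(".so"):
            continue
        extracted = Path(whl.extract(name, tmp))
        with extracted.open("rb") as f:
            if f.read(4) != b"\x7fELF":  # only ELF binaries are of interest
                continue
        ldd = subprocess.run(["ldd", str(extracted)], capture_output=True, text=True)
        for line in ldd.stdout.splitlines():
            if "=>" in line:  # e.g. "libcpp_rf24.so => /some/path/libcpp_rf24.so (0x...)"
                print(name, "->", line.strip())
```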
Here, we can see some metadata saved in the piwheels JSON artifacts (used for external tooling like shields.io):

```json
"releases": {
  "0.3.0": {
    "released": "2024-05-06 09:38:50",
    "prerelease": false,
    "yanked": false,
    "skip_reason": "",
    "files": {
      "pyrf24-0.3.0-cp311-cp311-linux_armv6l.whl": { "filehash": "15df45f8eb6d5dc47310c286827b273ca3b5cf06176701d09780671daa95d880", "filesize": 370886, "builder_abi": "cp311", "file_abi_tag": "cp311", "platform": "linux_armv6l", "requires_python": ">=3.7", "apt_dependencies": [] },
      "pyrf24-0.3.0-cp311-cp311-linux_armv7l.whl": { "filehash": "15df45f8eb6d5dc47310c286827b273ca3b5cf06176701d09780671daa95d880", "filesize": 370886, "builder_abi": "cp311", "file_abi_tag": "cp311", "platform": "linux_armv7l", "requires_python": ">=3.7", "apt_dependencies": [] },
      "pyrf24-0.3.0-cp37-cp37m-linux_armv6l.whl": { "filehash": "37500790088baa3aac3c1a539449c734f93b2afd762770d4e185e00c7c6b3a67", "filesize": 335553, "builder_abi": "cp37m", "file_abi_tag": "cp37m", "platform": "linux_armv6l", "requires_python": ">=3.7", "apt_dependencies": [] },
      "pyrf24-0.3.0-cp37-cp37m-linux_armv7l.whl": { "filehash": "37500790088baa3aac3c1a539449c734f93b2afd762770d4e185e00c7c6b3a67", "filesize": 335553, "builder_abi": "cp37m", "file_abi_tag": "cp37m", "platform": "linux_armv7l", "requires_python": ">=3.7", "apt_dependencies": [] },
      "pyrf24-0.3.0-cp39-cp39-linux_armv6l.whl": { "filehash": "536676f962ce65a7d77dbdf92ed58951d197f09d6caebd21f10afafc7b2a34d3", "filesize": 360552, "builder_abi": "cp39", "file_abi_tag": "cp39", "platform": "linux_armv6l", "requires_python": ">=3.7", "apt_dependencies": [] },
      "pyrf24-0.3.0-cp39-cp39-linux_armv7l.whl": { "filehash": "536676f962ce65a7d77dbdf92ed58951d197f09d6caebd21f10afafc7b2a34d3", "filesize": 360552, "builder_abi": "cp39", "file_abi_tag": "cp39", "platform": "linux_armv7l", "requires_python": ">=3.7", "apt_dependencies": [] }
    }
  },
```

I'm starting to suspect this problem is specific to the RPi0. I'll have to try setting up a RPi3 with a 32-bit OS to try and reproduce. According to piwheels FAQ, the wheels are built using a network of RPi3 and RPi4 machines.

I have found that the build env (on piwheels machines) includes a variable PIWHEELS_BUILD=1 to help pkg maintainers identify when special measures are needed for piwheels builds.

2bndy5 commented 5 months ago

I'm starting to suspect this problem is specific to the RPi0. I'll have to try setting up a RPi3 with a 32-bit OS to try and reproduce.

It's not specific to the RPi0. I was able to reproduce it on a RPi3 using RPi OS 32-bit. It seems that building from source is the only current solution on a 32-bit OS. Luckily, it only took about 5 minutes on my RPi3. At least I don't have to troubleshoot from a slow-running RPi0 😥

DJManas commented 5 months ago

I'm starting to suspect this problem is specific to the RPi0. I'll have to try setting up a RPi3 with a 32-bit OS to try and reproduce.

It's not specific to the RPi0. I was able to reproduce it on a RPi3 using RPi OS 32-bit. It seems that building from source is the only current solution on a 32-bit OS. Luckily, it only took about 5 minutes on my RPi3. At least I don't have to troubleshoot from a slow-running RPi0 😥

Oh, I had just done another test of my own, but I see you were quicker. I was curious, and when I got the email suggesting it was RPi0-specific, I remembered that I have an old Orange Pi Zero here doing Pi-hole and RSS-aggregator duty, so I created a venv there (Armbian).

Quirm:~:% cat /etc/os-release
PRETTY_NAME="Armbian 23.8.1 bullseye"
NAME="Debian GNU/Linux"
VERSION_ID="11"
VERSION="11 (bullseye)"
VERSION_CODENAME=bullseye
ID=debian
HOME_URL="https://www.armbian.com"
SUPPORT_URL="https://forum.armbian.com"
BUG_REPORT_URL="https://www.armbian.com/bugs"
ARMBIAN_PRETTY_NAME="Armbian 23.8.1 bullseye"

python3 --version = Python 3.9.2

Quirm:~:% python3 -m venv .virtualenv/pyrf24    
Quirm:~:% source .virtualenv/pyrf24/bin/activate
(pyrf24) Quirm:~:% pip install pyrf24
Collecting pyrf24
  Downloading pyrf24-0.3.0.tar.gz (436 kB)
     |████████████████████████████████| 436 kB 2.2 MB/s 
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
Building wheels for collected packages: pyrf24
  Building wheel for pyrf24 (PEP 517) ... |
done
  Created wheel for pyrf24: filename=pyrf24-0.3.0-cp39-cp39-linux_armv7l.whl size=338966 sha256=2cc3b8050071e7bc6ccc0892a6b15496e49cd9d47b6a7e4b7a8ebdb5de107131
  Stored in directory: /home/djmanas/.cache/pip/wheels/a4/a5/1f/2c08191a34a367f22aa27267da6ec8f51bba4d79995b51d522
Successfully built pyrf24
Installing collected packages: pyrf24
Successfully installed pyrf24-0.3.0
(pyrf24) Quirm:~:% cd git
(pyrf24) Quirm:git:% git clone https://github.com/nRF24/pyRF24.git         
Cloning into 'pyRF24'...
remote: Enumerating objects: 2476, done.
remote: Counting objects: 100% (872/872), done.
remote: Compressing objects: 100% (472/472), done.
remote: Total 2476 (delta 474), reused 650 (delta 393), pack-reused 1604
Receiving objects: 100% (2476/2476), 20.34 MiB | 1.74 MiB/s, done.
Resolving deltas: 100% (1471/1471), done.
(pyrf24) Quirm:~:% cd git/pyRF24/examples 
(pyrf24) Quirm:examples:% python3 getting_started.py                                                                                                                                                                                   <main>
Traceback (most recent call last):
  File "/home/djmanas/git/pyRF24/examples/getting_started.py", line 9, in <module>
    from pyrf24 import RF24, RF24_PA_LOW, RF24_DRIVER
  File "/home/djmanas/.virtualenv/pyrf24/lib/python3.9/site-packages/pyrf24/__init__.py", line 1, in <module>
    from .rf24 import (
ImportError: libcpp_rf24.so: cannot open shared object file: No such file or directory

Same result :-(

2bndy5 commented 5 months ago

Yeah, it's gotta be a problem with the piwheels bdist. I built a wheel locally on my RPi3 (32-bit) and downloaded the wheel from the piwheels index. At a quick glance, the wheels' sizes differed by about 100 bytes, while all the included built .so files were exactly the same size. I'm trying to research as much as possible before raising this issue upstream in the piwheels repo.
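
For anyone who wants to repeat that comparison, a rough sketch (the wheel filenames below are placeholders):

```python
import zipfile

LOCAL = "pyrf24-0.3.0-cp311-cp311-linux_armv7l.whl"              # locally built (placeholder)
PIWHEELS = "piwheels/pyrf24-0.3.0-cp311-cp311-linux_armv7l.whl"  # downloaded (placeholder)

def sizes(path):
    """Map each member of the wheel to its uncompressed size."""
    with zipfile.ZipFile(path) as whl:
        return {info.filename: info.file_size for info in whl.infolist()}

local, remote = sizes(LOCAL), sizes(PIWHEELS)
for name in sorted(set(local) | set(remote)):
    if local.get(name) != remote.get(name):
        print(f"{name}: local={local.get(name)} piwheels={remote.get(name)}")
```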


PS - I've been having a lot of trouble flashing the SD cards for this... The RPi OS Lite 32-bit image initializes on first boot more reliably when I prevent the RPi Imager app from customizing the OS (with network/user info).

DJManas commented 5 months ago

Yeah, it's gotta be a problem with the piwheels bdist. I built a wheel locally on my RPi3 (32-bit) and downloaded the wheel from the piwheels index. At a quick glance, the wheels' sizes differed by about 100 bytes, while all the included built .so files were exactly the same size. I'm trying to research as much as possible before raising this issue upstream in the piwheels repo.

Great, thanks.

PS - I've been having a lot of trouble flashing the SD cards for this... The RPi OS Lite 32-bit image initializes on first boot more reliably when I prevent the RPi Imager app from customizing the OS (with network/user info).

Yeah, I returned to it after some years because I thought it would be the better choice for an RPi, but whenever I buy a non-RPi board I always go with Armbian. Just flash it, connect it to the network, and it automatically starts SSH (user root, password 1234) and walks you right through the initial setup (user, password, etc.). And I found out that Armbian is even available for RPi and comes with raspi-config preinstalled, but I only discovered that on Sunday, and it seems it isn't available for the Zero.

2bndy5 commented 5 months ago

Progress! I've narrowed it down to the sdist that we uploaded to PyPI. Piwheels builds their bdist from the sdist on PyPI. Locally, if I install the package from an sdist, then I get ImportError: libcpp_rf24.so. Still investigating, but this means users who install from source using the sdist on PyPI will hit this error.

# this will download the sdist and build the pkg from it
pip install --no-binary pyrf24 pyrf24

2bndy5 commented 5 months ago

I'm also seeing an unexpected directory included in the 64-bit bdist named "pyrf24.libs", which contains libcpp_rf24*-<hash>.so files.

It looks like the auditwheel tool (which is used in our CI by cibuildwheel) is detecting the libcpp_rf24*.so binaries as external libs. Then auditwheel repair finds their path (located within the package's dist), copies/alters them into the pyrf24.libs folder, and replaces them with links to the copied/altered binary files (see src code here). This all has something to do with the binaries' RPATH (or its newer form, RUNPATH) attribute.
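
One way to see what ended up in those attributes is to dump the extension module's dynamic section (a small sketch; the filename is just an example from the 32-bit build):

```python
import subprocess

SO_FILE = "rf24.cpython-311-arm-linux-gnueabihf.so"  # example filename

# readelf -d prints the dynamic section, including NEEDED, RPATH, and RUNPATH entries.
proc = subprocess.run(["readelf", "-d", SO_FILE], capture_output=True, text=True, check=True)
for line in proc.stdout.splitlines():
    if any(tag in line for tag in ("NEEDED", "RPATH", "RUNPATH")):
        print(line.strip())
```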

FYI, we can't just invoke the auditwheel tool from a piwheels build because auditwheel was designed for wheels using the manylinux platform (a pypi-specific platform/tag that allows distributing bdists to compatible Linux systems). Piwheels uses the system's native platform tags (armv6l and armv7l), which are unsupported by the manylinux project (see this discussion, which has gone stale since most projects can rely on the piwheels project).
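
A quick way to see which native platform tag your interpreter reports (and hence roughly what piwheels tags its wheels with):

```python
import sysconfig

# On 32-bit RPi OS this typically prints something like "linux-armv7l"
# (or "linux-armv6l" on a Pi Zero).
print(sysconfig.get_platform())
```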

2bndy5 commented 5 months ago

Ugh, it won't be as simple as installing auditwheel in the build env and using its API to alter our 32-bit builds. Apparently auditwheel requires a non-python-based tool called patchelf. This isn't much of a problem for them since auditwheel is meant to be used in a manylinux docker container (which has patchelf installed). Specifying non-python-based build dependencies in a python project's build requirements is not supported (that I'm aware of).

2bndy5 commented 5 months ago

Also, I think that building from source (32-bit and 64-bit) currently only works if you don't delete the build folder:

ldd ~/venv/lib/python3.11/site-packages/pyrf24/rf24.cpython-311-arm-linux-gnueabihf.so
        linux-vdso.so.1 (0x7ef8d000)
        /usr/lib/arm-linux-gnueabihf/libarmmem-${PLATFORM}.so => /usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so (0x76eb0000)
        libcpp_rf24.so => /home/brendan/repos/pyRF24/build/lib.linux-armv7l-cpython-311/pyrf24/libcpp_rf24.so (0x76ea3000)
        libstdc++.so.6 => /lib/arm-linux-gnueabihf/libstdc++.so.6 (0x76cd0000)
        libgcc_s.so.1 => /lib/arm-linux-gnueabihf/libgcc_s.so.1 (0x76c90000)
        libc.so.6 => /lib/arm-linux-gnueabihf/libc.so.6 (0x76b17000)
        libm.so.6 => /lib/arm-linux-gnueabihf/libm.so.6 (0x76ad0000)
        /lib/ld-linux-armhf.so.3 (0x76f21000)

2bndy5 commented 5 months ago

I think I'm going to have to re-organize the bindings so they (RF24, RF24Network, and RF24Mesh) compile into 1 binary .so file...

DJManas commented 5 months ago

Thanks for the effort and info.

2bndy5 commented 5 months ago

Ok, I think I fixed this problem in the one-bin-4-all branch. Locally, I was able to build a bdist from an sdist (just like piwheels does) and execute the examples without errors. 💯

@DJManas It would be nice if you could also test this one-bin-4-all branch, but it isn't necessary. I have tested it on my RPi4 (64-bit) and RPi3 (32-bit), so I fully expect reproducible results. It might be somewhat quicker to compile on the RPi0 because the linker is now only invoked once.

Technical breaking change in API structure

The consolidation of built binaries (into a single pyrf24.so file) does imply a breaking change in the API. Previously, users could import directly from the binaries:

from pyrf24.rf24 import RF24
from pyrf24.rf24_network import RF24Network, RF24NetworkHeader
from pyrf24.rf24_mesh import RF24Mesh

But now importing directly from the unified binary would be done like so:

from pyrf24.pyrf24 import RF24
from pyrf24.pyrf24 import RF24Network, RF24NetworkHeader
from pyrf24.pyrf24 import RF24Mesh

[!IMPORTANT] Importing anything directly from the binaries was never really needed, as the whole API is (& always has been) re-exported via pyrf24/__init__.py. So, the examples (& the migration table in the README) still work the same. Really, just the docs and typing stubs had to be changed.

This approach is almost identical to how the PyO3 project exposes python bindings from a rust-compiled binary.
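
For clarity, the top-level imports that the examples already use keep working unchanged before and after this branch:

```python
# Recommended usage: import everything from the package's top level.
# This is unaffected by consolidating the bindings into one binary.
from pyrf24 import RF24, RF24Network, RF24NetworkHeader, RF24Mesh
```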

2bndy5 commented 5 months ago

I just published a test release to testPyPI: v0.3.0.post1.dev2. Installing that version on a 32-bit system should be the same as what piwheels does to build the 32-bit bdists.

DJManas commented 5 months ago

Great, thanks. So I can test 0.3.0.post1.dev2?

DJManas commented 5 months ago

Just did a quick test on my OPi Zero, which also produced the error. Logged in, switched to the venv: source .virtualenv/pyrf24/bin/activate

Uninstalled pyrf24 library: pip uninstall pyrf24

Upgraded pip and setuptools: pip install --upgrade pip setuptools

And tried to install from that page using: pip install -i https://test.pypi.org/simple/ pyrf24==0.3.0.post1.dev2

It produced this error:

Looking in indexes: https://test.pypi.org/simple/
Collecting pyrf24==0.3.0.post1.dev2
  Downloading https://test-files.pythonhosted.org/packages/47/dd/89fb50f618f6bd5d9c67e4fd9f34de40c4e00dba056f8afca8793bcee88b/pyrf24-0.3.0.post1.dev2.tar.gz (436 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 436.9/436.9 kB 113.2 kB/s eta 0:00:00
  Installing build dependencies ... error
  error: subprocess-exited-with-error

  × pip subprocess to install build dependencies did not run successfully.
  │ exit code: 1
  ╰─> [3 lines of output]
      Looking in indexes: https://test.pypi.org/simple/
      ERROR: Could not find a version that satisfies the requirement setuptools>=61 (from versions: none)
      ERROR: No matching distribution found for setuptools>=61
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

The strange thing is that it complains about missing setuptools>=61, but I am using 70.0.0 (pip list):

Package       Version
------------- -------
pip           24.0
pkg_resources 0.0.0
setuptools    70.0.0

What am I doing wrong? I will try RPi zero tomorrow.

2bndy5 commented 5 months ago

That is odd as hell. I have no idea how/why that setuptools problem is occurring. Maybe you need to deactivate and then re-activate the venv?

Or maybe ensure that the correct installation of pip is used.

python -m pip install -i https://test.pypi.org/simple/ pyrf24==0.3.0.post1.dev2

It may also not be able to find the setuptools package using the testPyPI (-i option), but having a compatible version already installed should mitigate that problem... Unless the setuptools v70 is somehow not actually installed in the env that pip is using.

2bndy5 commented 5 months ago

Looking at setuptools src, I see that

I'm not sure what python version you tried using in your venv (it wasn't mentioned in your post about OpiZero). For my RPi3 32-bit, I'm using python v3.11, and it worked fine when installing from the linked testPyPI release.

Alternatively, you could just git clone the pyRF24 repo (with submodules) and git checkout one-bin-4-all. Then build from source in a similar way to how piwheels does:

python -m pip install build 
# make a sdist
python -m build -s
# compile and install from the sdist
python -m pip install --force-reinstall -v dist/pyrf24*.tar.gz

DJManas commented 5 months ago

I connected the original RPi Zero and used the command to install the dev version of pyRF24. It failed because cmake was missing, so I installed the cmake package, and then it installed successfully. I'm not sure what the problem on the OPi is, but it might be the old Linux version or something like that (I tried creating a new venv, same result; the Python version was 3.9). But as I wrote, on the RPi Zero it is working:

Looking in indexes: https://test.pypi.org/simple/, https://www.piwheels.org/simple
Collecting pyrf24==0.3.0.post1.dev2
  Using cached https://test-files.pythonhosted.org/packages/47/dd/89fb50f618f6bd5d9c67e4fd9f34de40c4e00dba056f8afca8793bcee88b/pyrf24-0.3.0.post1.dev2.tar.gz (436 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: pyrf24
  Building wheel for pyrf24 (pyproject.toml) ... done
  Created wheel for pyrf24: filename=pyrf24-0.3.0.post1.dev2-cp311-cp311-linux_armv6l.whl size=207446 sha256=6ef500ca1e99478346a8e396bf8df05c5b3620929eed808968ff173a9efb241a
  Stored in directory: /home/djmanas/.cache/pip/wheels/d6/6d/2b/bdfa12268c972c8e70efb7edb1194c1df021daa5a69bae0089
Successfully built pyrf24
Installing collected packages: pyrf24
Successfully installed pyrf24-0.3.0.post1.dev2

I mean the installation works; since it's after midnight, I will connect the nRF24L01 module later today and try whether the example scripts work.

Thanks!

2bndy5 commented 5 months ago

Ah, yes. To build from source, cmake and the python headers (python3-dev pkg) need to be installed. This is noted in the README as well:

sudo apt install cmake python3-dev

DJManas commented 5 months ago

I connected the nRF24L01 to the RPi Zero, and I can confirm it's working. Thank you!

2bndy5 commented 5 months ago

Great! I'll merge the changes to main. There are some other changes I'd like to merge in with the submodules (mostly about debug output toggles), and then I'll release v0.4.0.

2bndy5 commented 5 months ago

I just published v0.4.0 to PyPI. It should take a day or two for piwheels to build an armv6l bdist for v0.4.0 (see piwheels/pyrf24 status).

I yanked the v0.3.0 release from PyPI (which should also eventually yank the piwheels builds for v0.3.0). This should prevent the erroneous v0.3.0 release from being used downstream in user projects.

@DJManas Thanks again for reporting this and providing feedback about testing and reproducing! You are awesome!

DJManas commented 5 months ago

@2bndy5 Thanks for your effort! I appreciate it.