NixOS / nixpkgs

Nix Packages collection & NixOS
MIT License

libedgetpu: package and provide nixos module #188719

Open colemickens opened 2 years ago

colemickens commented 2 years ago

Project description

It would be nice to be able to use the Google Coral Edge TPU Usb device with NixOS.

I think it's mostly a matter of building libedgetpu (and maybe tensorflow?) and then bundling the udev rules with it?


ctittel commented 2 years ago

I'm interested in this as well, but have never packaged something for NixOS. Maybe one approach could be to wrap the official .deb from Google and patch it with patchelf? Here are the official installation instructions: https://coral.ai/docs/accelerator/get-started
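A hedged sketch of that .deb-repackaging idea, using nixpkgs' `autoPatchelfHook` to fix up the prebuilt library (untested; the URL, hash, version, and the paths inside the .deb are placeholders — take the real values from Coral's apt repository):

```nix
# Sketch only: repackage Google's prebuilt libedgetpu1-std .deb instead of
# building from source. URL/hash/version and in-archive paths are
# placeholders to be replaced with the real values.
{ stdenv, fetchurl, dpkg, autoPatchelfHook, libusb1 }:

stdenv.mkDerivation {
  pname = "libedgetpu-bin";
  version = "16.0"; # hypothetical; match the .deb you actually fetch

  src = fetchurl {
    url = "https://example.invalid/libedgetpu1-std_16.0_amd64.deb"; # placeholder
    hash = ""; # fill in (nix will print the expected hash on first build)
  };

  nativeBuildInputs = [ dpkg autoPatchelfHook ];
  # autoPatchelfHook resolves the library's DT_NEEDED entries against these:
  buildInputs = [ libusb1 stdenv.cc.cc.lib ];

  unpackPhase = "dpkg-deb -x $src .";

  installPhase = ''
    mkdir -p $out/lib $out/lib/udev/rules.d
    # Verify these paths against the .deb contents; they are assumptions.
    cp usr/lib/x86_64-linux-gnu/libedgetpu.so.1.0 $out/lib
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
    cp lib/udev/rules.d/*.rules $out/lib/udev/rules.d/ || true
  '';
}
```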

ctittel commented 2 years ago

There is also the edgetpu-compiler, which is already wrapped: https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/libraries/science/robotics/edgetpu-compiler/default.nix

Maybe wrapping libedgetpu would be pretty similar (unpacking the deb and copying the files to the correct output location), plus the udev stuff

yorickvP commented 2 years ago

Here's a start:

default.nix:

{ stdenv, fetchFromGitHub, xxd, tensorflow-lite, libusb1, abseil-cpp, flatbuffers }:
stdenv.mkDerivation rec {
  pname = "libedgetpu";
  version = "grouper";
  src = fetchFromGitHub {
    owner = "google-coral";
    repo = pname;
    rev = "release-${version}";
    sha256 = "sha256-73hwItimf88Iqnb40lk4ul/PzmCNIfdt6Afi+xjNiBE=";
  };
  patches = [ ./libedgetpu-stddef.diff ];
  makeFlags = ["-f" "makefile_build/Makefile" "libedgetpu" ];
  buildInputs = [ libusb1 abseil-cpp flatbuffers ];
  nativeBuildInputs = [ xxd ];
  TFROOT = "${tensorflow-lite.src}";
  # todo: missing dependency
  #    driver/beagle/beagle_kernel_top_level_handler.cc
  # -> driver/beagle/beagle_kernel_top_level_handler.h
  # -> fatal error: api/driver_options_generated.h: No such file or directory
  enableParallelBuilding = false;
  installPhase = ''
    mkdir -p $out/lib
    cp out/direct/k8/libedgetpu.so.1.0 $out/lib
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
  '';
}

libedgetpu-stddef.diff:

diff --git a/api/allocated_buffer.h b/api/allocated_buffer.h
index 97740f0..7bc0547 100644
--- a/api/allocated_buffer.h
+++ b/api/allocated_buffer.h
@@ -16,6 +16,7 @@
 #define DARWINN_API_ALLOCATED_BUFFER_H_

 #include <functional>
+#include <cstddef>

 namespace platforms {
 namespace darwinn {
naggie commented 1 year ago

@yorickvP were you able to get anywhere with that derivation? Thanks for making a start!

yorickvP commented 1 year ago

It works with the following udev rules:

{
  services.udev.extraRules = ''
    SUBSYSTEM=="usb",ATTRS{idVendor}=="1a6e",GROUP="dialout"
    SUBSYSTEM=="usb",ATTRS{idVendor}=="18d1",GROUP="dialout"
  '';
}

However, it's not nicely packaged up yet. It should be possible to add something like

passthru.rules = writeTextDir "lib/udev/rules.d/edgetpu.rules" ''
...
''

to the derivation and then add services.udev.packages = [ pkgs.libedgetpu.rules ];
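Spelled out, that suggestion might look like this inside the derivation from the earlier comment (a sketch, assuming `writeTextDir` is added to the file's function arguments; the rules file name is arbitrary):

```nix
# Sketch: expose the udev rules as a separate passthru package so a NixOS
# config can reference pkgs.libedgetpu.rules. Assumes `writeTextDir` is in
# the derivation's argument list.
passthru.rules = writeTextDir "lib/udev/rules.d/99-edgetpu.rules" ''
  SUBSYSTEM=="usb", ATTRS{idVendor}=="1a6e", GROUP="dialout"
  SUBSYSTEM=="usb", ATTRS{idVendor}=="18d1", GROUP="dialout"
'';

# Then, in the NixOS configuration:
# services.udev.packages = [ pkgs.libedgetpu.rules ];
```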

naggie commented 1 year ago

It seems those rules allow anyone in the dialout group to access the device, but I think it won't matter for my use case (frigate with docker). Useful to know though, thanks.

I had some success after figuring out how to use the derivation as a package, directly in the system configuration.nix -- I added (pkgs.callPackage ./edgetpu/default.nix { }) to environment.systemPackages.

This triggers a build of the edgetpu driver, but it fails after some time compiling. I think this might be something to do with newer, incompatible or broken nix packages from the 22.11 channel.

I guess it's probably flatbuffers itself being broken on nix. I'll keep looking unless you've seen this too?

yorickvP commented 1 year ago

Looks like we dropped this package internally because Edge TPUs are out of stock anyway, and it depends on an out-of-date flatbuffers.

cc @Lucus16

colemickens commented 1 year ago

Is "Edge TPU" headed towards abandonment? Did we close this because its future is dead, no one plans to work on it, etc.?

Just curious, I keep carrying this Coral around with me wondering if I should sell it or keep ahold of it...

yorickvP commented 1 year ago

@cpcloud might have more information. Personally, I would sell the Coral while it still has any value.

naggie commented 1 year ago

I think closing this is very presumptuous. Could we please leave it open? There are lots of people with Corals (for frigate) and there are no indications that Google will stop making them, just supply issues.

naggie commented 1 year ago

Thanks @yorickvP

graham33 commented 1 year ago

I've recently been able to buy one, and it doesn't seem to be EOL yet (they have a policy about it here: https://coral.ai/products/accelerator#:~:text=our%20technology.-,Product%20lifecycle,-Product%20line%20enhancements).

mweinelt commented 1 year ago

There are lots of people with Corals (for frigate) and there are no indications that Google will stop making them, just supply issues.

Frigate 0.12.0 does support Intel GNA via openvino. But I would surely be interested in someone packaging libedgetpu and pycoral, so we can support that as well. @cpcloud offered to check up on them, but I have yet to hear back from him.

The frigate PR at #214428 has been coming along nicely, and we're finishing up the tests currently.

graham33 commented 1 year ago

I'm looking at libedgetpu at the moment, as a background task. Currently looking at upgrading tensorflow-lite.

mweinelt commented 1 year ago

See https://github.com/NixOS/nixpkgs/pull/217599 for my last try to get it to build again.

diode-ee commented 1 year ago

Just came across this and wanted to note that I recently was able to get a few coral TPUs as well. It would be great to get this packaged as it seems there is still a lot of value having it included.

Alan01252 commented 1 year ago

I had real trouble getting this working (I'm a Nix noob), but this flake appears to work: I was able to run yolov8 images using the ultralytics Python package, so I thought I'd post it here in case someone can fix, improve, or find it useful. I'm sure I'm duplicating args that I shouldn't, but I'm posting this "as-is", warts and all, as it's what's working.

{
  description = "A flake for libedgetpu";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  inputs.poetry2nix = {
    url = "github:nix-community/poetry2nix";
    inputs.nixpkgs.follows = "nixpkgs";
  };

  outputs = { self, nixpkgs, poetry2nix }:

    let
      overlay1 = final: prev: {
        flatbuffers = prev.flatbuffers.overrideAttrs (oldAttrs: {
          version = "1.12.0";
          NIX_CFLAGS_COMPILE = "-Wno-error=class-memaccess -Wno-error=maybe-uninitialized";
          cmakeFlags = (oldAttrs.cmakeFlags or []) ++ ["-DFLATBUFFERS_BUILD_SHAREDLIB=ON"];
          NIX_CXXSTDLIB_COMPILE = "-std=c++17";
          configureFlags = (oldAttrs.configureFlags or []) ++ ["--enable-shared"];
          src = final.fetchFromGitHub {
            owner = "google";
            repo = "flatbuffers";
            rev = "v1.12.0";
            sha256 = "sha256-L1B5Y/c897Jg9fGwT2J3+vaXsZ+lfXnskp8Gto1p/Tg=";
          };
        });
      };

      overlay2 = final: prev: {
        abseil-cpp = prev.abseil-cpp.overrideAttrs (oldAttrs: {
          NIX_CXXSTDLIB_COMPILE = "-std=c++17";
          cmakeFlags = [
            "-DBUILD_SHARED_LIBS=ON"
            "-DBUILD_TESTING=OFF"
          ];
        });
      };

      systems = [ "x86_64-linux" ];
      forAllSystems = nixpkgs.lib.genAttrs systems;

      pkgs = import nixpkgs {
        system = "x86_64-linux";
        overlays = [ overlay1 overlay2 ];
      };

    in {

    packages.x86_64-linux.libedgetpu = with pkgs; stdenv.mkDerivation rec {
      pname = "libedgetpu";
      version = "grouper";

      src = fetchFromGitHub {
        owner = "google-coral";
        repo = pname;
        rev = "release-${version}";
        sha256 = "sha256-73hwItimf88Iqnb40lk4ul/PzmCNIfdt6Afi+xjNiBE=";
      };

      patches = [ ./libedgetpu-stddef.diff ];

      makeFlags = ["-f" "makefile_build/Makefile" "libedgetpu" ];

      buildInputs = [ libusb1 pkgs.abseil-cpp pkgs.flatbuffers ];

      nativeBuildInputs = [ xxd ];

      NIX_CXXSTDLIB_COMPILE = "-std=c++17";

      TFROOT = "${fetchFromGitHub {
        owner = "tensorflow";
        repo = "tensorflow";
        rev = "v2.7.0";  # replace with the version you need
        sha256 = "sha256-n7jRDPeXsyq4pEWSWmOCas4c8VsArIKlCuwvSU/Ro/c=";  # replace with the actual SHA256
      }}";

      enableParallelBuilding = false;

      installPhase = ''
        mkdir -p $out/lib
        cp out/direct/k8/libedgetpu.so.1.0 $out/lib
        ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
      '';
    };

    defaultPackage.x86_64-linux = self.packages.x86_64-linux.libedgetpu;

    devShells = forAllSystems (system:
        let
          pkgs = import nixpkgs {
            inherit system;
            overlays = [ poetry2nix.overlay ];
          };

          poetryEnv = pkgs.poetry2nix.mkPoetryEnv {
            projectDir = ./.;
            python = pkgs.python310;
          };
        in
        {
          default = pkgs.mkShell {
            buildInputs = with pkgs; [
              poetry
              gcc
              stdenv.cc.cc.lib
            ];
          };
        });

  };
}

pyproject.toml

[tool.poetry]
name = "coral"
version = "0.1.0"
description = ""
readme = "README.md"

[tool.poetry.dependencies]
python = ">=3.10,<3.12"
ultralytics = "^8.0.147"
numpy = "^1.25.2"
nvidia-cuda-runtime-cu11 = "11.7.99"
nvidia-cuda-cupti-cu11 = "11.7.101"
nvidia-cudnn-cu11 = "8.5.0.96"
nvidia-cublas-cu11 = "11.10.3.66"
nvidia-cufft-cu11 = "10.9.0.58"
nvidia-curand-cu11 = "10.2.10.91"
nvidia-cusolver-cu11 = "11.4.0.1"
nvidia-cusparse-cu11 = "11.7.4.91"
nvidia-nccl-cu11 = "=2.14.3"
nvidia-nvtx-cu11 = "11.7.91"
triton = "2.0.0"
opt-einsum = "3.3"
tensorflow = "v2.8.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

For running on the USB device I had to export the nano model from yolo like this

yolo export model=yolov8n.pt format=edgetpu optimize int8 imgsz=608,352

The 640 models would not load and would crash with something like

Deadline exceeded: USB transfer error 2 [LibUsbDataOutCallback]

For reasons I don't understand I also have to manually set the LD_LIBRARY_PATH for the gcc/glib. I'm sure there's something stupid I'm missing in the flake here:

LD_LIBRARY_PATH=/nix/store/2w4k8nvdyiggz717ygbbxchpnxrqc6y9-gcc-12.2.0-lib/lib/:/nix/store/n2y7xr21b4vvsap4zyxqvcqid5zr080f-libglvnd-1.5.0/lib/:/nix/store/phl7g6rn4qhl2pkm76167c5k6kp3jyzs-glib-2.74.0/lib/:./result/lib/:/nix/store/6549jyzmdk7mqv3wvrqbqjr02zjp9csw-zlib-1.2.13/lib/:/nix/store/a1kjcwxadb1z0dky08a2brqjh4n49yla-cudatoolkit-11.8.0/lib/ poetry run yolo predict task=detect model=yolov8n_saved_model/yolov8n_full_integer_quant_edgetpu.tflite imgsz=608,352 source=Bullet-Right_2023-08-02T00-05-08.196Z.jpeg
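One way to avoid hardcoding those store hashes (a sketch against the flake above, untested; the package selection is a guess at what the hand-written path covered) is to let Nix compute the path with `lib.makeLibraryPath` in the devShell:

```nix
# Sketch: compute LD_LIBRARY_PATH from the actual build inputs instead of
# hardcoding /nix/store hashes. Intended to replace the devShell in the
# flake above; the list of libraries is an assumption based on the
# hand-written path (gcc libs, libglvnd, glib, zlib, cudatoolkit).
default = pkgs.mkShell {
  buildInputs = with pkgs; [ poetry gcc ];
  LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath (with pkgs; [
    stdenv.cc.cc.lib # libstdc++, replaces the gcc-12 lib path
    libglvnd
    glib
    zlib
    cudatoolkit
  ]);
};
```

The `./result/lib` component (the built libedgetpu) would still need to be appended, e.g. by adding the flake's own libedgetpu package to the list.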

naggie commented 1 year ago

Thanks @Alan01252 that's progress!

jhvst commented 11 months ago

FWIW, I started working on a module over at https://github.com/jhvst/nix-config/blob/main/nixosModules/frigate/flake.nix

It does not currently work (missing udev rules etc.), but at least it loads the libraries properly now. It's a container, so it is supposed to be run as described here: https://nixos.wiki/wiki/NixOS_Containers and https://www.tweag.io/blog/2020-07-31-nixos-flakes/

@naggie have you got Frigate to work with the libedgetpu flake?

naggie commented 11 months ago

I did not -- I've been using Ubuntu on my CCTV VM as a workaround for now. Thanks for starting on a module for Frigate! That's great, I'll switch to it some time if it works out.

graham33 commented 11 months ago

I had another go at this, and I've managed to get my USB Coral TPU working with frigate and libedgetpu!

I've packaged libedgetpu in my NUR, based on the flake above: https://github.com/graham33/nur-packages/blob/master/pkgs/libedgetpu/default.nix. The only things I've really changed are bumping the version of Tensorflow slightly (just to 2.7.4; I'm not sure how much later a version libedgetpu would be compatible with) and adding the udev rules in the installation.

Here's an abridged version of my NixOS config:

let
  inherit (pkgs.nur.repos.graham33) libedgetpu;
in {
  ...
  services.frigate = {
    enable = true;
    ...
    settings = {
      detectors = {
        coral = {
          type = "edgetpu";
          device = "usb";
        };
      };
    };
  };

  systemd.services.frigate.environment.LD_LIBRARY_PATH = "${libedgetpu}/lib";
  systemd.services.frigate.serviceConfig = {
    SupplementaryGroups = "plugdev";
  };

  services.udev.packages = [ libedgetpu ];
  users.groups.plugdev = {};
}
Then I get the magic lines in the frigate logs:

[2023-10-22 19:42:01] frigate.detectors.plugins.edgetpu_tfl INFO    : Attempting to load TPU as usb
...
[2023-10-22 19:42:04] frigate.detectors.plugins.edgetpu_tfl INFO    : TPU found
naggie commented 11 months ago

Wow excellent well done and thanks!

pdelbarba commented 10 months ago

@graham33 How did you deal with the gasket dependency? AFAICT this just builds the library, but the gasket kernel driver is still needed. There's a gasket package in nixpkgs, but it seems to be having severe kernel compatibility issues.

graham33 commented 10 months ago

@graham33 How did you deal with the gasket dependency? AFAICT this just builds the library but the gasket kernel driver is still needed. There's a gasket package nixpkgs but it seems to be having severe kernel compatibility issues.

I didn't hit this at all. I'm using the USB Coral, is it possible this driver is only needed for the PCIe one (I don't know much about it)?

colino17 commented 10 months ago

@graham33 How did you deal with the gasket dependency? AFAICT this just builds the library but the gasket kernel driver is still needed. There's a gasket package nixpkgs but it seems to be having severe kernel compatibility issues.

I seem to have it working with a mini-PCIe Coral, using package tweaks from this issue as well as a separate Github issue for gasket.

My config is here: Coral Config, Gasket Package, LibEdgeTPU Package.

Hope that helps.

pdelbarba commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

sneakrz commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

How does one set gcc12Stdenv? I'm struggling to figure this out. Any help would be super appreciated!
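A minimal sketch of one way to do it (untested; `./libedgetpu.nix` is a placeholder for wherever the derivation from this thread lives in your config):

```nix
# Hypothetical sketch: build the derivation with gcc 12's stdenv by
# overriding the `stdenv` argument at callPackage time.
{ pkgs, ... }:
{
  environment.systemPackages = [
    (pkgs.callPackage ./libedgetpu.nix { stdenv = pkgs.gcc12Stdenv; })
  ];
}
```

This relies on the derivation taking `stdenv` as a function argument, which the versions posted above do.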

sneakrz commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working, and discovering both TPUs- so wonderful.

Alan01252 commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working, and discovering both TPUs- so wonderful.

That blows my mind, how is that working? I'm pretty sure you have to pass libedgetpu through as a delegate to tflite for it to work? Is frigate shipping with it?

sneakrz commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working, and discovering both TPUs- so wonderful.

That blows my mind, how is that working? I'm pretty sure you have to pass libedgetpu through as a delegate to tflite for it to work? Is frigate shipping with it?

It does get installed in the container, yes, but I was under the impression that it needed to be installed on the host as well.

colino17 commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working, and discovering both TPUs- so wonderful.

I would keep an eye on your Frigate logs. I initially went down a similar route with only gasket on the host and thought I had everything working, but I remember there being some issue where my PCIE Coral kept disconnecting even though it was initially recognized. I didn't notice it for the longest time as I also had a USB Coral which was picking up the slack. Perhaps this is fixed now and working properly with that type of configuration.

I think ultimately I'm going to migrate to dual USB Corals to avoid these types of issues entirely.

sneakrz commented 8 months ago

@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue.

Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working, and discovering both TPUs- so wonderful.

I would keep an eye on your Frigate logs. I initially went down a similar route with only gasket on the host and thought I had everything working, but I remember there being some issue where my PCIE Coral kept disconnecting even though it was initially recognized. I didn't notice it for the longest time as I also had a USB Coral which was picking up the slack. Perhaps this is fixed now and working properly with that type of configuration.

I think ultimately I'm going to migrate to dual USB Corals to avoid these types of issues entirely.

Just a little update for anyone interested: It's been a few days, and all is going well so far. Unfortunately, I have rebooted a few times, but not due to any issues. I have a dual TPU with one dedicated to Frigate and the other dedicated to Scrypted. It's had no problems at this point. This guy is thrilled. New to Nix and loving it.

mweinelt commented 8 months ago

Is anyone ever going to submit the packages into nixpkgs?

I invested a lot of time into getting frigate packaged and openvino updated to a working state, and I hoped that something would manifest from this issue. But for some reason this package is being kept out of tree, and as such out of reach for many interested parties.

serpent213 commented 7 months ago

@mweinelt

Is there a configuration example around on how to get this all working?

My goal is to have Frigate running with a Coral mini-PCIe card, currently using the nixpkgs-23.11 versions of gasket and frigate and a custom

libedgetpu.nix:

```nix
{ stdenv
, lib
, fetchFromGitHub
, python3
, libusb1
, abseil-cpp
, flatbuffers
, xxd
}:
let
  flatbuffers_1_12 = flatbuffers.overrideAttrs (oldAttrs: rec {
    version = "1.12.1";
    NIX_CFLAGS_COMPILE = "-Wno-error=class-memaccess -Wno-error=maybe-uninitialized";
    cmakeFlags = (oldAttrs.cmakeFlags or []) ++ ["-DFLATBUFFERS_BUILD_SHAREDLIB=ON"];
    NIX_CXXSTDLIB_COMPILE = "-std=c++17";
    configureFlags = (oldAttrs.configureFlags or []) ++ ["--enable-shared"];
    src = fetchFromGitHub {
      owner = "google";
      repo = "flatbuffers";
      rev = "v${version}";
      sha256 = "sha256-5sHddlqWx/5d5/FOMK7qRlR5xbUR47rfejuXI1jymWM=";
    };
  });
in
stdenv.mkDerivation rec {
  # pname = "libedgetpu";
  # version = "ddfa7bde33c23afd8c2892182faa3e5b4e6ad94e";
  # src = fetchFromGitHub {
  #   owner = "google-coral";
  #   repo = pname;
  #   rev = version;
  #   sha256 = "sha256-NidGjBPOLu5py7bakqvNQLDi72b5ig9QF9C1UuQldn0=";
  # };
  pname = "libedgetpu";
  version = "90b03d96ed83412178ed6e6cfddbd40bb3f84925";
  src = fetchFromGitHub {
    owner = "feranick";
    repo = pname;
    rev = version;
    sha256 = "sha256-/Eneik+v+juGsg/us+0YBQxkKeJUpGnFqrRPu5nKYWk=";
  };

  # patches = [ ./libedgetpu-stddef.patch ];

  makeFlags = ["-f" "makefile_build/Makefile" "libedgetpu" ];
  buildInputs = [ libusb1 abseil-cpp flatbuffers_1_12 ];
  nativeBuildInputs = [ xxd ];
  NIX_CXXSTDLIB_COMPILE = "-std=c++17";

  TFROOT = "${fetchFromGitHub {
    owner = "tensorflow";
    repo = "tensorflow";
    rev = "v2.8.4"; # latest rev providing tensorflow/lite/c/common.c
    sha256 = "sha256-MFqsVdSqbNDNZSQtCQ4/4DRpJPG35I0La4MLtRp37Rk=";
    # rev = "v2.13.1"; # latest rev providing tensorflow/lite/c/common.c
    # sha256 = "sha256-fCwf7I76gdyeOPVnPPqEw4cI7RrcrshTSHjdfevUriY=";
  }}";
  # TFROOT = "${python3.pkgs.tensorflow}";

  enableParallelBuilding = false;

  installPhase = ''
    mkdir -p $out/lib
    cp out/direct/k8/libedgetpu.so.1.0 $out/lib
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
    mkdir -p $out/lib/udev/rules.d
    cp debian/edgetpu-accelerator.rules $out/lib/udev/rules.d/99-edgetpu-accelerator.rules
  '';
}
```

Injecting it into Frigate like

      systemd.services.frigate.environment.LD_LIBRARY_PATH = "${libedgetpu}/lib";

Best I could do so far was a successful build using TensorFlow 2.8.4; newer versions result in

make: *** No rule to make target '/nix/store/ik160yl79rjszcdgj301apza3a5vhd0z-source/tensorflow/lite/c/common.c', needed by '/build/source/makefile_build/../out//nix/store/ik160yl79rjszcdgj301apza3a5vhd0z-source/tensorflow/lite/c/common.o'.  Stop.

which should be fixed in 2.13.1, but I saw no difference. nixpkgs-23.11 includes TF 2.13.0, which is what I was aiming for.

With 2.8.4 Frigate produces a segfault:

```
frigate.detectors.plugins.edgetpu_tfl INFO : Attempting to load TPU as pci
frigate.app INFO : Output process started: 69432
frigate.app INFO : Camera processor started for test2: 69437
frigate.app INFO : Capture process started for test2: 69441
Fatal Python error: Segmentation fault

Thread 0x00007f0ddca4e6c0 (most recent call first):
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/threading.py", line 327 in wait
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/queues.py", line 231 in _feed
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/threading.py", line 982 in run
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/threading.py", line 1045 in _bootstrap_inner
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/threading.py", line 1002 in _bootstrap

Current thread 0x00007f0e31654740 (most recent call first):
  File "/nix/store/grc5rgqywnlgjm40w0kb8kz0n95vkn4b-python3.11-tensorflow-2.13.0/lib/python3.11/site-packages/tensorflow/lite/python/interpreter.py", line 513 in __init__
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/detectors/plugins/edgetpu_tfl.py", line 39 in __init__
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/detectors/__init__.py", line 24 in create_detector
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/object_detection.py", line 52 in __init__
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/object_detection.py", line 98 in run_detector
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/process.py", line 108 in run
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/process.py", line 314 in _bootstrap
frigate.detectors.plugins.edgetpu_tfl INFO : TPU found
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/popen_fork.py", line 71 in _launch
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/popen_fork.py", line 19 in __init__
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/context.py", line 281 in _Popen
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/context.py", line 224 in _Popen
  File "/nix/store/rac8pxbi1vapwrlqzbrkycbyg521djzw-python3-3.11.6/lib/python3.11/multiprocessing/process.py", line 121 in start
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/object_detection.py", line 179 in start_or_restart
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/object_detection.py", line 147 in __init__
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/app.py", line 214 in start_detectors
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/app.py", line 379 in start
  File "/nix/store/l0gfxyfahpx92v00ylwavhpgzd05s296-frigate-0.12.1/lib/python3.11/site-packages/frigate/__main__.py", line 16 in
  File "", line 88 in _run_code
  File "", line 198 in _run_module_as_main
```

Also I was surprised that the frigate package depends on tensorflow instead of tensorflow-lite.

Thank you for the effort!

joe2xyz commented 7 months ago

Hi @mweinelt and @serpent213 ,

I can confirm - I have exactly the same problem. Have you been able to resolve it?

Thank you.

Mar 02 15:51:27 frig python3.11[8563]: [2024-03-02 15:51:22] frigate.app                    INFO    : Output process started: 8595
Mar 02 15:51:27 frig python3.11[8563]: [2024-03-02 15:51:22] frigate.app                    INFO    : Camera processor started for name_of_your_camera: 8605
Mar 02 15:51:27 frig python3.11[8563]: [2024-03-02 15:51:22] frigate.app                    INFO    : Capture process started for name_of_your_camera: 8611
Mar 02 15:51:27 frig python3.11[8586]: Fatal Python error: Segmentation fault
Mar 02 15:51:27 frig python3.11[8586]: Thread 0x00007f470f7fe6c0 (most recent call first):
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/threading.py", line 327 in wait
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/queues.py", line 231 in _feed
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/threading.py", line 982 in run
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/threading.py", line 1045 in _bootstrap_inner
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/threading.py", line 1002 in _bootstrap
Mar 02 15:51:27 frig python3.11[8586]: Current thread 0x00007f47b5111740 (most recent call first):
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/h8pyg044fj7azhrs4fmdbagww22ka83h-python3.11-tensorflow-2.13.0/lib/python3.11/site-packages/tensorflow/lite/python/interpreter.py", line 513 in __init__
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/detectors/plugins/edgetpu_tfl.py", line 43 in __init__
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/detectors/__init__.py", line 18 in create_detector
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/object_detection.py", line 53 in __init__
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/object_detection.py", line 102 in run_detector
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/process.py", line 108 in run
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/process.py", line 314 in _bootstrap
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/popen_fork.py", line 71 in _launch
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/popen_fork.py", line 19 in __init__
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/context.py", line 281 in _Popen
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/context.py", line 224 in _Popen
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/sxr2igfkwhxbagri49b8krmcqz168sim-python3-3.11.8/lib/python3.11/multiprocessing/process.py", line 121 in start
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/object_detection.py", line 183 in start_or_restart
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/object_detection.py", line 151 in __init__
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/app.py", line 453 in start_detectors
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/app.py", line 683 in start
Mar 02 15:51:27 frig python3.11[8586]:   File "/nix/store/zd30rz4wn9cr26spi6vdmlyis04nr8vl-frigate-0.13.2/lib/python3.11/site-packages/frigate/__main__.py", line 17 in <module>
Mar 02 15:51:27 frig python3.11[8586]:   File "<frozen runpy>", line 88 in _run_code
Mar 02 15:51:27 frig python3.11[8586]:   File "<frozen runpy>", line 198 in _run_module_as_main
Mar 02 15:51:27 frig python3.11[8586]: Extension modules: markupsafe._speedups, psutil._psutil_linux, psutil._psutil_posix, playhouse._sqlite_ext, greenlet._greenlet, zope.interface._zope_interface_coptimizations, gevent.libev.corecext, gevent._gevent_c_greenlet_primitives, gevent._gevent_c_hub_local, gevent._gevent_c_waiter, gevent._gevent_c_hub_primitives, gevent._gevent_c_ident, gevent._gevent_cgreenlet, gevent._gevent_c_abstract_linkable, gevent._gevent_cevent, gevent._gevent_clocal, gevent._gevent_cqueue, numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, matplotlib._c_internal_utils, PIL._imaging, matplotlib._path, kiwisolver._cext, matplotlib._image, pydantic.typing, pydantic.errors, pydantic.version, pydantic.utils, pydantic.class_validators, pydantic.config, pydantic.color, pydantic.datetime_parse, pydantic.validators, pydantic.networks, pydantic.types, pydantic.json, pydantic.error_wrappers, pydantic.fields, pydantic.parse, pydantic.schema, pydantic.main, pydantic.dataclasses, pydantic.annotated_types, pydantic.decorator, pydantic.env_settings, pydantic.tools, pydantic, _cffi_backend, cv2, yaml._yaml, _ruamel_yaml, google.protobuf.pyext._message, tensorflow.python.framework.fast_tensor_util, h5py._errors, h5py.defs, h5py._objects, h5py.h5, h5py.utils, h5py.h5t, h5py.h5s, h5py.h5ac, h5py.h5p, h5py.h5r, h5py._proxy, h5py._conv, h5py.h5z, h5py.h5a, h5py.h5d, h5py.h5ds, h5py.h5g, h5py.h5i, h5py.h5f, h5py.h5fd, h5py.h5pl, h5py.h5o, h5py.h5l, h5py._selector, scipy._lib._ccallback_c, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._decomp_lu_cython, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, lxml._elementpath, lxml.etree, lxml.builder, setproctitle._setproctitle, scipy.ndimage._nd_image, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, _ni_label, scipy.ndimage._ni_label, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.spatial.transform._rotation, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._ansari_swilk_statistics, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.stats._unuran.unuran_wrapper (total: 192)
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.watchdog               INFO    : Detection appears to have stopped. Exiting Frigate...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.app                    INFO    : Stopping...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.storage                INFO    : Exiting storage maintainer...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.ptz.autotrack          INFO    : Exiting autotracker...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.watchdog               INFO    : Exiting watchdog...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.record.cleanup         INFO    : Exiting recording cleanup...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.stats                  INFO    : Exiting stats emitter...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.events.cleanup         INFO    : Exiting event cleanup...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.object_processing      INFO    : Exiting object processor...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.events.maintainer      INFO    : Exiting event processor...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] frigate.comms.ws               INFO    : Exiting websocket client...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] peewee.sqliteq                 INFO    : writer received shutdown request, exiting.
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] watchdog.name_of_your_camera   INFO    : Terminating the existing ffmpeg process...
Mar 02 15:51:43 frig python3.11[8563]: [2024-03-02 15:51:43] watchdog.name_of_your_camera   INFO    : Waiting for ffmpeg to exit gracefully...
Mar 02 15:51:44 frig python3.11[8563]: [2024-03-02 15:51:44] frigate.output                 INFO    : exiting output process...
mweinelt commented 7 months ago

Also I was surprised that the frigate package depends on tensorflow instead of tensorflow-lite.

https://github.com/blakeblackshear/frigate/pull/5611

Have you been able to resolve it?

Won't spend any time on this as long as it is not brought into nixpkgs. It is too hard to reason about the environments of everyone involved otherwise.

I also don't own any coral device fwiw.

serpent213 commented 6 months ago

I can confirm - I have exactly the same problem. Have you been able to resolve it?

No, didn't do further research, went with OpenVINO for now...

joe2xyz commented 6 months ago

Thank you @serpent213 . Could you please share the relevant Nix / Frigate configuration?

I wonder whether you need to somehow add the OpenVINO Python module, and also what the OpenVINO configuration looks like in the Frigate config...

Thank you.
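Not an authoritative answer, but going by Frigate's upstream detector documentation, an OpenVINO setup through the NixOS module might look roughly like the sketch below. The device, model path, and dimensions here are assumptions; on NixOS the model file would have to be provided explicitly, since it normally ships inside Frigate's Docker image.

```nix
services.frigate.settings = {
  detectors.ov = {
    type = "openvino";
    device = "CPU";
  };
  # Hypothetical model location; adjust to wherever the model is stored.
  model = {
    path = "/var/lib/frigate/ssdlite_mobilenet_v2.xml";
    width = 300;
    height = 300;
  };
};
```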

GoogleBot42 commented 5 months ago

I also did not have success getting the Coral running with the newer tensorflow in 23.11. It used to work before my 23.11 upgrade, and there's only so much time I want to put into getting it working again. So... I cheated by downgrading tensorflow/frigate in an overlay. It seems to work. Eventually I hope to remove this band-aid though.

Flake input:

```nix
nixpkgs-frigate.url = "github:NixOS/nixpkgs/5cfafa12d57374f48bcc36fda3274ada276cf69e";
```

Overlay:

```nix
final: prev:

let
  system = prev.system;
  frigatePkgs = inputs.nixpkgs-frigate.legacyPackages.${system};
in
{
  # It seems that libedgetpu needs to be built with the newer version of tensorflow in nixpkgs
  # but I am lazy so I instead just downgrade by using the old nixpkgs
  libedgetpu = frigatePkgs.callPackage ./libedgetpu { };
  frigate = frigatePkgs.frigate;
}
```
VTimofeenko commented 5 months ago

A (very hacky) solution to this is to build libedgetpu using the docker+bazel method, then provide the prebuilt binary to Frigate as a "libedgetpu-bin" package. Seems to work^1 on nixpkgs-unstable.

Steps:

  1. Get the latest commit of google-coral/libedgetpu (e35aed1)
  2. Have docker installed
  3. Edit the Makefile, setting SHELL to just "bash"
  4. Follow the build instructions
  5. Copy out/.../libedgetpu.so.1 someplace
  6. Use the package from the spoiler below (it basically copies the prebuilt library and creates the udev rules)
  7. Add it to the Frigate systemd service in a very similar way as above in the thread, but this time libusb also goes into LD_LIBRARY_PATH.
libedgetpu-bin pkg

```nix
{ stdenv }:
stdenv.mkDerivation {
  src = ./libedgetpu.so.1.0;
  pname = "libedgetpu";
  version = "whatever";
  dontBuild = true;
  dontUnpack = true;
  installPhase = ''
    mkdir -p $out/lib
    cp $src $out/lib/libedgetpu.so.1.0
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
    mkdir -p $out/lib/udev/rules.d
    cat >> $out/lib/udev/rules.d/99-edgetpu-accelerator.rules <
```
Frigate systemd config

```nix
{ pkgs, lib, ... }:
{
  services.frigate.settings.detectors.coral = {
    type = "edgetpu";
    device = "usb";
  };
  systemd.services.frigate.environment.LD_LIBRARY_PATH = lib.makeLibraryPath [
    pkgs.libedgetpu-bin
    pkgs.libusb # libusb
  ];
  systemd.services.frigate.serviceConfig = {
    SupplementaryGroups = "plugdev";
  };
  services.udev.packages = [ pkgs.libedgetpu-bin ];
  users.groups.plugdev = { };
}
```

^1: "work" is defined as:

  • Frigate launched without crashing.
  • CPU usage and LA are way down.
  • Frigate reports using coral.
  • Coral detected me standing in the driveway.
  • 30 minutes have passed and Frigate hasn't crashed.
pdelbarba commented 4 months ago

I think I have a working set of packages for getting the drivers running. So far, everything is stable on frigate 0.13.2.

To summarize, it seems like part of the problem is the incredibly old version of flatbuffers used, so I went and tried to mimic the build environment of the Ubuntu 22.04 LTS container that libedgetpu expects, since that's known to work (thanks @VTimofeenko). This meant using an old abseil version and gcc 12, which introduces some compiler bugs I had to work around with the extremely concerning NIX_CFLAGS_COMPILE line below.

I'll caution that I'm not sure which parts are still necessary and which aren't; it took me a while to get this working, and there might be a slightly more elegant way to make this all behave, but at least this is a start.

gasket.nix

```nix
{ stdenv, lib, fetchFromGitHub, kernel }:
stdenv.mkDerivation rec {
  pname = "gasket";
  version = "1.0-18-unstable-2023-09-05";

  src = fetchFromGitHub {
    owner = "google";
    repo = "gasket-driver";
    rev = "5815ee3908a46a415aac616ac7b9aedcb98a504c";
    sha256 = "sha256-O17+msok1fY5tdX1DvqYVw6plkUDF25i8sqwd6mxYf8=";
  };

  makeFlags = kernel.makeFlags ++ [
    "-C"
    "${kernel.dev}/lib/modules/${kernel.modDirVersion}/build"
    "M=$(PWD)"
  ];

  buildFlags = [ "modules" ];
  installFlags = [ "INSTALL_MOD_PATH=${placeholder "out"}" ];
  installTargets = [ "modules_install" ];

  sourceRoot = "${src.name}/src";
  hardeningDisable = [ "pic" "format" ];
  nativeBuildInputs = kernel.moduleBuildDependencies;

  meta = with lib; {
    description = "The Coral Gasket Driver allows usage of the Coral EdgeTPU on Linux systems.";
    homepage = "https://github.com/google/gasket-driver";
    license = licenses.gpl2;
    maintainers = [ lib.maintainers.kylehendricks ];
    platforms = platforms.linux;
    broken = versionOlder kernel.version "5.15";
  };
}
```
libedgetpu.nix

```nix
{ stdenv
, lib
, fetchFromGitHub
, python3
, libusb1
, abseil-cpp_202308
, flatbuffers
, xxd
, gcc12Stdenv
}:

let
  flatbuffers_1_12 = flatbuffers.overrideAttrs (oldAttrs: rec {
    version = "1.12.1";
    NIX_CFLAGS_COMPILE = "-Wno-error=class-memaccess -Wno-error=maybe-uninitialized -Wno-error=stringop-overflow -Wno-error=uninitialized";
    cmakeFlags = (oldAttrs.cmakeFlags or []) ++ ["-DFLATBUFFERS_BUILD_SHAREDLIB=ON"];
    NIX_CXXSTDLIB_COMPILE = "-std=c++17";
    configureFlags = (oldAttrs.configureFlags or []) ++ ["--enable-shared"];
    src = fetchFromGitHub {
      owner = "google";
      repo = "flatbuffers";
      rev = "v${version}";
      sha256 = "sha256-5sHddlqWx/5d5/FOMK7qRlR5xbUR47rfejuXI1jymWM=";
    };
  });
  stdenv = gcc12Stdenv;
in
stdenv.mkDerivation rec {
  pname = "libedgetpu";
  version = "e35aed18fea2e2d25d98352e5a5bd357c170bd4d";

  src = fetchFromGitHub {
    owner = "google-coral";
    repo = pname;
    rev = version;
    sha256 = "sha256-SabiFG/EgspiCFpg8XQs6RjFhrPPUfhILPmYQQA1E2w=";
  };

  # patches = [ ./libedgetpu-stddef.patch ];

  makeFlags = ["-f" "makefile_build/Makefile" "libedgetpu"];
  buildInputs = [ libusb1 abseil-cpp_202308 flatbuffers_1_12 ];
  nativeBuildInputs = [ xxd ];
  NIX_CXXSTDLIB_COMPILE = "-std=c++17";

  TFROOT = "${fetchFromGitHub {
    owner = "tensorflow";
    repo = "tensorflow";
    rev = "v2.16.1"; # latest rev providing tensorflow/lite/c/common.c
    sha256 = "sha256-MFqsVdSqbNDNZSQtCQ4/4DRpJPG35I0La4MLtRp37Rk=";
  }}";

  enableParallelBuilding = false;

  installPhase = ''
    mkdir -p $out/lib
    cp out/direct/k8/libedgetpu.so.1.0 $out/lib
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
    mkdir -p $out/lib/udev/rules.d
    cp debian/edgetpu-accelerator.rules $out/lib/udev/rules.d/99-edgetpu-accelerator.rules
  '';
}
```
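For completeness, wiring these two files into a NixOS configuration could look roughly like the sketch below. This follows the usual nixpkgs convention for out-of-tree kernel modules (`config.boot.kernelPackages.callPackage` supplies the `kernel` argument); it is untested here and the file paths are assumptions.

```nix
{ config, pkgs, ... }:
{
  # Build the out-of-tree gasket module against the running kernel
  boot.extraModulePackages = [
    (config.boot.kernelPackages.callPackage ./gasket.nix { })
  ];
  # Install the userspace library so its udev rules take effect
  services.udev.packages = [ (pkgs.callPackage ./libedgetpu.nix { }) ];
}
```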
hvolkmer commented 3 months ago

I tried @pdelbarba's config and it didn't work for me on NixOS 24.05 with frigate 0.13.2, which uses tensorflow 2.13.0 and python 3.11. I got a segfault when loading the model file (reproducible with a simple Python script) when calling into /nix/store/10sszc68w7a4yfdzsx1pwzrjx8c0rka8-python3.11-tensorflow-2.13.0/lib/python3.11/site-packages/tensorflow/lite/python/interpreter_wrapper/_pywrap_tensorflow_interpreter_wrapper.so

This sounds like some conflict between the tensorflow libraries linked in python vs. libedgetpu.
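If that is the cause, one hedged way to test the hypothesis would be to rebuild libedgetpu against the same TensorFlow source tree that the Python tensorflow package in the closure was built from, instead of a pinned newer tag. A sketch as an overlay (the `libedgetpu` attribute refers to a derivation like the one above; the attribute path to tensorflow's `src` is an assumption):

```nix
final: prev: {
  libedgetpu = prev.libedgetpu.overrideAttrs (old: {
    # Compile against the exact TF sources matching the Python tensorflow,
    # so the delegate and the interpreter wrapper agree on the TFLite ABI.
    TFROOT = "${prev.python311Packages.tensorflow.src}";
  });
}
```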

@VTimofeenko's docker compile approach worked for me.

Here's the ldd output from the lib compiled with @pdelbarba's config:

     linux-vdso.so.1 (0x00007ffec8792000)
        libflatbuffers.so.1 => /nix/store/1bysrrykzrjnrd4nl7bwinpcq2vbp8jk-flatbuffers-1.12.1/lib/libflatbuffers.so.1 (0x00007f42faa68000)
        libabsl_flags.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags.so.2308.0.0 (0x00007f42faa63000)
        libabsl_flags_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_internal.so.2308.0.0 (0x00007f42faa59000)
        libabsl_flags_reflection.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_reflection.so.2308.0.0 (0x00007f42faa4c000)
        libabsl_flags_marshalling.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_marshalling.so.2308.0.0 (0x00007f42faa3e000)
        libabsl_hash.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_hash.so.2308.0.0 (0x00007f42faa39000)
        libabsl_hashtablez_sampler.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_hashtablez_sampler.so.2308.0.0 (0x00007f42faa33000)
        libabsl_raw_hash_set.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_raw_hash_set.so.2308.0.0 (0x00007f42faa2d000)
        libabsl_str_format_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_str_format_internal.so.2308.0.0 (0x00007f42faa10000)
        libusb-1.0.so.0 => /nix/store/s52wvz5i9nlsx0pp1z32bj5ipq5hs45l-libusb-1.0.27/lib/libusb-1.0.so.0 (0x00007f42fa9ee000)
        libstdc++.so.6 => /nix/store/d4dzkmwkyrkc1l8z9x7vcdj193fx4g45-gcc-12.3.0-lib/lib/libstdc++.so.6 (0x00007f42fa7c6000)
        libm.so.6 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/libm.so.6 (0x00007f42fa6e3000)
        libgcc_s.so.1 => /nix/store/d4dzkmwkyrkc1l8z9x7vcdj193fx4g45-gcc-12.3.0-lib/lib/libgcc_s.so.1 (0x00007f42fa6c2000)
        libc.so.6 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/libc.so.6 (0x00007f42fa4d5000)
        libabsl_flags_config.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_config.so.2308.0.0 (0x00007f42fa4cb000)
        libabsl_flags_program_name.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_program_name.so.2308.0.0 (0x00007f42fa4c5000)
        libabsl_flags_private_handle_accessor.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_private_handle_accessor.so.2308.0.0 (0x00007f42fa4c0000)
        libabsl_flags_commandlineflag.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_commandlineflag.so.2308.0.0 (0x00007f42fa4bb000)
        libabsl_flags_commandlineflag_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_flags_commandlineflag_internal.so.2308.0.0 (0x00007f42fa4b6000)
        libabsl_cord.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_cord.so.2308.0.0 (0x00007f42fa499000)
        libabsl_cordz_info.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_cordz_info.so.2308.0.0 (0x00007f42fa492000)
        libabsl_cord_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_cord_internal.so.2308.0.0 (0x00007f42fa477000)
        libabsl_cordz_functions.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_cordz_functions.so.2308.0.0 (0x00007f42fa472000)
        libabsl_cordz_handle.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_cordz_handle.so.2308.0.0 (0x00007f42fa46c000)
        libabsl_crc_cord_state.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_crc_cord_state.so.2308.0.0 (0x00007f42fa462000)
        libabsl_crc32c.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_crc32c.so.2308.0.0 (0x00007f42fa45c000)
        libabsl_crc_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_crc_internal.so.2308.0.0 (0x00007f42fa455000)
        libabsl_crc_cpu_detect.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_crc_cpu_detect.so.2308.0.0 (0x00007f42fa450000)
        libabsl_bad_optional_access.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_bad_optional_access.so.2308.0.0 (0x00007f42fa44b000)
        libabsl_city.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_city.so.2308.0.0 (0x00007f42fa444000)
        libabsl_bad_variant_access.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_bad_variant_access.so.2308.0.0 (0x00007f42fa43f000)
        libabsl_low_level_hash.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_low_level_hash.so.2308.0.0 (0x00007f42fa43a000)
        libabsl_exponential_biased.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_exponential_biased.so.2308.0.0 (0x00007f42fa435000)
        libabsl_synchronization.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_synchronization.so.2308.0.0 (0x00007f42fa422000)
        libabsl_graphcycles_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_graphcycles_internal.so.2308.0.0 (0x00007f42fa417000)
        libabsl_kernel_timeout_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_kernel_timeout_internal.so.2308.0.0 (0x00007f42fa411000)
        libabsl_stacktrace.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_stacktrace.so.2308.0.0 (0x00007f42fa40c000)
        libabsl_symbolize.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_symbolize.so.2308.0.0 (0x00007f42fa404000)
        libabsl_malloc_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_malloc_internal.so.2308.0.0 (0x00007f42fa3fd000)
        libabsl_debugging_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_debugging_internal.so.2308.0.0 (0x00007f42fa3f4000)
        libabsl_demangle_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_demangle_internal.so.2308.0.0 (0x00007f42fa3e8000)
        libabsl_time.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_time.so.2308.0.0 (0x00007f42fa3d1000)
        libabsl_strings.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_strings.so.2308.0.0 (0x00007f42fa3af000)
        libabsl_string_view.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_string_view.so.2308.0.0 (0x00007f42fa3aa000)
        libabsl_strings_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_strings_internal.so.2308.0.0 (0x00007f42fa3a2000)
        libabsl_base.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_base.so.2308.0.0 (0x00007f42fa39b000)
        libabsl_spinlock_wait.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_spinlock_wait.so.2308.0.0 (0x00007f42fa396000)
        libabsl_throw_delegate.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_throw_delegate.so.2308.0.0 (0x00007f42fa38f000)
        libabsl_raw_logging_internal.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_raw_logging_internal.so.2308.0.0 (0x00007f42fa38a000)
        libabsl_log_severity.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_log_severity.so.2308.0.0 (0x00007f42fa383000)
        libabsl_int128.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_int128.so.2308.0.0 (0x00007f42fa37c000)
        libabsl_civil_time.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_civil_time.so.2308.0.0 (0x00007f42fa374000)
        libabsl_time_zone.so.2308.0.0 => /nix/store/pvxbaxnfi9l9ljmyjbk9lrqc8rcf765p-abseil-cpp-20230802.2/lib/libabsl_time_zone.so.2308.0.0 (0x00007f42fa354000)
        /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/ld-linux-x86-64.so.2 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib64/ld-linux-x86-64.so.2 (0x00007f42fabf1000)
        libudev.so.1 => /nix/store/7j8mbhf7c3sigq7lwl5vc5h7lhx34m6d-systemd-minimal-libs-255.6/lib/libudev.so.1 (0x00007f42fa30c000)
        libpthread.so.0 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/libpthread.so.0 (0x00007f42fa307000)
        librt.so.1 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/librt.so.1 (0x00007f42fa300000)
        libcap.so.2 => /nix/store/yvhyhcfhc98wm86pw4ygk5jdr804iwrw-libcap-2.69-lib/lib/libcap.so.2 (0x00007f42fa2f2000)

Here is the ldd output from @VTimofeenko's docker compile approach:

   linux-vdso.so.1 (0x00007ffecf147000)
        libusb-1.0.so.0 => not found
        libstdc++.so.6 => not found
        libm.so.6 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/libm.so.6 (0x00007f301b865000)
        libgcc_s.so.1 => /nix/store/1q9vc0lq7qjlfjz47mfmlzdf86c543jy-xgcc-13.2.0-libgcc/lib/libgcc_s.so.1 (0x00007f301b840000)
        libc.so.6 => /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib/libc.so.6 (0x00007f301b653000)
        /nix/store/k7zgvzp2r31zkg9xqgjim7mbknryv6bs-glibc-2.39-52/lib64/ld-linux-x86-64.so.2 (0x00007f301ba75000)

To make this work without "luck" (i.e. without depending on having a compatible tensorflow version on both ends), I think the config needs to be changed to compile the library with statically linked dependencies (like in the docker version).
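A first, untested step in that direction could be to link the GCC runtime libraries statically when building libedgetpu, so that only libc, libusb, and similar remain dynamic. The `makeFlagsArray+=` idiom is standard nixpkgs for make flags containing spaces; whether makefile_build's Makefile honors `LDFLAGS` passed this way is an assumption.

```nix
final: prev: {
  libedgetpu = prev.libedgetpu.overrideAttrs (old: {
    # makeFlagsArray handles flags that contain spaces
    preBuild = (old.preBuild or "") + ''
      makeFlagsArray+=("LDFLAGS=-static-libgcc -static-libstdc++")
    '';
  });
}
```

Statically linking the abseil and flatbuffers dependencies as well (as the docker build effectively does) would go further, but likely requires patching the Makefile's link line rather than just passing flags.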