codeproject / CodeProject.AI-Server

CodeProject.AI Server is a self-contained service that software developers can include in, and distribute with, their applications in order to augment them with the power of AI.

[Feature Request] arm64 / CUDA image for NVIDIA Jetson Nano / TX2 type embedded systems. #54

Open CRCinAU opened 1 year ago

CRCinAU commented 1 year ago

There is currently a massive gap in the packages provided for detection in the Docker environment.

There are a lot of people using Deepstack on NVIDIA Jetson Nano / TX2 type systems who are unable to run AI-Server at all.

Please consider adding JetPack-accelerated Docker images for use on this range of boards.

mtk11 commented 11 months ago

I would appreciate it if native support could be added. Currently the CUDA drivers are not recognized even though the latest libcudnn8-dev_8.2.4.15-1+cuda10.2_arm64.deb is installed.

Installing CodeProject.AI Analysis Module

======================================================================

                   CodeProject.AI Installer

======================================================================

CUDA Present...No
Allowing GPU Support: Yes
Allowing CUDA Support: Yes

General CodeProject.AI setup

Creating Directories...Done

Processing side-loaded module ALPR

Python 3.8 is already installed
Virtual Environment already present
Checking for Python 3.8...Found Python 3.8.0. present
Checking for CUDA...Not found
Ensuring PIP is installed...Done
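
For context, the "CUDA Present...No" result above suggests the installer's CUDA check does not account for JetPack, where nvidia-smi is generally absent and the toolkit lives under /usr/local/cuda. Below is a minimal sketch of a Jetson-friendly check; the paths and library probe are assumptions about a typical JetPack layout, not CodeProject.AI's actual detection logic.

```python
# Hypothetical sketch, not CodeProject.AI's actual installer logic:
# detect CUDA on a JetPack/Jetson system, where nvidia-smi is usually
# absent and the toolkit is installed under /usr/local/cuda.
import ctypes
import ctypes.util
from pathlib import Path


def cuda_present() -> bool:
    """Return True if a CUDA toolkit/runtime appears to be installed."""
    # JetPack installs the toolkit in /usr/local/cuda-<version> and adds
    # a /usr/local/cuda symlink, so checking for that path is often enough.
    if Path("/usr/local/cuda").exists():
        return True

    # Fall back to probing for the CUDA runtime shared library.
    libname = ctypes.util.find_library("cudart")
    if libname is None:
        return False
    try:
        ctypes.CDLL(libname)
        return True
    except OSError:
        return False


if __name__ == "__main__":
    print("CUDA Present..." + ("Yes" if cuda_present() else "No"))
```

On a Nano / TX2 with JetPack installed, a check along these lines should report Yes even though nvidia-smi is missing, which appears to be the situation in the log above.
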
ChrisMaunder commented 8 months ago

This is being worked on

tkg61 commented 4 months ago

Any update on this? What is the suggested approach for a Jetson NX that already has Ubuntu 20? Should we build from scratch as the docs describe, or wait for an image?

ChrisMaunder commented 2 months ago

No progress, unfortunately.

dakaix commented 2 months ago

Adding my voice here for support on Jetson platforms; I was hoping to run this on my Jetson AGX module.

I have tried manually compiling using the Remote SSH approach in the docs, but couldn't work out how to start it afterwards as a headless service!