NVIDIA / TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
https://developer.nvidia.com/tensorrt
Apache License 2.0

Opensource TensorRT python bindings PLLLLLEEEAAAAASSSSEEEEEEE!! #762

Closed · fish-finger closed this issue 3 years ago

fish-finger commented 4 years ago

Description

You keep locking out access to code even when there is NO COMMERCIAL REASON TO DO SO!!!!!

You're a hardware provider FFS, so why bother forcing your consumers to use pre-compiled binaries tied to particular OS / framework versions that nobody can use unless their computer systems are completely aligned with NVIDIA's.

Stop being A**HATS and make the TensorRT environment fully open source so we can use your hardware without having to wait for you to catch up!!!!

Environment

TensorRT Version:
GPU Type:
Nvidia Driver Version:
CUDA Version:
CUDNN Version:
Operating System + Version:
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (which commit + image + tag):

Relevant Files

Steps To Reproduce

mk-nvidia commented 4 years ago

Hi @fish-finger, please specify which OS/framework/TensorRT versions you're using. Please also describe the error(s) you saw. Thanks!

fish-finger commented 4 years ago

Let's start with the Python bindings for TensorRT. You're supporting Python 3.7 with cuDNN 8 / CUDA 10.2. The Python wheel "tensorrt-7.0.0.11-cp37-none-linux_x86_64.whl" contains an ".so" file which will not work with later Python versions such as 3.8. Why provide an embedded binary that can only be used with a specific Python version? If you provided the source code, I could compile it myself against the Python version I use in production.

And I would suggest reviewing the entire DeepStream platform's Python bindings architecture for the same reason, if you want wider adoption.

mk-nvidia commented 4 years ago

Coincidentally, we're already planning to open source the Python bindings for the TensorRT API in the next few months. I'll forward your feedback to the Deepstream team as well. Thanks for your patience. Happy to hear more feedback about other things we can improve to make your experience better.

nvzhihanj commented 4 years ago

There are a variety of containers you can find on https://ngc.nvidia.com/catalog/containers/nvidia:tensorrt, which could potentially save you time setting up the environment locally. You can also refer to the release notes for the CUDA/TensorRT version you want: https://docs.nvidia.com/deeplearning/tensorrt/container-release-notes/index.html
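
For reference, a minimal sketch of pulling and running one of those containers; the 20.12-py3 tag is only an example, so pick whichever release matches your CUDA/driver combination from the release notes above:

# Pull a TensorRT container from the NGC registry (tag shown is an example).
docker pull nvcr.io/nvidia/tensorrt:20.12-py3

# Run it interactively with GPU access (requires the NVIDIA Container Toolkit on the host).
docker run --gpus all -it --rm nvcr.io/nvidia/tensorrt:20.12-py3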

fish-finger commented 4 years ago

The NVIDIA containers are (after many hundreds of hours of investigation) not that useful, since you don't publish the Dockerfiles and they just make it harder to understand the APIs to the NVIDIA hardware. AI devops/engineering is a fantastically complex business; your containers are useless for anything other than running demos NVIDIA prepared (IMHO). Worse, you force folks to use authentication to create TensorRT engines in DeepStream (which stopped my firm from adopting it).

So back to the question: why have you not open-sourced the TensorRT Python bindings so that we can build them against whatever version we want, given that the current bindings are a full LTS release behind for Ubuntu and a major release behind for Python?

fish-finger commented 4 years ago

> Coincidentally, we're already planning to open source the Python bindings for the TensorRT API in the next few months. I'll forward your feedback to the Deepstream team as well. Thanks for your patience. Happy to hear more feedback about other things we can improve to make your experience better.

Chaps, are you serious, a couple of months? We're just talking about Python bindings here.

mk-nvidia commented 4 years ago

> Chaps, are you serious, a couple of months? We're just talking about Python bindings here.

There are other features / bug fixes in the release schedule before we get to the task of releasing these bindings.

pranavm-nvidia commented 4 years ago

> The NVIDIA containers are (after many hundreds of hours of investigation) not that useful, since you don't publish the Dockerfiles and they just make it harder to understand the APIs to the NVIDIA hardware.

To get the Dockerfiles, you can try something like:

docker run -v /var/run/docker.sock:/var/run/docker.sock --rm chenzj/dfimage <image_id>

You can find the image ID with docker image ls.
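
As a rough end-to-end sketch (the repository name is only an example, and dfimage reconstructs an approximate Dockerfile from the image's layer metadata):

# List local images to find the ID of the TensorRT container you pulled.
docker image ls nvcr.io/nvidia/tensorrt

# Feed that ID to dfimage to recover an approximate Dockerfile.
docker run -v /var/run/docker.sock:/var/run/docker.sock --rm chenzj/dfimage <image_id_from_above>

Alternatively, docker history --no-trunc <image_id> prints the layer commands directly, though the output is less readable.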

fish-finger commented 4 years ago

Fair play, decent and technical feedback from multiple sources; tips hat.

  1. You have a job to do and a release pipeline, but NVIDIA are simply not customer focused. Go open source with TRT, DeepStream and the supporting software. We abandoned DeepStream since all the advantages of using it were outweighed by the combination of closed-code binaries and poor architecture (who writes glue code in C/C++ these days?). If you want to be complete legends, rewrite DeepStream in Rust + Python, Rust to handle the streaming and Python for the glue code (too late for me, already doing it).

  2. The NVIDIA Docker definitions at gitlab/nvidia are far more useful than the entire container environment at https://ngc.nvidia.com/, which requires accounts and constant re-authentication and is a bit of a PITA tbh.

  3. I can build the environment for TensorRT on Python 3.7 myself (ain't my first rodeo); it just means having to manage multiple AI pipelines, which means multiple devops setups, multiple data gateways, multiple repos, more development (computer vision components are very interconnected)... you get the picture.

NVIDIA needs to stop black-boxing its software technology and embrace common-sense open source for AI; its hardware is bloody hard enough to use as it is without that nonsense.

Thanks for the replies.

pranavm-nvidia commented 4 years ago

I understand your frustration; hopefully open-sourcing the Python bindings will alleviate at least some of the issues you've highlighted here.

We really do appreciate your feedback, but we'd appreciate it even more if you keep our code of conduct in mind for your future posts.

fish-finger commented 4 years ago

> I understand your frustration; hopefully open-sourcing the Python bindings will alleviate at least some of the issues you've highlighted here.
>
> We really do appreciate your feedback, but we'd appreciate it even more if you keep our code of conduct in mind for your future posts.

Rebuilding entire AI pipelines for a specific Python version (for a small firm) post COVID-19 is not a small deal, and unfortunately folks don't listen unless you SHOUT sometimes. I am happy to provide direct feedback about the experiences we have had, to assist in a collaborative way. I would even create a Python 3.8.x version myself and do it for you, if you give me the source.

fish-finger commented 3 years ago

Have the Python bindings been open-sourced yet?

gonzo1978 commented 3 years ago

Hi, I also need the Python 3.8 bindings. You said 2.5 months ago that you would release the bindings. Is there any link you can provide to them? Thank you!!

fish-finger commented 3 years ago

So am I, still!

Python 3.9 is out; are we to expect this before 3.10?

valeralann commented 3 years ago

> Chaps, are you serious, a couple of months? We're just talking about Python bindings here.
>
> There are other features / bug fixes in the release schedule before we get to the task of releasing these bindings.

When is the next release scheduled?

ttyio commented 3 years ago

The Python bindings are now open-sourced at https://github.com/NVIDIA/TensorRT/tree/21.02/python. Thanks all, closing.
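
Once you have built and installed a wheel from that source tree for your own interpreter (following whatever build instructions ship with that tree), a quick sanity check might look like this; the interpreter version below is only an example:

# Confirm the freshly built bindings import cleanly and report their version.
python3.8 -c "import tensorrt; print(tensorrt.__version__)"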