triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

ci: Adding tests for `numpy>=2` #7756

Closed. KrishnanPrash closed this 3 weeks ago.

KrishnanPrash commented 3 weeks ago

What does the PR do?

Modifying `L0_backend_python/examples/test.sh` to test the Python backend when `numpy>=2` is installed in the environment. This is achieved by:

Reason for limited scope: Supporting `numpy>=2` proved to be more involved than anticipated because older versions of packages such as `torch`, `torchvision`, and `tensorflow` do not support numpy 2.x.
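The concrete list of changes is not reproduced above, but the general shape of such a test-environment change might look like the sketch below. This is only a hedged illustration, not the actual diff to `L0_backend_python/examples/test.sh`; the `pip` invocation and the version assertion are assumptions for illustration.

```bash
# Hypothetical sketch (not the actual test.sh changes): install a 2.x
# release of numpy into the test environment and verify the interpreter
# really picked it up before exercising the Python backend examples.
set -e

pip3 install --upgrade "numpy>=2"

# Fail fast if python3 still resolves an older numpy release.
python3 -c 'import numpy; assert numpy.__version__.startswith("2."), numpy.__version__'
```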

Related PRs:

Python Backend PR: Link

Where should the reviewer start?