CodeProject.AI Server is a self-contained service that software developers can include in, and distribute with, their applications in order to augment their apps with the power of AI.
Area of Concern
[x] Behaviour of one or more Modules [provide name(s), e.g. ObjectDetectionYolo]
[ ] Installer
[ ] Runtime [e.g. Python3.7, .NET]
[ ] Module packages [e.g. PyTorch]
[x] Something else
Describe the bug
Using "/app/AnalysisLayer/ObjectDetectionYolo/custom-models" or "/app/modules/ObjectDetectionYolo/custom-models/" as the path to custom models does not allow the model to load.
Both of these directories are mentioned in the documentation.
Instead, passing the model directly via "/app/preinstalled-modules/ObjectDetectionYolo/custom-models/delivery.pt" works.
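For reference, this is a sketch of how the working path can be mounted as a Docker volume so that a host folder of custom models is visible to the module. The host path and image tag here are hypothetical placeholders, not taken from my actual setup:

```shell
# Hypothetical example: mount a host folder of custom .pt models over the
# container path that worked in my testing (preinstalled-modules), rather
# than the documented AnalysisLayer/modules paths which did not load.
docker run -d -p 32168:32168 \
  -v /path/on/host/custom-models:/app/preinstalled-modules/ObjectDetectionYolo/custom-models \
  codeproject/ai-server
```

Note that mounting over this directory replaces its contents, which is exactly the overwriting problem described below.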
Expected behavior
Is there a way to pass additional custom models to the module without overwriting the other custom models?
Screenshots
If applicable, add screenshots to help explain your problem.
Your System (please complete the following information):
CodeProject.AI Server version: 2.1.11
OS: Debian
System RAM: 12 GB
GPU (if available): Quadro P1000
GPU RAM (if available): 4 GB
Additional context
Add any other context about the problem here.