Closed: 0xSage closed this issue 3 weeks ago
Problem Statement

❯ sudo cortex-nightly engines list
+---+--------------+-------------------+---------+--------------+
| # | Name         | Supported Formats | Version | Status       |
+---+--------------+-------------------+---------+--------------+
| 1 | ONNXRuntime  | ONNX              | 0.0.1   | Incompatible |
+---+--------------+-------------------+---------+--------------+
| 2 | llama.cpp    | GGUF              | 0.0.1   | Ready        |
+---+--------------+-------------------+---------+--------------+
| 3 | TensorRT-LLM | TensorRT Engines  | 0.0.1   | Incompatible |
+---+--------------+-------------------+---------+--------------+

❯ sudo cortex-nightly engines get ONNXRuntime
A subcommand is required
Run with --help for more information.

❯ sudo cortex-nightly engines get cortex.ONNXRuntime
A subcommand is required
Run with --help for more information.

❯ sudo cortex-nightly engines get cortex.onnx
+-------------+-------------------+---------+--------------+
| Name        | Supported Formats | Version | Status       |
+-------------+-------------------+---------+--------------+
| ONNXRuntime | ONNX              | 0.0.1   | Incompatible |
+-------------+-------------------+---------+--------------+
Questions

1. For engine IDs, what are we going with? The # IDs shown in engines list should be consistent with the IDs used in engines get <ID>. Should cortex engines get 0 also work?

Current:
engines get cortex.llamacpp
engines get cortex.onnx
engines get cortex.tensorrt-llm
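The transcript above shows the mismatch: engines list displays the name ONNXRuntime, but engines get only accepts the canonical ID cortex.onnx. One way a CLI could accept both spellings is an alias table that normalizes any user-facing name to one canonical ID. This is a minimal sketch, not Cortex's actual code; the table contents and function name are assumptions drawn only from the names visible in the transcript:

```python
# Hypothetical alias table: maps every user-facing spelling (lowercased)
# to a single canonical engine ID. Entries mirror the names seen in the
# `engines list` output above; this is illustrative, not Cortex's code.
ALIASES = {
    "onnxruntime": "cortex.onnx",
    "cortex.onnx": "cortex.onnx",
    "llama.cpp": "cortex.llamacpp",
    "cortex.llamacpp": "cortex.llamacpp",
    "tensorrt-llm": "cortex.tensorrt-llm",
    "cortex.tensorrt-llm": "cortex.tensorrt-llm",
}

def resolve_engine_id(user_input: str) -> str:
    """Return the canonical engine ID for any accepted spelling."""
    key = user_input.strip().lower()
    if key not in ALIASES:
        raise ValueError(f"unknown engine: {user_input}")
    return ALIASES[key]

# With such a table, `engines get ONNXRuntime` and `engines get cortex.onnx`
# would resolve to the same engine instead of one failing.
```

Numeric IDs (cortex engines get 0) could be layered on the same resolver by treating an all-digit argument as a row index into the list output.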
cc @dan-homebrew, can you take a look? Do we need to fix the current engine naming?
This is linked to #1168.
Closing as duplicate of #1168