Closed by dan-homebrew 2 weeks ago
@gabrielle-ong For v1.0's Manual QA, I would like us to focus on answering the following questions:
I provide more context and links to key issues below:
Most of our bugs arise when cortex.cpp breaks on specific hardware/OS combinations. Catching these will reduce the number of bugs in Jan.
Additionally, I would like to document our process for provisioning a VM. This may grow into its own product in the future (e.g. Menlo Cloud), as other teams may also need cross-platform testing.
Installers are OS-specific, and install a clear set of files for each operating system. Installers will also need to detect hardware and pull the correct version of llama.cpp (I will cover that in the next section).
First, we will need to verify that the Installer for a particular operating system writes the correct files to the correct locations. The following issues/discussions have relevant context:
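To make the installer check repeatable across operating systems, something like the sketch below could be used. The file lists are hypothetical placeholders, not the actual installer layout; they should be replaced with the paths documented for each OS installer.

```python
# Sketch: verify that an installer wrote the expected files.
# The entries in EXPECTED_FILES are HYPOTHETICAL placeholders --
# substitute the real per-OS layout from the installer docs.
from pathlib import Path

EXPECTED_FILES = {
    "linux":   ["bin/cortex", "engines/llama.cpp"],   # hypothetical
    "darwin":  ["bin/cortex", "engines/llama.cpp"],   # hypothetical
    "windows": ["cortex.exe", "engines/llama.cpp"],   # hypothetical
}

def missing_files(install_root: str, platform: str) -> list[str]:
    """Return the expected files that are absent under install_root."""
    root = Path(install_root)
    return [f for f in EXPECTED_FILES[platform] if not (root / f).exists()]
```

Running this against a fresh install should return an empty list; any entries it returns are files the installer failed to write.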
Installers will also detect the user's hardware, and then pull the correct version of llama.cpp. This is based on:
We should be aligned with the versions published by llama.cpp: https://github.com/ggerganov/llama.cpp/releases
We need to verify that an AVX-2 system is correctly identified as an AVX-2 system, and the correct version is pulled.
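The detection step essentially maps CPU feature flags to a llama.cpp build variant. A minimal sketch of that mapping is below; the variant names are illustrative and should be matched against the asset names in the llama.cpp releases linked above.

```python
# Sketch: choose a llama.cpp build variant from CPU feature flags.
# Variant names are illustrative -- align them with the actual
# release asset names published by llama.cpp.
def pick_llama_variant(cpu_flags: set[str]) -> str:
    """Return the most capable variant the CPU supports."""
    if "avx512f" in cpu_flags:
        return "avx512"
    if "avx2" in cpu_flags:
        return "avx2"
    if "avx" in cpu_flags:
        return "avx"
    return "noavx"
```

For QA, the flags can be read from `/proc/cpuinfo` (Linux) or `sysctl` (macOS) on the test VM and the chosen variant compared against what the installer actually pulled.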
We have an open issue, discussing how llama.cpp is installed, which may change how we QA this: https://github.com/janhq/cortex.cpp/issues/1217
We should verify that pulling and running models writes to the correct Model Folder, together with a valid model.yaml. The following issues/discussions have relevant context:
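A simple check like the one below could confirm the Model Folder contents after a pull. The required keys are an assumption about model.yaml's schema, not the actual spec, and should be adjusted to match it.

```python
# Sketch: check that a pulled model's folder contains a model.yaml
# with the expected top-level keys. The keys checked here are
# ASSUMED, not taken from the real model.yaml spec.
from pathlib import Path

def validate_model_dir(model_dir: str) -> list[str]:
    """Return a list of problems found; empty means the folder looks OK."""
    yml = Path(model_dir) / "model.yaml"
    if not yml.exists():
        return ["model.yaml missing"]
    text = yml.read_text()
    problems = []
    for key in ("model", "files"):  # hypothetical required keys
        if f"{key}:" not in text:
            problems.append(f"missing key: {key}")
    return problems
```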
We should verify that key API endpoints work:
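For endpoint QA, shape checks on responses catch regressions beyond a bare 200. The sketch below validates a models-list response; the expected fields follow the OpenAI-style list schema, which is an assumption to confirm against the cortex.cpp API reference.

```python
# Sketch: validate the shape of a models-list response body.
# The fields checked ("object" == "list", "data" with "id" entries)
# follow the OpenAI-style schema -- an ASSUMPTION to verify against
# the published cortex.cpp API reference.
def valid_models_response(body: dict) -> bool:
    return (
        body.get("object") == "list"
        and isinstance(body.get("data"), list)
        and all("id" in m for m in body["data"])
    )
```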
We should also verify that key CLI commands work:
`cortex run`, etc.

We should verify that Cortex's uninstaller removes all relevant files, and does not leave dangling references.
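The CLI checks could be driven by a small subprocess harness like the sketch below, with the command list filled in from the itemized test cases. This only asserts exit codes; output assertions would be layered on top.

```python
# Sketch: a minimal harness for CLI smoke tests. The commands to run
# (e.g. ["cortex", "run", ...]) should come from the itemized QA list;
# nothing here assumes specific cortex flags.
import subprocess

def run_cli(args: list[str], timeout: int = 60) -> bool:
    """Return True if the command exists and exits with status 0."""
    try:
        result = subprocess.run(args, capture_output=True, timeout=timeout)
        return result.returncode == 0
    except (FileNotFoundError, subprocess.TimeoutExpired):
        return False
```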
We should verify that Cortex's updater works, but this may be challenging for now.
Issues that are still in progress:
Non-urgent question for @vansangpfiev @namchuai:
I itemized test cases here: https://github.com/janhq/cortex.cpp/issues/1147
@0xSage ,
Where do I view the new API? When we update the API, we also have to update the JSON file https://github.com/janhq/cortex.cpp/blob/dev/docs/static/openapi/jan.json, and it will be published at https://cortex.so/api-reference
Do we autogenerate the API reference documentation? We use continue.dev to generate the updated API reference. Drogon does not have built-in Swagger generation (more info: https://github.com/drogonframework/drogon/pull/923)
I'd like to QA for OpenAI compatibility at some point. Should we QA the API in this milestone? IMO, yes.
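When we get to the OpenAI-compatibility pass, checks like this one could assert the response shape of a chat completion. The fields come from OpenAI's published response schema; whether cortex.cpp matches it exactly is what the QA would establish.

```python
# Sketch: assert OpenAI-style shape for a chat-completion response.
# Field names follow OpenAI's documented schema ("object", "choices",
# "message", "content"); treat this as the compatibility target,
# not as cortex.cpp's confirmed behaviour.
def openai_compatible_completion(body: dict) -> bool:
    choices = body.get("choices")
    return (
        body.get("object") == "chat.completion"
        and isinstance(choices, list)
        and len(choices) > 0
        and "message" in choices[0]
        and "content" in choices[0]["message"]
    )
```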
Updated QA list in #1535 (I'd prefer to close this issue and iterate on the QA list with each release)
Yes, please go ahead and close it. We should add a GitHub issue template for the QA list, allowing us to version-control and incrementally improve the Manual QA checklist.
Goal
Tasklist
A `releases` folder in `cortex.cpp`, where the QA checklist should be committed together with key learnings and lessons to preserve for younger engineers.