containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

UX: Adding model service as a top-level UI construct #76

Open mairin opened 8 months ago

mairin commented 8 months ago

Need to update the mockups to list model endpoints as a top-level construct. Michael's idea was to call them Model Services. This is the WIP mockup; some things I still need to figure out:

*(attached image: WIP mockup of the Model Services list)*

slemeur commented 8 months ago

I think the list is efficient, but I'm wondering, as a developer, now that I know these are the models being served (that I can access): where are the endpoints, and if there are APIs, what are they? Maybe, when we know (using localai), we should have a way to show them (maybe not on this screen, but in a detail screen?).

mairin commented 7 months ago

> I think the list is efficient, but I'm wondering, as a developer, now that I know these are the models being served (that I can access): where are the endpoints, and if there are APIs, what are they? Maybe, when we know (using localai), we should have a way to show them (maybe not on this screen, but in a detail screen?).

The way the endpoints are handled right now in the mockup (not ideal) is that the clipboard icon under Actions would copy the endpoint URL to the clipboard.
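For reference, here is a minimal sketch of what a developer might do with the copied endpoint URL. The base URL and the `/v1/chat/completions` path are assumptions (they presume the model service exposes an OpenAI-compatible API, as the llama.cpp python webserver does), not something the mockup confirms:

```python
import json
import urllib.request


def build_chat_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request against a copied model-service endpoint.

    Assumes an OpenAI-compatible API; the path below is hypothetical
    for any given model service.
    """
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {"messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example: paste in the endpoint URL copied from the UI.
# (The request is built but not sent here.)
req = build_chat_request("http://localhost:8000", "Hello!")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
```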

For the API, I think I need to see an example of the API docs (if they exist) and figure out what they would belong to (whatever is serving the model, I suppose?). I will ask mclifford and see. At the least, the sample code that comes with the recipes should be of some help there?

mairin commented 7 months ago

As a follow-up: when I asked Michael, this is what he said:

> Here are the docs for the OpenAI API; is this what you are looking for? https://platform.openai.com/docs/api-reference/chat/create
>
> These are the docs for the llama.cpp python webserver, not as detailed as the OpenAI docs: https://github.com/abetlen/llama-cpp-python/blob/main/docs/server.md
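To make the API shape concrete, here is a hedged sketch of handling a response in the OpenAI-style chat-completions format that the llama.cpp webserver mimics. The field names follow the OpenAI API reference linked above; the sample response values themselves are made up for illustration:

```python
# Illustrative response in the OpenAI chat-completions shape that the
# llama.cpp python webserver mimics; the values here are invented.
sample_response = {
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello there!"},
            "finish_reason": "stop",
        }
    ],
}


def first_reply(response: dict) -> str:
    """Extract the assistant's text from a chat-completions response."""
    return response["choices"][0]["message"]["content"]


print(first_reply(sample_response))  # Hello there!
```

Surfacing a snippet like this in a detail screen could answer slemeur's question about what the API looks like for each served model.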