containers / ramalama

Ramalama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.
https://ramalama.ai
MIT License

interactive mode CLI #1267

Open benoitf opened 2 weeks ago

benoitf commented 2 weeks ago

Feature request description

Each time I run a command, I type `ramalama <something>`, like `ramalama run <something>`.

I would like to know if an interactive mode would be possible, where I could enter an interactive session and then type only the RamaLama arguments, with completion, etc.

like:

```
ramalama --interactive
info
 ....
run foo
  ....
^C
version
convert tinyllama foo
```
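For reference, Python's standard `cmd` module already provides the prompt loop, command dispatch, and tab completion such a mode would need. A minimal sketch (hypothetical, not part of RamaLama; the subcommand list and class name are illustrative) could look like:

```python
import cmd
import subprocess


class RamalamaShell(cmd.Cmd):
    """Hypothetical interactive wrapper: each line typed at the
    prompt is forwarded to the real `ramalama` CLI as a subcommand."""

    prompt = "ramalama> "
    # Illustrative subset of subcommands, used for tab completion.
    COMMANDS = ["convert", "info", "list", "pull", "run", "version"]

    def default(self, line):
        # "run foo" at the prompt becomes "ramalama run foo".
        subprocess.run(["ramalama", *line.split()])

    def completenames(self, text, *ignored):
        # Tab-complete over the known subcommands.
        return [c for c in self.COMMANDS if c.startswith(text)]

    def do_exit(self, _line):
        """Leave the interactive shell."""
        return True


# To start the prompt loop:
# RamalamaShell().cmdloop()
```

`cmd.Cmd` wires `readline` in automatically on platforms that have it, so history and line editing come for free.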

Suggest potential solution

No response

Have you considered any alternatives?

No response

Additional context

No response

rhatdan commented 2 weeks ago

Why do you want this?

benoitf commented 2 weeks ago

I launch the CLI container once, and then every command I type is a RamaLama command. I don't need to prefix all my commands or start a new container each time.

It's like running the Python or Node.js container.

rhatdan commented 2 weeks ago

That feels like something you could easily build for yourself. I'm not sure there would be huge demand for it.

ericcurtin commented 1 week ago

```
alias rl="ramalama"
```

springs to mind.

This isn't incredibly hard to implement as a separate project; it's basically "implement my own custom shell."

The catch is that when you execute `run`, we have to spin up another shell of sorts, so we would end up maintaining multiple shells.
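The custom-shell idea can be sketched in a few lines: read a line, split it, and forward it to the real CLI. This is a hypothetical standalone wrapper, not RamaLama code, and it deliberately punts on the nested-shell problem described above (an interactive `run` would take over the terminal until it exits):

```python
import shlex
import subprocess

def ramalama_repl(cli="ramalama"):
    """Forward each input line as a subcommand of `cli`
    until EOF (Ctrl-D) or an "exit" line."""
    while True:
        try:
            line = input(f"{cli}> ").strip()
        except EOFError:
            break
        if not line:
            continue
        if line == "exit":
            break
        # "run foo" -> ["ramalama", "run", "foo"]; shlex keeps
        # quoted arguments intact.
        subprocess.run([cli, *shlex.split(line)])
```

Every keystroke of completion, history, and job control beyond this is where the "maintain multiple shells" cost starts.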