simonw / llm-cmd

Use LLM to generate and execute commands in your shell
Apache License 2.0

Use os.execle #5

Open simonw opened 5 months ago

simonw commented 5 months ago

Tip from https://twitter.com/adamchainz/status/1772735510584119335

You could use os.execle to run the command instead. It would make the command not a subprocess, allowing full features of the executed command like command line interaction, piping, etc.

simonw commented 5 months ago

https://docs.python.org/3/library/os.html#os.execle

os.execle(path, arg0, arg1, ..., env)

These functions all execute a new program, replacing the current process; they do not return.

Doesn't look like it has the equivalent of shell=True though, which is tricky, as I'd prefer to avoid splitting the arguments myself.

simonw commented 5 months ago

I'm looking to replace this code: https://github.com/simonw/llm-cmd/blob/e471b287a26335ce79607298882a8ddbd2302c03/llm_cmd.py#L51-L57
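For context, the linked code hands the LLM-suggested string to the shell as a child process. This is a hypothetical stand-in for the shape being replaced, not the actual llm_cmd.py source (function and variable names are illustrative):

```python
import subprocess

def run_suggested_command(command: str) -> int:
    # shell=True means /bin/sh -c interprets the whole string, so pipes
    # and redirects work without splitting arguments ourselves -- but the
    # command runs as a subprocess rather than replacing this process.
    completed = subprocess.run(command, shell=True)
    return completed.returncode
```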

simonw commented 5 months ago

I'm nervous about a couple of things here:

  • How to best split up whatever the LLM returned and ensure it works right. The suggested command might include pipes, for example; no idea if that's something we can run as a replacement process
  • Ensuring this works predictably on Linux and macOS and maybe Windows

simonw commented 5 months ago

Inconclusive conversation with GPT-4 about this, it's not much help: https://chat.openai.com/share/9222833c-c998-47ab-81ce-23555790cc9e

sandipb commented 5 months ago

Slightly offtopic, why not have this arg slurping in the primary "llm prompt" or "llm" interface as well? It is a bit annoying to put the entire text in a string there, and I have to use a bash function to do it for me.

simonw commented 5 months ago

> Slightly offtopic, why not have this arg slurping in the primary "llm prompt" or "llm" interface as well? It is a bit annoying to put the entire text in a string there, and I have to use a bash function to do it for me.

I don't understand what you mean, can you expand a little?

sandipb commented 5 months ago

I mean, right now the primary way to use llm is to run like this :

 llm "the query to ask"

I wish, like llm cmd, we could also write:

llm the query to ask

Because right now:

$ llm what is the easiest way to find a number is prime\?
Usage: llm prompt [OPTIONS] [PROMPT]
Try 'llm prompt --help' for help.

Error: Got unexpected extra arguments (is the easiest way to find a number is prime?)

$ llm "what is the easiest way to find a number is prime"
Determining whether a number is prime (meaning it has no divisors other than ...
bloodearnest commented 5 months ago

> I'm nervous about a couple of things here:
>
>   • How to best split up whatever the LLM returned and ensure it works right. The suggested command might include pipes, for example; no idea if that's something we can run as a replacement process

In the slight off-chance you're not already aware, shlex.split() does a pretty good job of parsing a string of shell args into python strings. Handles quoting and escaping.
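For what it's worth, a quick illustration of the POSIX-style tokenizing it does:

```python
import shlex

# shlex.split tokenizes a shell-style string, honouring quotes and
# backslash escapes, so quoted arguments survive as single tokens.
args = shlex.split('grep -r "hello world" /tmp/logs')
print(args)  # ['grep', '-r', 'hello world', '/tmp/logs']
```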

However, that will not help if you want to consume LLM output that might include pipes.

The main thing that subprocess.run(..., shell=True) does is turn the call arguments into ["/bin/sh", "-c", your_string_arg]. It has some basic logic to choose a default sh path, or you can pass executable=/path/to/sh.

You can maybe do the same thing here to support arbitrary shell expressions including pipes from your LLM, except with os.execle rather than spawn/fork? i.e.

os.execle("/bin/sh", "sh", "-c", llm_cmd_output, os.environ)

(Note the doubled "sh": the first variadic argument to the exec* family becomes argv[0] of the new process, and execle takes the environment as its final argument.)

(Hmm, or even os.execle("/bin/bash", "bash", "-o", "pipefail", "-c", llm_cmd_output, os.environ) if bash is installed, for better exit codes?)
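Putting that together, a self-contained sketch of the exec-based approach (function name is my own; note that a successful exec replaces the current process and never returns):

```python
import os

def exec_in_shell(command: str) -> None:
    # Replace the current process with `sh -c <command>`, the exec
    # equivalent of subprocess.run(command, shell=True). "sh" becomes
    # argv[0] of the new process; execle requires the environment as
    # its final argument.
    os.execle("/bin/sh", "sh", "-c", command, os.environ)
```

Because the process is replaced, anything after the call never runs; error handling has to happen before the call, or by catching OSError if the exec itself fails.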

>   • Ensuring this works predictably on Linux and macOS and maybe Windows

So, subprocess.run(..., shell=True) on windows uses cmd.exe by default, and it processes the args with a function subprocess.list2cmdline, which is a semi-internal function that I think implements these rules.

So, assuming your LLM is spitting out cmd.exe syntax, then you should be able to do something like:

os.execle(cmd_exe_path, "cmd.exe", "/c", subprocess.list2cmdline([llm_cmd_output]), os.environ)

It's a bit of a shame that the subprocess module doesn't expose a public get_system_shell() function so we can reuse the logic, although I can understand why they don't!
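A hypothetical get_system_shell along those lines, mirroring (but not reusing) what subprocess effectively does for shell=True: cmd.exe via %COMSPEC% on Windows, /bin/sh elsewhere. This is not a public stdlib API, just a sketch:

```python
import os
import sys

def get_system_shell() -> list:
    # Hypothetical helper: the shell prefix to prepend before the
    # command string, per platform.
    if sys.platform == "win32":
        return [os.environ.get("COMSPEC", "cmd.exe"), "/c"]
    return ["/bin/sh", "-c"]
```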

The above is untested ideas from some of my prior experience wrangling this kind of stuff before, so may not work!

sandipb commented 5 months ago

There is also the consideration of backgrounding of processes. If a command has an ampersand and is expected to go into background in the shell, will it get terminated when llm quits because it is the parent process?
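In the current subprocess version, at least, a backgrounded job does outlive the shell that launched it (orphaned processes are re-parented rather than killed). A quick POSIX-only probe one could run to check, with names of my own choosing:

```python
import os
import signal
import subprocess

def background_job_survives() -> bool:
    # Launch a shell that backgrounds a long sleep, prints its PID,
    # and exits immediately. Then check whether the sleep is still
    # alive after its parent shell has gone.
    proc = subprocess.run(
        ["/bin/sh", "-c", "sleep 30 & echo $!"],
        capture_output=True, text=True,
    )
    bg_pid = int(proc.stdout.strip())
    try:
        os.kill(bg_pid, 0)  # signal 0 = existence check, sends nothing
        os.kill(bg_pid, signal.SIGTERM)  # clean up the stray sleep
        return True
    except ProcessLookupError:
        return False
```

Whether a terminal-driven SIGHUP reaches the job when llm itself is the exec'd shell is a separate question, though.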