containers / ramalama

The goal of RamaLama is to make working with AI boring.
MIT License

Run the command by default without stderr #436

Closed rhatdan closed 1 week ago

rhatdan commented 1 week ago

Instead of running the container command straight away, run it via `/bin/sh -c` with `2> /dev/null` appended, to eliminate the stderr output.

Fixes: https://github.com/containers/ramalama/issues/431

Summary by Sourcery

Bug Fixes:

- Suppress stderr output from container commands by default by wrapping them in `/bin/sh -c` with `2> /dev/null` (redirection is skipped when debug mode is enabled)

sourcery-ai[bot] commented 1 week ago

Reviewer's Guide by Sourcery

The PR modifies the command execution in containers by wrapping the command in a shell environment and redirecting stderr to /dev/null by default. This change is implemented by using /bin/sh -c to execute the command, with stderr redirection disabled when debug mode is enabled.

Sequence diagram for command execution in container

sequenceDiagram
    participant User
    participant Container
    User->>Container: Execute command
    alt Debug mode enabled
        Container->>Container: /bin/sh -c "command"
    else
        Container->>Container: /bin/sh -c "command 2> /dev/null"
    end

File-Level Changes

Change Details Files
Modified command execution to suppress stderr output by default
  • Added shell wrapper using /bin/sh -c for command execution
  • Implemented stderr redirection to /dev/null when not in debug mode
  • Converted command arguments array to a single shell-escaped string using shlex.join
ramalama/model.py
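A minimal sketch of the wrapping described above, assuming the behavior in the reviewer's guide (the function name `wrap_command` is illustrative, not the actual code in ramalama/model.py):

```python
import shlex


def wrap_command(args, debug=False):
    """Join the argument list into one shell-escaped string (shlex.join,
    Python 3.8+) and run it under /bin/sh -c, discarding stderr unless
    debug mode is enabled."""
    cmd = shlex.join(args)
    if not debug:
        cmd += " 2> /dev/null"
    return ["/bin/sh", "-c", cmd]
```

With debug off, `["llama-run", "-m", "model.gguf"]` becomes `["/bin/sh", "-c", "llama-run -m model.gguf 2> /dev/null"]`; with debug on, the redirection is omitted so stderr stays visible.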

rhatdan commented 1 week ago

@ericcurtin @bmahabirbu PTAL

bmahabirbu commented 1 week ago

> @ericcurtin @bmahabirbu PTAL

Tested and works! Thank you!

rhatdan commented 1 week ago

Still does not work as well as I would hope.

 $ ./bin/ramalama --nocontainer run tiny
.....................................................................................
main: interactive mode on.

== Running in interactive mode. ==
 - Press Ctrl+C to interject at any time.
 - Press Return to return control to the AI.
 - To return control without starting a new line, end your input with '/'.
 - If you want to submit another line, end your input with '\'.

> 

Versus:

$ ./bin/ramalama run tiny

> 

Something about the container is causing llama-cli to not show the nice interactive prompt.