ClinicianFOCUS / clinicianfocus-installer

Installer tool that helps set up the software and tools under development in this applied research project.
GNU Affero General Public License v3.0

Replace Mistral model with Gemma model in NSIS installer #28

Closed · ItsSimko closed this 2 weeks ago

ItsSimko commented 2 weeks ago

Summary by Sourcery

Replace the Mistral model with the Gemma model in the NSIS installer, updating the installation process to use a PowerShell script for managing the Gemma model on Ollama.


sourcery-ai[bot] commented 2 weeks ago

Reviewer's Guide by Sourcery

This PR replaces the Mistral LLM model with the Gemma model in the NSIS installer. The implementation changes how the model is downloaded and initialized by removing the direct model file downloads and .env configuration, replacing it with a PowerShell script that uses Ollama to pull and run the Gemma model. The section size for the LLM model has been reduced to reflect Gemma's smaller size requirements.

Sequence diagram for Gemma model setup in NSIS installer

```mermaid
sequenceDiagram
    participant User
    participant NSIS_Installer
    participant PowerShell
    participant Docker
    participant Ollama

    User->>NSIS_Installer: Start installation
    NSIS_Installer->>NSIS_Installer: Check LLM checkbox state
    alt LLM checkbox checked
        NSIS_Installer->>Docker: Wait for container readiness
        NSIS_Installer->>PowerShell: Create and run docker_command.ps1
        PowerShell->>Ollama: Pull Gemma model
        Ollama-->>PowerShell: Model pulled
        PowerShell->>Ollama: Run Gemma model
        Ollama-->>PowerShell: Model running
        PowerShell-->>User: Gemma installed and launched
    end
```
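
The flow above suggests a generated script along these lines. This is a hypothetical sketch, not the actual contents of `docker_command.ps1`; the container name `ollama`, the model tag `gemma`, and the retry logic are assumptions inferred from the diagram and change list:

```powershell
# docker_command.ps1 -- hypothetical sketch of the script the installer
# generates. Container name "ollama" and model tag "gemma" are assumptions,
# not confirmed by the PR diff.

# Wait until the Ollama container is ready to accept commands.
$retries = 30
while ($retries -gt 0) {
    docker exec ollama ollama list *> $null
    if ($LASTEXITCODE -eq 0) { break }
    Start-Sleep -Seconds 2
    $retries--
}

Write-Host "Pulling the Gemma model (about 2.8 GB)..."
docker exec ollama ollama pull gemma

Write-Host "Starting the Gemma model..."
docker exec -d ollama ollama run gemma

Write-Host "Gemma model installed and launched."
```

Running `ollama list` as a readiness probe avoids starting the multi-gigabyte pull before the container can respond.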

Class diagram for NSIS installer changes

```mermaid
classDiagram
    class NSIS_Installer {
        -LLM_Installed: int
        -Checkbox_LLM: int
        -Checkbox_Speech2Text: int
        -Checkbox_FreeScribe: int
        +ModelPageLeave()
        +CheckAndStartDocker()
    }
    class PowerShellScript {
        +CreateScript()
        +RunScript()
        +DeleteScript()
    }
    NSIS_Installer --> PowerShellScript : uses
    note for NSIS_Installer "Handles installation process and model setup"
    note for PowerShellScript "Manages PowerShell script creation and execution"
```

File-Level Changes

| Change | Details | Files |
|--------|---------|-------|
| Remove Mistral model download and configuration logic | • Remove direct download of Mistral model and template files<br>• Remove .env file creation for Mistral model configuration<br>• Remove related environment variable settings | install.nsi |
| Implement Gemma model setup using Ollama | • Add PowerShell script generation for Gemma model setup<br>• Add Docker commands to pull and run the Gemma model through Ollama<br>• Add user feedback messages during model installation<br>• Reduce the model section size from 8 GB to 2.8 GB | install.nsi |
| Improve installation flow and error handling | • Add a wait step for container readiness before model setup<br>• Add cleanup of the temporary PowerShell script<br>• Update user messages to reflect the new model setup process | install.nsi |
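
The changes in `install.nsi` could follow the pattern below. This is an illustrative sketch only; the section name, paths, and file contents are hypothetical, not taken from the actual diff:

```nsis
; Hypothetical sketch of the Gemma setup section in install.nsi.
Section "LLM Model" SEC_LLM
  AddSize 2936013  ; ~2.8 GB in KB, reflecting Gemma's smaller footprint (was 8 GB for Mistral)

  ; Generate the temporary PowerShell script that drives Ollama via Docker.
  FileOpen $0 "$TEMP\docker_command.ps1" w
  FileWrite $0 'docker exec ollama ollama pull gemma$\r$\n'
  FileWrite $0 'docker exec -d ollama ollama run gemma$\r$\n'
  FileClose $0

  ; Run the script and clean it up afterwards.
  nsExec::ExecToLog 'powershell -ExecutionPolicy Bypass -File "$TEMP\docker_command.ps1"'
  Delete "$TEMP\docker_command.ps1"
SectionEnd
```

Writing the script to `$TEMP` and deleting it after execution matches the "cleanup of the temporary PowerShell script" item above.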

yingbull commented 2 weeks ago

We have support for just llama.cpp and a model without ollama installed as well though, right?

ItsSimko commented 2 weeks ago

> We have support for just llama.cpp and a model without ollama installed as well though, right?

The Gemma Q8 model for llama.cpp is installed through the FreeScribe installer, which is pulled from the latest release on the FreeScribe repo.