Omochice / dotfiles

my dotfiles

chore(deps): update dependency ollama/ollama to v0.3.9 #740

Closed — renovate[bot] closed this 2 weeks ago

renovate[bot] commented 2 weeks ago

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| ollama/ollama | patch | `v0.3.6` -> `v0.3.9` |

Release Notes

ollama/ollama (ollama/ollama)

### [`v0.3.9`](https://togithub.com/ollama/ollama/releases/tag/v0.3.9)

[Compare Source](https://togithub.com/ollama/ollama/compare/v0.3.8...v0.3.9)

#### What's Changed

- Fixed error that would occur when running Ollama on Linux ARM
- Ollama will now show an improved error when attempting to run unsupported models
- Fixed issue where Ollama would not autodetect the chat template for Llama 3.1 models
- `OLLAMA_HOST` will now work with URLs that contain paths

#### New Contributors

- [@bryanhonof](https://togithub.com/bryanhonof) made their first contribution in [https://github.com/ollama/ollama/pull/6074](https://togithub.com/ollama/ollama/pull/6074)

**Full Changelog**: https://github.com/ollama/ollama/compare/v0.3.8...v0.3.9

### [`v0.3.8`](https://togithub.com/ollama/ollama/releases/tag/v0.3.8)

[Compare Source](https://togithub.com/ollama/ollama/compare/v0.3.7-rc1...v0.3.8)

#### What's Changed

- Fixed error where the `ollama` CLI couldn't be found on the path when upgrading Ollama on Windows

#### New Contributors

- [@seankhatiri](https://togithub.com/seankhatiri) made their first contribution in [https://github.com/ollama/ollama/pull/6530](https://togithub.com/ollama/ollama/pull/6530)

**Full Changelog**: https://github.com/ollama/ollama/compare/v0.3.7...v0.3.8

### [`v0.3.7`](https://togithub.com/ollama/ollama/releases/tag/v0.3.7)

[Compare Source](https://togithub.com/ollama/ollama/compare/v0.3.6...v0.3.7-rc1)

#### New Models

- [Hermes 3](https://ollama.com/library/hermes3): Hermes 3 is the latest version of the flagship Hermes series of LLMs by Nous Research, which includes support for tool calling.
- [Phi 3.5](https://ollama.com/library/phi3.5): A lightweight AI model with 3.8 billion parameters whose performance surpasses similarly sized and larger models.
- [SmolLM](https://ollama.com/library/smollm): A family of small models with 135M, 360M, and 1.7B parameters, trained on a new high-quality dataset.

#### What's Changed

- CUDA 12 support: improving performance by up to 10% on newer NVIDIA GPUs
- Improved performance of `ollama pull` and `ollama push` on slower connections
- Fixed issue where setting `OLLAMA_NUM_PARALLEL` would cause models to be reloaded on lower VRAM systems
- Ollama on Linux is now distributed as a `tar.gz` file, which contains the `ollama` binary along with required libraries.

#### New Contributors

- [@pamelafox](https://togithub.com/pamelafox) made their first contribution in [https://github.com/ollama/ollama/pull/6345](https://togithub.com/ollama/ollama/pull/6345)
- [@eust-w](https://togithub.com/eust-w) made their first contribution in [https://github.com/ollama/ollama/pull/5964](https://togithub.com/ollama/ollama/pull/5964)

**Full Changelog**: https://github.com/ollama/ollama/compare/v0.3.6...v0.3.7
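The `OLLAMA_HOST` fix in v0.3.9 is mainly relevant when Ollama sits behind a reverse proxy under a path prefix. A minimal sketch of the now-supported configuration — the host name and path here are illustrative, not taken from this PR:

```shell
# Before v0.3.9, a path component in OLLAMA_HOST was not handled correctly;
# as of v0.3.9 the CLI and client accept a URL with a path prefix.
# (ai.example.com/ollama is a hypothetical reverse-proxy location.)
export OLLAMA_HOST="https://ai.example.com/ollama"

# Subsequent CLI calls such as `ollama list` should then reach the
# proxied API under that prefix (e.g. .../ollama/api/tags).
echo "OLLAMA_HOST is set to: $OLLAMA_HOST"
```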

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.



This PR was generated by Mend Renovate. View the repository job log.

coderabbitai[bot] commented 2 weeks ago

Walkthrough

The change updates the version of the `ollama/ollama` package pinned in the configuration file `config/aqua/aqua.yaml` from `v0.3.6` to `v0.3.9`. The new version brings upstream bug fixes and performance improvements (see the release notes above), which may affect components that depend on the `ollama` binary.

Changes

| File | Change Summary |
|---|---|
| `config/aqua/aqua.yaml` | Updated `ollama/ollama` package version from `v0.3.6` to `v0.3.9` |
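The diff itself is a one-line version bump. In aqua's configuration format, packages are pinned as `name@version`, so the changed entry looks roughly like this — a sketch based on aqua's conventions, not the actual file contents (the registry `ref` is illustrative):

```yaml
# config/aqua/aqua.yaml (sketch; surrounding entries omitted)
registries:
  - type: standard
    ref: v4.0.0  # illustrative registry version
packages:
  - name: ollama/ollama@v0.3.9  # was: ollama/ollama@v0.3.6
```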

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Application
    participant Ollama

    User->>Application: Request functionality
    Application->>Ollama: Call functions (v0.3.6)
    Ollama-->>Application: Respond with data
    Application-->>User: Return response

    Note over Application, Ollama: Version updated to v0.3.9
    User->>Application: Request functionality
    Application->>Ollama: Call functions (v0.3.9)
    Ollama-->>Application: Respond with updated data
    Application-->>User: Return response
```

Poem

🐇 In the meadow where bunnies play,
A new version hops in today!
From 0.3.6 to 0.3.9,
A leap of joy, oh how divine!
With every change, we dance and cheer,
For better code brings us near! 🌼


Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on your favorite social media?

Tips

### Chat

There are 3 ways to chat with [CodeRabbit](https://coderabbit.ai):

- Review comments: Directly reply to a review comment made by CodeRabbit. Examples:
  - `I pushed a fix in commit .`
  - `Generate unit testing code for this file.`
  - `Open a follow-up GitHub issue for this discussion.`
- Files and specific lines of code (under the "Files changed" tab): Tag `@coderabbitai` in a new review comment at the desired location with your query. Examples:
  - `@coderabbitai generate unit testing code for this file.`
  - `@coderabbitai modularize this function.`
- PR comments: Tag `@coderabbitai` in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
  - `@coderabbitai generate interesting stats about this repository and render them as a table.`
  - `@coderabbitai show all the console.log statements in this repository.`
  - `@coderabbitai read src/utils.ts and generate unit testing code.`
  - `@coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.`
  - `@coderabbitai help me debug CodeRabbit configuration file.`

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

### CodeRabbit Commands (Invoked using PR comments)

- `@coderabbitai pause` to pause the reviews on a PR.
- `@coderabbitai resume` to resume the paused reviews.
- `@coderabbitai review` to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
- `@coderabbitai full review` to do a full review from scratch and review all the files again.
- `@coderabbitai summary` to regenerate the summary of the PR.
- `@coderabbitai resolve` to resolve all the CodeRabbit review comments.
- `@coderabbitai configuration` to show the current CodeRabbit configuration for the repository.
- `@coderabbitai help` to get help.

### Other keywords and placeholders

- Add `@coderabbitai ignore` anywhere in the PR description to prevent this PR from being reviewed.
- Add `@coderabbitai summary` to generate the high-level summary at a specific location in the PR description.
- Add `@coderabbitai` anywhere in the PR title to generate the title automatically.

### CodeRabbit Configuration File (`.coderabbit.yaml`)

- You can programmatically configure CodeRabbit by adding a `.coderabbit.yaml` file to the root of your repository.
- Please see the [configuration documentation](https://docs.coderabbit.ai/guides/configure-coderabbit) for more information.
- If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: `# yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json`

### Documentation and Community

- Visit our [Documentation](https://coderabbit.ai/docs) for detailed information on how to use CodeRabbit.
- Join our [Discord Community](https://discord.com/invite/GsXnASn26c) to get help, request features, and share feedback.
- Follow us on [X/Twitter](https://twitter.com/coderabbitai) for updates and announcements.