-
```python
from crewai import Agent, Task, Crew
from google.cloud import bigquery
from langchain_google_vertexai import VertexAI
from crewai_tools import SerperDevTool

llm = VertexAI(
    temperature=0.0…
```
-
Bump the default available models to Gemini 1.5 Pro (Google) and Gemini 1.5 Flash (Vertex AI)
Providers:
- https://github.com/stanfordnlp/dspy/blob/main/dsp/modules/googlevertexai.py
- https://github.…
-
Hi team,
I am trying to use context caching in Gemini through vertexai; however, when I try to create the content to cache, it fails because it can't find the stable model.
I can see that context c…
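Context caching on Vertex AI is only available for pinned model revisions (for example `gemini-1.5-pro-001`), so passing the unversioned alias `gemini-1.5-pro` produces a model-not-found error. A minimal sketch of a guard for this, where `pin_model_version` is a hypothetical helper and `-001` is an assumed default revision:

```python
# Hypothetical helper: context caching requires a pinned, versioned model
# name; the unversioned alias is rejected. "-001" is an assumed default
# revision for illustration only.
def pin_model_version(model_name: str, default_rev: str = "001") -> str:
    """Append a revision suffix if the model name is an unversioned alias."""
    if model_name.rsplit("-", 1)[-1].isdigit():
        return model_name  # already pinned, e.g. "gemini-1.5-pro-001"
    return f"{model_name}-{default_rev}"

# The pinned name would then be passed to the caching API, e.g.:
#   vertexai.preview.caching.CachedContent.create(
#       model_name=pin_model_version("gemini-1.5-pro"), contents=..., ttl=...)
print(pin_model_version("gemini-1.5-pro"))      # -> gemini-1.5-pro-001
print(pin_model_version("gemini-1.5-pro-001"))  # unchanged
```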
-
### Describe your configuration
- Extension name: **Multimodal Tasks with the Gemini API**
- Extension version: **1.0.0**
- Configuration values:
- **Gemini API Provider**: vertex-ai
- …
-
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the com…
-
Hello,
There are many instances where you would like to use a custom-trained LLM.
GCP and Vertex AI provide a very easy and straightforward way of generating a tuned version of a published…
-
**Client**
vertexai/genai
**Environment**
Local running on Mac OS X
**Go Environment**
```
$ go version
go version go1.22.2 darwin/arm64
$ go env
GO111MODULE='on'
GOARCH='arm64'
GOBIN=''…
```
-
This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too oft…
-
Vertex AI supports custom models by fine-tuning PaLM 2, and the Python library supports accessing these custom models as shown below:
```python
import vertexai
from vertexai.language_models impo…
```
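The excerpt above is cut off; a hedged sketch of the access pattern it appears to describe, where the project, location, and model ID are placeholders rather than values from the issue:

```python
# Hedged sketch: a tuned PaLM 2 model is addressed by its full resource name.
def tuned_model_resource(project: str, location: str, model_id: str) -> str:
    """Build the resource name that TextGenerationModel.get_tuned_model expects."""
    return f"projects/{project}/locations/{location}/models/{model_id}"

# With the google-cloud-aiplatform SDK installed, usage would look roughly like:
#   from vertexai.language_models import TextGenerationModel
#   model = TextGenerationModel.get_tuned_model(
#       tuned_model_resource("my-project", "us-central1", "1234567890"))
#   print(model.predict("Hello").text)
print(tuned_model_resource("my-project", "us-central1", "1234567890"))
```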
-
#### Environment details
- OS: Dockerfile base image: `python:3.11`
- Python version: 3.11
- pip version: 24.0
- `google-auth` version: 2.29.0
#### Description
We have created a si…