-
### What happened?
When we call Gemini models through Vertex AI, no usage entry is recorded in the transaction collection.
### Steps to Reproduce
1. Deploy Gemini Pro and/or Gemini Pro Vision in Vertex AI.
2…
-
Use Gemini to first collect data through a human-like questionnaire, then have it return `JSON` to be sent to the backend.
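The backend-facing half of that flow usually comes down to pulling a clean JSON object out of the model's reply, which often arrives wrapped in a Markdown code fence. A minimal sketch of that parsing step (the reply string is a made-up example, not real model output):

```python
import json

def parse_model_json(reply: str) -> dict:
    """Extract the JSON object from a model reply, tolerating ```json fences."""
    text = reply.strip()
    if text.startswith("```"):
        # Drop the opening fence line (with its optional "json" tag) and the closing fence.
        text = text.split("\n", 1)[1].rsplit("```", 1)[0]
    return json.loads(text)

reply = '```json\n{"name": "Ada", "email": "ada@example.com"}\n```'
data = parse_model_json(reply)  # {'name': 'Ada', 'email': 'ada@example.com'}
```

The parsed dict can then be posted to the backend as the questionnaire result.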
-
Aider version: 0.58.1
Python version: 3.12.3
Platform: macOS-12.7.6-x86_64-i386-64bit
Python implementation: CPython
Virtual environment: No
OS: Darwin 21.6.0 (64bit)
Git version: git version 2.…
-
### Feature Description
Some Gemini models, such as Gemini 1.5 Flash, do support the function-calling API, so please add this feature to the Gemini LLMs.
### Reason
_No response_
### Value of Feature
…
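For context, the function-calling pattern the request refers to generally works like this: the app registers callable functions, the model emits a structured call, and the app executes it and returns the result. A schematic sketch of the app side, with a made-up tool registry rather than the actual Gemini SDK:

```python
import json

# Hypothetical tool registry; the name and schema are illustrative only.
TOOLS = {
    "get_weather": lambda city: {"city": city, "temp_c": 21},
}

def execute_tool_call(call_json: str) -> dict:
    """Run a model-issued call shaped like {"name": ..., "args": {...}}."""
    call = json.loads(call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["args"])

# A model with function calling enabled would emit something like this:
result = execute_tool_call('{"name": "get_weather", "args": {"city": "Paris"}}')
```

The result dict would then be sent back to the model so it can compose its final answer.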
-
Google's Gemini bot is not working at all. Below is a picture of the response we get.
![image](https://github.com/user-attachments/assets/83030946-9d80-4dec-bc45-523f878ebe86)
-
### Description of the feature request:
Currently `google.generativeai.upload_file` is a blocking function; it would be nice if there were async versions of this method, along with other I/O-related…
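Until native async methods exist, a common stopgap is to push the blocking call onto a worker thread with `asyncio.to_thread`. A minimal sketch, using a stand-in function in place of the real `upload_file`:

```python
import asyncio

# Stand-in for the blocking google.generativeai.upload_file call.
def upload_file_blocking(path: str) -> str:
    return f"uploaded:{path}"

async def upload_file_async(path: str) -> str:
    # Run the blocking upload in a thread so the event loop stays responsive.
    return await asyncio.to_thread(upload_file_blocking, path)

result = asyncio.run(upload_file_async("notes.txt"))  # "uploaded:notes.txt"
```

This keeps other coroutines running during the upload, though a true async client would still be preferable for connection reuse and cancellation.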
-
The Google Gemini Pro model is not working with Semantic Kernel and function calling.
-
Implement the Google Gemini API to facilitate adding tasks with specific due dates.
-
LiteLLM already supports Gemini, so it's probably already doable. It would be nice to support it OOTB, as Gemini has a large context window.
-
RAG is dying as context windows keep growing exponentially.
With 1M-token context windows, who needs RAG? Caching course material and doing vector search seems like the way to go.
We will…
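The "cache material and vector-search it" idea can be illustrated in a few lines: embed the cached chunks once, then rank them by cosine similarity against a query embedding. A toy sketch with made-up 3-dimensional embeddings standing in for a real embedding model:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Cached course material with toy embeddings (illustrative values only).
CACHE = {
    "lecture 1": [1.0, 0.0, 0.0],
    "lecture 2": [0.0, 1.0, 0.0],
}

def best_match(query_vec: list[float]) -> str:
    # Return the cached chunk whose embedding is most similar to the query.
    return max(CACHE, key=lambda k: cosine(query_vec, CACHE[k]))

best = best_match([0.9, 0.1, 0.0])  # "lecture 1"
```

With a 1M-token window, the retrieved chunks (or even the whole cache) can simply be placed in context.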