Open · MrCsabaToth opened this issue 3 months ago
We might have to wait, because we already seem to use too many (?) functions, and the Open Meteo integration (#40 and #41) will add several more.
Google finally released the -002 stable production models of gemini-1.5-flash and gemini-1.5-pro. Since the Dart Firebase generative AI package relies on stable versions only, we can now leverage updated capabilities compared to the May -001 releases. For more see https://www.linkedin.com/posts/chandraai_gemini-vertex-tpu-activity-7244512521435406336-vo5I and https://www.linkedin.com/posts/chandraai_gemini-genai-aiforbusiness-activity-7244548489437704193-qlvN
The new model is shaky with tools and its behavior has changed. We'll test more and wait before moving in this direction.
This came to mind while attending https://cloudonair.withgoogle.com/events/key-prompt-engineering-techniques-with-anthropics-claude-on-vertex-ai
In an agentic AI setup where RAG retrieval is implemented as a tool, the LLM can decide whether a particular question needs personalization data or chat history data at all. This way we can save unnecessary RAG round trips; currently we perform both retrievals unconditionally. A rough sketch of what that could look like is below.
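A minimal sketch, assuming the google_generative_ai-style Dart API (which the Firebase package mirrors). The tool names (`retrievePersonalization`, `retrieveChatHistory`), their schemas, and the retrieval stubs are hypothetical placeholders, not our actual pipeline:

```dart
import 'dart:io';

import 'package:google_generative_ai/google_generative_ai.dart';

// Hypothetical retrieval stubs standing in for our RAG pipelines.
// In the real app these would query the vector store / chat history.
Future<Map<String, Object?>> retrievePersonalization(String query) async =>
    {'snippets': <String>[]};
Future<Map<String, Object?>> retrieveChatHistory(String query) async =>
    {'snippets': <String>[]};

Future<void> main() async {
  final model = GenerativeModel(
    model: 'gemini-1.5-pro-002',
    apiKey: Platform.environment['GEMINI_API_KEY']!,
    tools: [
      Tool(functionDeclarations: [
        // Expose each RAG source as a callable tool, so the model only
        // triggers retrieval when the question actually needs it.
        FunctionDeclaration(
          'retrievePersonalization',
          'Look up user preferences and profile facts relevant to the query.',
          Schema.object(properties: {
            'query': Schema.string(description: 'The user question to match.'),
          }, requiredProperties: ['query']),
        ),
        FunctionDeclaration(
          'retrieveChatHistory',
          'Look up earlier conversation turns relevant to the query.',
          Schema.object(properties: {
            'query': Schema.string(description: 'The user question to match.'),
          }, requiredProperties: ['query']),
        ),
      ]),
    ],
  );

  final chat = model.startChat();
  var response = await chat.sendMessage(
      Content.text('What hiking gear did I say I prefer?'));

  // If the model decided it needs RAG data, answer its function calls;
  // for questions that need neither source it answers directly and we
  // skip retrieval entirely.
  while (response.functionCalls.isNotEmpty) {
    final responses = <FunctionResponse>[];
    for (final call in response.functionCalls) {
      final query = call.args['query'] as String? ?? '';
      Map<String, Object?> result;
      if (call.name == 'retrievePersonalization') {
        result = await retrievePersonalization(query);
      } else if (call.name == 'retrieveChatHistory') {
        result = await retrieveChatHistory(query);
      } else {
        result = {'error': 'unknown tool ${call.name}'};
      }
      responses.add(FunctionResponse(call.name, result));
    }
    response = await chat.sendMessage(Content.functionResponses(responses));
  }

  print(response.text);
}
```

With this shape, a generic question that needs neither source would typically produce no function calls, so no RAG round trip at all; a question about the user's preferences would trigger only the personalization tool. How reliably -002 handles these declarations is exactly what we still need to test.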