Google just released a 'grounding' feature, which basically enables Google Search in Gemini as an LLM tool.
```python
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["API_KEY"])

model = genai.GenerativeModel('models/gemini-1.5-flash-002')
response = model.generate_content(
    contents="Who won Wimbledon this year?",
    tools='google_search_retrieval',  # enables grounding via Google Search
)
print(response)
```
This sounds plug-and-play and worth testing.
That way, if the user wants to generate a podcast from a topic, we can simply run a grounded search via Gemini!
LangChain is not yet integrated with Gemini's grounded search, so we either wait or implement something a bit ugly (a rough sketch follows below):

if topic search: use Gemini directly
else: use the LangChain framework
...
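A minimal sketch of how that branch might look, assuming a hypothetical dispatcher in front of the existing pipeline (function names like `get_source_content` and `grounded_topic_summary` are placeholders, not current Podcastfy APIs):

```python
import os

import google.generativeai as genai


def grounded_topic_summary(topic: str) -> str:
    # Hypothetical helper: ask Gemini for a grounded, up-to-date summary of the topic.
    genai.configure(api_key=os.environ["API_KEY"])
    model = genai.GenerativeModel('models/gemini-1.5-flash-002')
    response = model.generate_content(
        contents=f"Summarize the most recent information published on the web about: {topic}",
        tools='google_search_retrieval',
    )
    return response.text


def get_source_content(topic: str | None = None, urls: list[str] | None = None) -> str:
    # Hypothetical dispatch: a topic goes through Gemini grounded search,
    # everything else falls back to the existing LangChain-based extraction.
    if topic:
        return grounded_topic_summary(topic)
    if urls:
        raise NotImplementedError("plug in the existing LangChain content extraction here")
    raise ValueError("Provide either a topic or a list of urls")
```

The grounded summary would then be handed to the existing transcript and audio generation steps unchanged.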
Podcast generation from a 'topic' is now available:
Users can generate a podcast from a specific topic of interest, e.g. "Latest News in U.S. Politics" or "Modern art in the 1920s". Podcastfy will generate a podcast grounded in real-time information from the most recent content published on the web about that topic.
```python
generate_podcast(topic="Latest news about OpenAI")
```
User provides a topic: "US Elections"
Podcastfy returns: a podcast on the recent US elections, based on an internet search.
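For reference, a client-side usage sketch, assuming the existing `podcastfy.client.generate_podcast` entry point gains an optional `topic` parameter and keeps returning the path of the generated audio file (both are assumptions here):

```python
from podcastfy.client import generate_podcast

# Assumption: `topic` is accepted alongside the existing URL-based inputs and
# the call returns the path of the generated audio file.
audio_file = generate_podcast(topic="US Elections")
print(f"Podcast saved to: {audio_file}")
```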
Proposed solution:
https://github.com/stanford-oval/storm - An LLM-powered knowledge curation system that researches a topic and generates a full-length report with citations.
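If STORM (or a similar research step) is adopted, one possible wiring, sketched here with placeholder code rather than STORM's actual API, is to have it produce the grounded report and then reuse Podcastfy's existing text-to-podcast path:

```python
from podcastfy.client import generate_podcast  # existing Podcastfy entry point


def storm_research(topic: str) -> str:
    # Placeholder only: a real integration would call STORM
    # (https://github.com/stanford-oval/storm) to research the topic and
    # return a citation-backed report as plain text.
    raise NotImplementedError("wire STORM up here")


def podcast_from_topic(topic: str) -> str:
    # Hypothetical glue: research the topic first, then feed the report into
    # the existing podcast generation. Passing raw text to generate_podcast
    # is an assumption; if unsupported, the report could be written to a file
    # and fed through one of the existing input paths instead.
    report = storm_research(topic)
    return generate_podcast(text=report)
```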