Closed: kklemon closed this issue 3 months ago
Example:
```python
from penai.llm.conversation import Conversation, HumanMessageBuilder
from penai.llm.llm_model import RegisteredLLM

if __name__ == '__main__':
    system_prompt = "You are a weather analysis expert system which always responds with structured JSON data."
    conversation = Conversation(RegisteredLLM.GEMINI_15_PRO, system_prompt=system_prompt)

    image_url = "https://th.bing.com/th/id/R.0fac916f221179dbeb0f8aaf53f40ed6?rik=oxgjA71TG9i%2fUg&riu=http%3a%2f%2fyesofcorsa.com%2fwp-content%2fuploads%2f2017%2f11%2fRainy-Weather-Best-Wallpaper.jpg&ehk=68czJLwKv89y%2flgNsrCRMAuxib4SPwzE1lrMPvvvOV4%3d&risl=&pid=ImgRaw&r=0"
    response = conversation.query(
        HumanMessageBuilder("Describe the weather in this image.")
        .with_image_from_url(image_url)
        .build()
    )
    response2 = conversation.query("When might one encounter such weather in central Europe?")
```
Output:
```json
{
    "weather": "Rainy",
    "details": "Heavy rain is falling, as evidenced by the visible droplets and the use of an umbrella by a pedestrian."
}
```

```json
{
    "season": "Transitional",
    "details": "This type of weather in central Europe is most common during the transitional seasons of spring (March-May) and autumn (September-November). These seasons are characterized by fluctuating temperatures and passing weather fronts that can bring periods of rain."
}
```
Add support for system prompts to our (V)LM API, and decide how to handle models that do not support a system prompt.
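One possible fallback for models without system-prompt support is to fold the system prompt into the first user message before sending the conversation to the model. The sketch below is only an illustration of that idea; the `Message` class, `prepare_messages` function, and the `supports_system_prompt` flag are hypothetical and not part of the penai API.

```python
from dataclasses import dataclass


@dataclass
class Message:
    role: str  # "system", "user", or "assistant"
    content: str


def prepare_messages(messages: list[Message], supports_system_prompt: bool) -> list[Message]:
    """Pass messages through unchanged if the model accepts a system role;
    otherwise merge all system content into the first user message."""
    if supports_system_prompt:
        return messages
    system_parts = [m.content for m in messages if m.role == "system"]
    rest = [m for m in messages if m.role != "system"]
    if not system_parts:
        return rest
    prefix = "\n".join(system_parts)
    if rest and rest[0].role == "user":
        # Prepend the system prompt to the first user turn.
        merged = Message("user", f"{prefix}\n\n{rest[0].content}")
        return [merged] + rest[1:]
    # No user message to merge into: emit the system prompt as its own user turn.
    return [Message("user", prefix)] + rest


msgs = [
    Message("system", "Respond with JSON."),
    Message("user", "Describe the weather."),
]
out = prepare_messages(msgs, supports_system_prompt=False)
```

A wrapper like this would let the `Conversation` class accept a `system_prompt` uniformly while degrading gracefully for models that reject a system role.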