crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

Together.ai JSON mode and Function Calling #205

Closed Kolaposki closed 1 month ago

Kolaposki commented 8 months ago

TogetherAI just announced JSON mode and function calling for their models. It currently supports Mixtral, Mistral, and CodeLlama. How do we implement this, using either LangChain or CrewAI?

This is how I use their model in my agent.

    # ChatOpenAI comes from langchain_openai; the openai package itself
    # has no ChatOpenAI class.
    from langchain_openai import ChatOpenAI

    self.MistralLLM = ChatOpenAI(
        base_url="https://api.together.xyz/v1",
        api_key=os.getenv('TOGETHER_APIKEY'),
        temperature=0.75,
        model="mistralai/Mistral-7B-Instruct-v0.2")

It's hard to get these models to follow even a simple instruction like "return JSON" consistently, even when you scream at them in the system prompt. Does anyone know how to enforce this, or has anyone found a hacky workaround?

JakobPCoder commented 7 months ago

Add response_format={"type": "json_object"} to your arguments. I don't think it works with Mistral Instruct v0.2; v0.1 is fine. It's all in the link you posted: https://www.together.ai/blog/function-calling-json-mode.

JSON mode example:

    chat_completion = client.chat.completions.create(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        response_format={
            "type": "json_object",
            "schema": User.model_json_schema(),
        },
        messages=[
            {"role": "system", "content": "You are a helpful assistant that answers in JSON."},
            {"role": "user", "content": "Create a user named Alice, who lives in 42, Wonderland Avenue, Wonderland city, 91234 Dreamland."},
        ],
    )
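The User model in the example above is only referenced, never defined. A minimal sketch of what it could look like with pydantic (the field names here are assumptions; use whatever fields your schema needs) and how to validate the model's JSON reply before trusting it downstream:

```python
import json
from pydantic import BaseModel

class User(BaseModel):
    # Field names are assumptions; adapt to your own schema.
    name: str
    address: str

# Passing User.model_json_schema() as the "schema" in response_format
# constrains the completion to this shape.

# Validate a (sample) reply before using it downstream:
raw = '{"name": "Alice", "address": "42, Wonderland Avenue, Wonderland city, 91234 Dreamland"}'
user = User.model_validate_json(raw)
print(user.name)  # -> Alice
```

If the model drifts from the schema anyway, model_validate_json raises a ValidationError, which gives you a concrete hook for a retry loop.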

Function calling example:

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {
                            "type": "string",
                            "enum": ["celsius", "fahrenheit"],
                        },
                    },
                },
            },
        }
    ]

Generate

    response = client.chat.completions.create(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What is the current temperature of New York?"},
        ],
        tools=tools,
        tool_choice="auto",
    )
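The model's reply then carries the call as structured tool_calls rather than free text, and it is up to you to dispatch it to a real function. A sketch of that dispatch step, using a hand-built tool call in the OpenAI-compatible shape Together returns (the weather stub and its return values are illustrative, not a real API):

```python
import json

def get_current_weather(location, unit="celsius"):
    # Stub: a real implementation would query a weather API.
    return {"location": location, "temperature": 21, "unit": unit}

# Shape of one OpenAI-compatible tool call; in a live response this would
# come from response.choices[0].message.tool_calls.
tool_call = {
    "function": {
        "name": "get_current_weather",
        "arguments": '{"location": "New York, NY", "unit": "celsius"}',
    }
}

# Dispatch by name and feed the parsed JSON arguments to the local function.
available = {"get_current_weather": get_current_weather}
fn = available[tool_call["function"]["name"]]
result = fn(**json.loads(tool_call["function"]["arguments"]))
print(result)
```

The result would normally be sent back to the model in a follow-up "tool" message so it can phrase the final answer.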

xitex commented 6 months ago

How do we implement this in CrewAI? Where should it be added if all we have are agent and task settings?
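CrewAI accepts a LangChain chat model as the agent's llm, so one approach (an untested sketch, not confirmed by the maintainers) is to bake response_format into the client via model_kwargs and hand that client to the Agent. Only the kwargs dict below is runnable as-is; the wiring is shown in comments because it needs the langchain-openai and crewai packages plus live credentials:

```python
import os

# Extra arguments forwarded to Together's OpenAI-compatible endpoint,
# enabling JSON mode on every call the agent makes.
together_kwargs = {
    "response_format": {"type": "json_object"},
}

# Hypothetical wiring (requires langchain-openai and crewai installed):
#
# from langchain_openai import ChatOpenAI
# from crewai import Agent
#
# llm = ChatOpenAI(
#     base_url="https://api.together.xyz/v1",
#     api_key=os.getenv("TOGETHER_APIKEY"),
#     model="mistralai/Mixtral-8x7B-Instruct-v0.1",
#     model_kwargs=together_kwargs,
# )
# agent = Agent(role="analyst", goal="...", backstory="...", llm=llm)

print(together_kwargs["response_format"]["type"])  # -> json_object
```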

github-actions[bot] commented 1 month ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] commented 1 month ago

This issue was closed because it has been stalled for 5 days with no activity.