Closed thesanju closed 7 months ago
Should be pretty straightforward:

from langchain.chat_models import ChatOpenAI

gpt35 = ChatOpenAI(
    temperature=0.7,
    model_name="gpt-3.5-turbo",  # "gpt-3.5" alone is not a valid model name
)

Agent(
    # ...
    llm=gpt35,
)
import os
from crewai import Agent, Task, Crew, Process
from langchain.chat_models import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "sk-hBOcPKVc...P3aX50lUotMCDBhb"

# Define your agents with roles and goals
researcher = Agent(
    role="...",
    goal="...",
    backstory="...",
    verbose=True,
    llm=ChatOpenAI(
        temperature=0.8, model_name="gpt-3.5-turbo-1106"
    ),  # gpt-3.5-turbo-1106 is a chat model, so use ChatOpenAI from
        # langchain.chat_models (which is also what CrewAI uses by default)
)
Just set the env var OPENAI_MODEL_NAME=gpt-3.5-turbo
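For reference, a minimal sketch of the env-var approach. The variable name OPENAI_MODEL_NAME comes from the comment above; that CrewAI picks it up when no explicit llm= is passed to Agent is what the comment claims, not something verified here, and the key value is a placeholder.

```python
import os

# Set the model name before creating any agents; per the comment above,
# CrewAI reads OPENAI_MODEL_NAME when Agent(...) gets no explicit llm=.
os.environ["OPENAI_MODEL_NAME"] = "gpt-3.5-turbo"

# The API key is still required as usual (placeholder value here).
os.environ["OPENAI_API_KEY"] = "sk-..."

print(os.environ["OPENAI_MODEL_NAME"])  # → gpt-3.5-turbo
```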
This is the only one that worked for me.
This is giving me an error.