Closed JaDenis closed 9 months ago
You should already be able to use it:

```shell
pip install --upgrade --quiet gpt4all > /dev/null
```

```python
from langchain_community.llms import GPT4All

local_path = (
    "./models/ggml-gpt4all-l13b-snoozy.bin"  # replace with your desired local file path
)
llm = GPT4All(model=local_path, verbose=True)

# Now you can pass this to your agent definition
Agent(
    # ...
    llm=llm,
)
# rest of your code
```
I really meant gpt4free (`import g4f`), not GPT4All. I just now managed to make it work like this:

```python
import g4f
from langchain_g4f import G4FLLM
from langchain.llms.base import LLM
from crewai import Agent, Task, Crew, Process

llm: LLM = G4FLLM(model=g4f.models.default)

agent = Agent(
    # ...
    llm=llm,
)
```

You rock anyway, dude! Great project.
Hi @JaDenis, did you finally integrate g4f? Can you make a PR, or explain how we can use it? I see g4f is updated frequently, but why are you using langchain_g4f?
@pencilvania This should work:

```python
from typing import Any, List, Optional

import g4f
from langchain.llms.base import LLM
from crewai import Agent


class G4FLanguageModel(LLM):
    model_name: str = "gpt-3.5-turbo"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        response = g4f.ChatCompletion.create(
            model=self.model_name,
            messages=[{"role": "user", "content": prompt}],
        )
        return response

    @property
    def _identifying_params(self) -> dict[str, Any]:
        return {"model_name": self.model_name}

    @property
    def _llm_type(self) -> str:
        return "g4f"


g4f_model = G4FLanguageModel()
agent = Agent(
    llm=g4f_model,
)
```
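Since g4f backends can be flaky, it may help to check the wrapper logic offline before wiring it into an agent. Below is a minimal sketch, standard library only: `build_messages` mirrors the `messages` payload the g4f call above constructs, and `FakeChatBackend` is a made-up stand-in for `g4f.ChatCompletion`, so the call path can be exercised without a network round-trip. All names here are hypothetical, not part of crewai or g4f.

```python
from typing import List, Optional


def build_messages(prompt: str) -> List[dict]:
    """Build the chat payload the same way the g4f wrapper above does."""
    return [{"role": "user", "content": prompt}]


class FakeChatBackend:
    """Stand-in for g4f.ChatCompletion: echoes the last user message."""

    @staticmethod
    def create(model: str, messages: List[dict]) -> str:
        return f"echo: {messages[-1]['content']}"


def call_llm(prompt: str, backend=FakeChatBackend, model: str = "gpt-3.5-turbo") -> str:
    """Mirror of G4FLanguageModel._call, with the backend injected for testing."""
    return backend.create(model=model, messages=build_messages(prompt))
```

Once `call_llm("hi")` behaves as expected with the fake backend, swapping `FakeChatBackend` for `g4f.ChatCompletion` exercises the real path.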
Hello, this is both a feature request and a call for help.

I can't find the place where the API call is made, so that I could replace it with my own POST request. If you could point me to it, that would help a lot. I know it's unethical, but I'm from Russia and have no 'legal' access to the OpenAI API anyway. Please help; I really like the framework and want to use it with the g4f library. I'd totally understand if you don't want to be associated with it, though.
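For what it's worth, the HTTP request is made inside whatever LLM object you hand to the `Agent`, not in crewai itself, so the custom-LLM pattern shown above is the natural place to substitute your own POST. Here is a standard-library-only sketch of such a `_call` body; the endpoint URL is a placeholder for any OpenAI-compatible server, and `build_payload`/`post_chat` are hypothetical helper names, not crewai or langchain API.

```python
import json
import urllib.request


def build_payload(prompt: str, model: str = "gpt-3.5-turbo") -> dict:
    """OpenAI-style chat payload, same shape as the g4f call above."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def post_chat(prompt: str, url: str = "http://localhost:8080/v1/chat/completions") -> str:
    """POST the prompt to an OpenAI-compatible endpoint (URL is a placeholder)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # OpenAI-compatible servers return the reply under choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Dropping `post_chat` into the `_call` method of a custom `LLM` subclass like `G4FLanguageModel` above would route every agent request through your own endpoint.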