run-llama / llama-agents

MIT License
1.59k stars 156 forks

Why is OpenAILike not supported? #129

Open HAL9KKK opened 1 month ago

HAL9KKK commented 1 month ago

It always asks for the OpenAI API key!

llama-index-core-0.10.54 llama-agents-007

from llama_agents import (
    AgentService,
    AgentOrchestrator,
    ControlPlaneServer,
    LocalLauncher,
    SimpleMessageQueue,
)

from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai_like import OpenAILike
from llama_index.core import Settings

# create an agent
def get_the_secret_fact() -> str:
    """Returns the secret fact."""
    return "The secret fact is: A baby llama is called a 'Cria'."

tool = FunctionTool.from_defaults(fn=get_the_secret_fact)

llm = OpenAILike(
    api_key='pippo',
    base_url='http://localhost:1234/v1'
)

Settings.llm = llm

agent1 = ReActAgent.from_tools([tool], llm=llm)
agent2 = ReActAgent.from_tools([], llm=llm)

# create our multi-agent framework components
message_queue = SimpleMessageQueue(port=8000)
control_plane = ControlPlaneServer(
    message_queue=message_queue,
    orchestrator=AgentOrchestrator(llm=llm),
    port=8001,
)
agent_server_1 = AgentService(
    agent=agent1,
    message_queue=message_queue,
    description="Useful for getting the secret fact.",
    service_name="secret_fact_agent",
    port=8002,
)
agent_server_2 = AgentService(
    agent=agent2,
    message_queue=message_queue,
    description="Useful for getting random dumb facts.",
    service_name="dumb_fact_agent",
    port=8003,
)

# launch it
launcher = LocalLauncher([agent_server_1, agent_server_2], control_plane, message_queue)
result = launcher.launch_single("What is the secret fact?")

print(f"Result: {result}")

Output:

PS C:\Users\teiiamu\LLM\LLAMA-Agents> & C:/Users/teiiamu/AppData/Local/Programs/Python/Python311/python.exe c:/Users/teiiamu/LLM/LLAMA-Agents/llama_agents_basic.py
INFO:llama_agents.message_queues.simple - Consumer AgentService-9cc3f0bf-1c2e-42ac-b099-62e289db35e7: secret_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer AgentService-df714c57-20b5-49bc-909e-9a174b892818: dumb_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer d769f03a-e630-47cf-aa8c-13ddf26e5c51: human has been registered.
INFO:llama_agents.message_queues.simple - Consumer ControlPlaneServer-f277d14b-cc03-4d57-9f59-a5ff68756d13: control_plane has been registered.
INFO:llama_agents.services.agent - secret_fact_agent launch_local
INFO:llama_agents.services.agent - dumb_fact_agent launch_local
INFO:llama_agents.message_queues.base - Publishing message to 'control_plane' with action 'ActionTypes.NEW_TASK'
INFO:llama_agents.message_queues.simple - Launching message queue locally
INFO:llama_agents.message_queues.base - Publishing message to 'human' with action 'ActionTypes.COMPLETED_TASK'
INFO:llama_agents.message_queues.simple - Successfully published message 'control_plane' to consumer.
INFO:llama_agents.message_queues.simple - Successfully published message 'human' to consumer.
Result: An error occurred while running the tool: Error code: 401 - {'error': {'message': 'Incorrect API key provided: pippo. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
logan-markewich commented 1 month ago

@HAL9KKK you have a typo

It should be:

llm = OpenAILike(
    api_key='pippo',
    api_base='http://localhost:1234/v1'
)

I tested with Ollama and it worked fine:

Settings.llm = OpenAILike(
    api_key="fake",
    model="llama3:latest",
    api_base="http://localhost:11434/v1",
    timeout=120.0,
)
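For context on why the original snippet failed silently rather than raising an error: `OpenAILike` takes the endpoint as `api_base`, and an unrecognized keyword like `base_url` is simply absorbed rather than rejected, so the client keeps its default OpenAI endpoint and the placeholder key is refused there with a 401. The sketch below is an illustrative stdlib-only mock of that failure mode, not llama-index's actual implementation; the class and field names are hypothetical.

```python
# Illustrative sketch only: mimics how a client that tolerates extra
# keyword arguments can silently ignore a misspelled parameter.
# This is NOT llama-index code; FakeOpenAILikeClient is hypothetical.

DEFAULT_API_BASE = "https://api.openai.com/v1"

class FakeOpenAILikeClient:
    def __init__(self, api_key: str, api_base: str = DEFAULT_API_BASE, **extra_kwargs):
        self.api_key = api_key
        self.api_base = api_base          # only overridden when passed by its exact name
        self.extra_kwargs = extra_kwargs  # unknown keywords land here unnoticed

# Misspelled keyword: 'base_url' is swallowed, so the default endpoint is kept
# and requests still go to api.openai.com, producing the 401 above.
wrong = FakeOpenAILikeClient(api_key="pippo", base_url="http://localhost:1234/v1")
print(wrong.api_base)  # → https://api.openai.com/v1

# Correct keyword: requests would go to the local server.
right = FakeOpenAILikeClient(api_key="pippo", api_base="http://localhost:1234/v1")
print(right.api_base)  # → http://localhost:1234/v1
```

The same silent-absorption behavior is why the repro ran all the way to the tool call before failing: nothing about the constructor call itself was invalid, only the destination of the eventual HTTP request.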