meta-llama / llama-stack-apps

Agentic components of the Llama Stack APIs
MIT License

Custom Tool Call Not Working For Inflation Example #54

Open dawenxi-007 opened 3 months ago

dawenxi-007 commented 3 months ago

I am running the inflation.py example from the repo, expecting it to call the custom tool for the get_ticker_data function, which is defined in ticker_data.py under the custom_tools folder. However, based on the log, it didn't find the tool:

[stderr]
Traceback (most recent call last):
  line 145, in <module>
ModuleNotFoundError: No module named 'get_ticker_data'
[/stderr]
StepType.shield_call> No Violation
StepType.inference> The error message indicates that the `get_ticker_data` module is not found. This is because the `get_ticker_data` function is not a built-in Python function, and it's not available in the current environment.

To fix this issue, you can use the `yfinance` library to get the ticker data for Meta. Here's an updated code snippet that uses `yfinance` to get the ticker data:
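(The model's suggested snippet was cut off in the log. As a rough sketch of what such a `yfinance`-based replacement might look like: the function name mirrors the missing tool, the field choices are assumptions, and `yfinance` must be installed separately with `pip install yfinance`.)

```python
def get_ticker_data(symbol: str, period: str = "1y"):
    """Return closing prices for `symbol` over `period` as a list of floats."""
    # Imported lazily so this module still loads when yfinance is absent.
    import yfinance as yf

    history = yf.Ticker(symbol).history(period=period)
    return history["Close"].tolist()

# Example (requires network access):
#   closes = get_ticker_data("META")
```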
dawenxi-007 commented 3 months ago

Besides, it also reported: ModuleNotFoundError: No module named 'matplotlib'. It would be good to install matplotlib during the environment creation process, or have the agent install it automatically.
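(As a stopgap until the environment setup handles this, a small helper along these lines — hypothetical, not part of the repo — can install a missing package on first import. Note it assumes the pip package name matches the module name, which holds for matplotlib.)

```python
import importlib
import subprocess
import sys


def ensure(package: str):
    """Import `package`, installing it with pip first if it is missing."""
    try:
        return importlib.import_module(package)
    except ModuleNotFoundError:
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
        return importlib.import_module(package)

# e.g. mpl = ensure("matplotlib")
```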

yttbgf commented 3 months ago

Why do I have this issue? I found that self._messages_to_ollama_messages(request.messages) does not translate the Attachment object into JSON: Object of type Attachment is not JSON serializable.

Traceback (most recent call last):
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/distribution/server.py", line 174, in sse_generator
    async for item in event_gen:
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/agentic_system/meta_reference/agentic_system.py", line 152, in create_agentic_system_turn
    async for event in agent.create_and_execute_turn(request):
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/agentic_system/meta_reference/agent_instance.py", line 179, in create_and_execute_turn
    async for chunk in self.run(
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/agentic_system/meta_reference/agent_instance.py", line 313, in run
    async for res in self._run(
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/agentic_system/meta_reference/agent_instance.py", line 388, in _run
    async for chunk in self.inference_api.chat_completion(req):
  File "/home/tbgf/.local/lib/python3.10/site-packages/llama_toolchain/inference/ollama/ollama.py", line 174, in chat_completion
    async for chunk in stream:
  File "/home/tbgf/.local/lib/python3.10/site-packages/ollama/_client.py", line 494, in inner
    async with self._client.stream(method, url, **kwargs) as r:
  File "/home/tbgf/anaconda3/envs/ollama/lib/python3.10/contextlib.py", line 199, in __aenter__
    return await anext(self.gen)
  File "/home/tbgf/.local/lib/python3.10/site-packages/httpx/_client.py", line 1615, in stream
    request = self.build_request(
  File "/home/tbgf/.local/lib/python3.10/site-packages/httpx/_client.py", line 358, in build_request
    return Request(
  File "/home/tbgf/.local/lib/python3.10/site-packages/httpx/_models.py", line 342, in __init__
    headers, stream = encode_request(
  File "/home/tbgf/.local/lib/python3.10/site-packages/httpx/_content.py", line 214, in encode_request
    return encode_json(json)
  File "/home/tbgf/.local/lib/python3.10/site-packages/httpx/_content.py", line 177, in encode_json
    body = json_dumps(json).encode("utf-8")
  File "/home/tbgf/anaconda3/envs/ollama/lib/python3.10/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
  File "/home/tbgf/anaconda3/envs/ollama/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/home/tbgf/anaconda3/envs/ollama/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/home/tbgf/anaconda3/envs/ollama/lib/python3.10/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type Attachment is not JSON serializable
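(For context, the general way around this class of error is to give `json.dumps` a `default` hook that converts non-serializable message objects into plain dicts before they reach httpx. This is only a sketch of the mechanism, not the fix llama_toolchain would ship; the `Attachment` dataclass here is a stand-in with made-up fields.)

```python
import json
from dataclasses import asdict, dataclass, is_dataclass


@dataclass
class Attachment:
    """Stand-in for llama_toolchain's Attachment type (hypothetical fields)."""
    content: str
    mime_type: str


def to_jsonable(obj):
    """`default` hook for json.dumps: turn dataclass instances into dicts."""
    if is_dataclass(obj):
        return asdict(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")


# Without default=to_jsonable this raises the TypeError seen in the traceback.
payload = json.dumps(
    {"role": "user", "content": [Attachment("report.csv", "text/csv")]},
    default=to_jsonable,
)
```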