Error due to a missing package:
File "/opt/homebrew/Caskroom/miniforge/base/envs/la/lib/python3.11/site-packages/llama_index/core/readers/file/base.py", line 67, in _try_loading_included_file_formats
raise ImportError("`llama-index-readers-file` package not found")
ImportError: `llama-index-readers-file` package not found
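This one should go away after installing the extra package that the error names, e.g.:
pip install llama-index-readers-file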
Error during inference:
File "/opt/homebrew/Caskroom/miniforge/base/envs/la/lib/python3.11/site-packages/llama_cpp/llama_chat_format.py", line 289, in _convert_text_completion_chunks_to_chat
for i, chunk in enumerate(chunks):
File "/opt/homebrew/Caskroom/miniforge/base/envs/la/lib/python3.11/site-packages/llama_cpp/llama.py", line 1269, in _create_completion
raise ValueError(
ValueError: Requested tokens (2387) exceed context window of 2048
File: sockets.txt
Question: Is socket supported in AnyLearning?
Fix:
This works once I update all context length settings from 2048 to 4096 (sketched below). The answer was based on the content of the text file. Good job!
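For reference, a minimal sketch of what the change boils down to when the model is loaded through llama-cpp-python directly; the model path is only a placeholder:

from llama_cpp import Llama

# Raising n_ctx from 2048 to 4096 leaves room for the prompt
# (retrieved file content + question) plus the generated answer.
llm = Llama(model_path="models/model.gguf", n_ctx=4096)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Is socket supported in AnyLearning?"}],
)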
TODO: Make the context length configurable (rough sketch below).
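One way this could look, assuming a simple environment-variable override (the variable name is hypothetical, not part of the current code):

import os

from llama_cpp import Llama

# Hypothetical setting: fall back to 4096 when the user does not override it.
CONTEXT_LENGTH = int(os.environ.get("LLAMA_ASSISTANT_CONTEXT_LENGTH", "4096"))

llm = Llama(model_path="models/model.gguf", n_ctx=CONTEXT_LENGTH)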
Implementing https://github.com/nrl-ai/llama-assistant/issues/13.
LGTM.