jakvb closed this issue 1 year ago
I definitely like this better than my hack. :-) But can you update requirements.txt, and also the README.md, with instructions on how to set up llama.cpp to work with the Python bindings?
It remains stuck in a loop; see the attached text file for the full log. Here is a preview of the kind of output it produces, which looks to me like a text formatting problem: TASK LIST
• \n Take into account these previously completed tasks: [\\'\\\\\\\\n Take into account these previously completed tasks: []\\\\\\\\n.\\\\\\\\n Your task: Develop a task list\\\\\\\\nResponse:\\\\\\\\n 100\\\\\\\\n\\\\\\\\n Task List:\\\\\\\\n 1. Create a sustainable food production system that can produce enough food to feed the entire population of the world, including developing countries and remote areas. This will involve using new technologies such as vertical farming and hydroponics, as well as traditional farming methods.\\\\\\\\n 2. Implement policies that encourage people to eat more plant-based diets, which are better for the environment and can help reduce food waste. This could include subsidies for farmers who grow fruits and vegetables, and education campaigns to promote the benefits of\\\\'}.", \\\\' This result was based on this task description: Develop a task list. These are incomplete tasks: .\\\\', \\\\' Based on the result, create new tasks to be completed by the AI system that do not overlap with incomplete tasks.\\\\', \\\\' Return the tasks as an array.\\\\', \\\\' \\\\', \\\\' """\\\\', \\\\' # Get the list of all completed tasks and their results\\\\', \\\\' completed_tasks = get_completed_tasks()\\\\', \\\\' \\\\', \\\\' # Create a new task list\\\\', \\\\' new_tasks = []\\\\', \\\\' \\\\', \\\\' # Iterate through each completed task\\\\', \\\\' for task in completed_tasks:\\\\', \\\\' # Get the name of the last completed task\\\\', \\\\' last_task = task.split(\'\n\\\\')[-1]\\\\', \\\\' # Get the result of the last completed task as a dictionary object\\\\', \\\\' last_result = eval(last_task) or {}\\\\', \\\\' # Extract the objective and tasks from the result object\\\\', \\\\'
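For what it's worth, the escalating backslashes in that preview look like what happens when a result string gets re-quoted on every loop pass. This is just a sketch of the mechanism, not the project's actual code:

```python
def requote(text: str, passes: int) -> str:
    # Simulate a naive prompt builder that stringifies the previous
    # result with repr() on each loop iteration: quote characters get
    # escaped, then the escape backslashes themselves get escaped, and
    # the nesting compounds every pass.
    for _ in range(passes):
        text = repr(text)
    return text

sample = requote("Take into account these previously completed tasks: []", 4)
print(sample)  # quotes and backslashes multiply with every pass
```

If something like this is happening, the fix would be to pass the raw completion text back into the next prompt instead of a re-serialized copy.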
Never mind, I should not read stuff before I drink my coffee. I'll look into merging this tonight.
Resolved the conflicts and will merge now. This will temporarily continue to use OpenAI embeddings while I figure out a quick way to resolve this.
Use the llama.cpp-compatible Python binding library: https://github.com/abetlen/llama-cpp-python
Install from PyPI:
pip install llama-cpp-python
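Once installed, usage is roughly like this. A minimal sketch, assuming you already have a ggml model file converted with llama.cpp's tooling; the model path below is a placeholder, not a file from this repository:

```python
# Placeholder path -- point this at your own converted ggml model file.
MODEL_PATH = "./models/7B/ggml-model.bin"

def build_prompt(objective: str) -> str:
    # llama.cpp models take a raw text prompt, not chat messages.
    return f"Q: {objective}\nA:"

if __name__ == "__main__":
    # Imported lazily so the helper above works without the package installed.
    from llama_cpp import Llama

    llm = Llama(model_path=MODEL_PATH)
    result = llm(build_prompt("List three tasks."), max_tokens=64, stop=["Q:"])
    # The response mirrors the OpenAI completion shape:
    # the generated text is under choices[0]["text"].
    print(result["choices"][0]["text"].strip())
```

Swapping this in for the OpenAI completion calls should mostly be a matter of adapting the prompt string and reading the same `choices[0]["text"]` field.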