yoheinakajima / babyagi

https://babyagi.org/

How to see/browse results? #281

Closed alexl83 closed 1 month ago

alexl83 commented 1 year ago

Hi! I have a task that babyagi supposedly completed using a LLaMA LLM. Is there a way I can visualize the task results? I tried tools/results.py and tools/results_browser.py, all to no avail.

error 1 (fixed by declaring the variable TABLE_NAME in .env):

Traceback (most recent call last):
  File "/home/alex/AI/babyagi/tools/results_browser.py", line 23, in <module>
    assert PINECONE_TABLE_NAME, "TABLE_NAME environment variable is missing from .env"
AssertionError: TABLE_NAME environment variable is missing from .env
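
For reference, the fix mentioned above amounts to adding the variable to .env; the value below is just a placeholder for whatever table/index name your run actually wrote to:

# .env - placeholder value; use your run's actual Pinecone table/index name
TABLE_NAME=babyagi-test-table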

error 2:

Traceback (most recent call last):
  File "/home/alex/AI/babyagi/tools/results_browser.py", line 116, in <module>
    curses.wrapper(main)
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/curses/__init__.py", line 94, in wrapper
    return func(stdscr, *args, **kwds)
  File "/home/alex/AI/babyagi/tools/results_browser.py", line 94, in main
    retrieved_tasks = query_records(index, get_ada_embedding(objective))
  File "/home/alex/AI/babyagi/tools/results_browser.py", line 32, in get_ada_embedding
    return openai.Embedding.create(input=[text], model="text-embedding-ada-002")["data"][0]["embedding"]
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 106, in __prepare_create_request
    requestor = api_requestor.APIRequestor(
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/site-packages/openai/api_requestor.py", line 130, in __init__
    self.api_key = key or util.default_api_key()
  File "/home/alex/oobabooga/installer_files/conda/envs/bAGI/lib/python3.10/site-packages/openai/util.py", line 186, in default_api_key
    raise openai.error.AuthenticationError(
openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://onboard.openai.com for details, or email support@openai.com if you have any questions.

Thank you! Ale

hoangdd commented 1 year ago

This error is caused by missing OpenAI configuration code: the script never sets an API key.

#290
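
For anyone hitting the same AuthenticationError, a minimal sketch of the missing configuration, assuming the key lives in .env and is loaded with python-dotenv (both assumptions on my part), looks like this:

import os
import openai
from dotenv import load_dotenv

load_dotenv()  # pull OPENAI_API_KEY (and TABLE_NAME) from .env
openai.api_key = os.getenv("OPENAI_API_KEY", "")
assert openai.api_key, "OPENAI_API_KEY environment variable is missing from .env"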

mjdyibrahim commented 1 year ago

Did you manage to see the results of the tasks?

alexl83 commented 1 year ago

Nope. It seems that due to LLaMA bugs, the prioritization agent drops babyagi out of the loop - I hope it gets sorted soon so I can experiment :)

mjdyibrahim commented 1 year ago

Is there any way to extract even the prompt flow from an instance? The content that was displayed in the prompt - is there any way to see that after a session is closed?

vmajor commented 1 year ago

Is there any progress with this? Seeing "Done" is all well and good, but I'd really like to know what it's done.

letsgoduke commented 1 year ago

Any way to get results out of Pinecone? I'm completely new to vector DBs.
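
Not an official answer, but results_browser.py essentially queries the same Pinecone index the main loop writes to. A rough sketch using the pre-2023 pinecone-client API (the environment default and the metadata key names are assumptions based on what babyagi appears to store):

import os
import openai
import pinecone
from dotenv import load_dotenv

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")
pinecone.init(api_key=os.getenv("PINECONE_API_KEY"),
              environment=os.getenv("PINECONE_ENVIRONMENT", "us-east1-gcp"))

# Connect to the same table/index the main loop wrote to
index = pinecone.Index(os.getenv("TABLE_NAME"))

# Embed the same OBJECTIVE string the run used, then fetch the closest stored results
embedding = openai.Embedding.create(
    input=["<your objective here>"], model="text-embedding-ada-002"
)["data"][0]["embedding"]

results = index.query(vector=embedding, top_k=10, include_metadata=True)
for match in results.matches:
    # "task" and "result" are the metadata keys babyagi seems to attach to each vector
    print(match.id, match.metadata["task"], match.metadata["result"])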

xnoubis commented 1 year ago

Indeed, seeing the results and direction of the tasks it proposes, accomplishes, and completes is needed to the point of seeming obvious. I have it perform 'heuristics' and apply its own rules of thumb to find the best chance of solving problems without fully understanding them, using available resources. As it does this, what it discovers, generates, and proposes reveals a hoard of data needed for further exploration and understanding.

noxiouscardiumdimidium commented 1 year ago

@xnoubis @letsgoduke @vmajor @elpreneurAbdo On Linux you can launch it with a quick and dirty fix like ./babyagi.py | tee -a ./dump.txt to create a dump file that catches everything it outputs after every completed step. I created a ~20K-line dump file today creating story snippets to use as LLM training/fine-tuning data. It never managed to write me an entire book, but it did write me hundreds of awesome scenes xD