pgalko / BambooAI

A lightweight library that leverages Language Models (LLMs) to enable natural language interactions, allowing you to source and converse with data.
MIT License

Can not import `bambooai` without setting `OPENAI_API_KEY` #11

Closed · pnmartinez closed this issue 2 months ago

pnmartinez commented 2 months ago

Problem

Recent versions of the openai library require OPENAI_API_KEY to be set when openai.OpenAI() is instantiated.

This cascades down to bambooai on the following line (see traceback below): https://github.com/pgalko/BambooAI/blob/c292bda57d958431688c21cd62479e1ca56ec295/bambooai/google_search.py#L10

Python 3.11.0 (main, Oct 24 2022, 18:26:48) [MSC v.1933 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import bambooai
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\USER\AppData\Roaming\Python\Python311\site-packages\bambooai\__init__.py", line 2, in <module>
    from .bambooai import BambooAI
  File "C:\Users\USER\AppData\Roaming\Python\Python311\site-packages\bambooai\bambooai.py", line 15, in <module>
    from . import models, prompts, func_calls, qa_retrieval, google_search, reg_ex, log_manager, output_manager, utils
  File "C:\Users\USER\AppData\Roaming\Python\Python311\site-packages\bambooai\google_search.py", line 10, in <module>
    openai_client = openai.OpenAI()
                    ^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Roaming\Python\Python311\site-packages\openai\_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

Suggested solution

Since google_search.py may not be needed in every run, maybe we can do a conditional import.

I will try to implement a fix over the weekend, following best practices (conditional imports are usually not considered one); a rough sketch of an alternative is below.
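
A minimal sketch of one possible approach, assuming lazy initialization of the client instead of a conditional import. The helper name `_get_openai_client` and the error message are illustrative, not the fix that was actually pushed:

```python
# google_search.py -- illustrative sketch only, not the actual fix.
# Idea: defer creating the OpenAI client until it is first needed, so that
# `import bambooai` succeeds even when OPENAI_API_KEY is not set.
import os
import openai

_openai_client = None  # created lazily on first use


def _get_openai_client():
    """Return a cached OpenAI client, creating it on first call."""
    global _openai_client
    if _openai_client is None:
        api_key = os.environ.get("OPENAI_API_KEY")
        if not api_key:
            # Fail only when the search feature is actually used,
            # not at import time.
            raise RuntimeError(
                "OPENAI_API_KEY is not set; it is required for google_search."
            )
        _openai_client = openai.OpenAI(api_key=api_key)
    return _openai_client
```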

pgalko commented 2 months ago

My bad, had a similar issue with Groq. Will fix it and push shortly.

pgalko commented 2 months ago

Should be fixed now :-)

pnmartinez commented 2 months ago

Awesome! I will test it and close the issue after that.

Let me take the opportunity to say that I am really open to collaborating on this project, allocating some weekly hours to it if you think that would help. I could tackle minor issues like this one, but let me know if you are interested in coordinating efforts!

pgalko commented 2 months ago

> Awesome! I will test it and close the issue after that.
>
> Let me take the opportunity to say that I am really open to collaborating on this project, allocating some weekly hours to it if you think that would help. I could tackle minor issues like this one, but let me know if you are interested in coordinating efforts!

I would very much like that! The project might have some value for the community, but there is only so much one person can do. It has been stagnant for the last 6 months while I worked on a different project, and I have only just resumed development. The output formatting really needs some work, and I could use some help there if that is something that would interest you. Even a basic Flask app or Streamlit front end, as some people have requested, would go a long way. With some smart logic we could possibly consolidate all LLM output in the print_wrapper() function of the OutputManager class and serve it conditionally to the front end. Currently all LLM responses (and some other output) are printed via this function, so maybe not too much work would be required.
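
Roughly what I have in mind, as a sketch only: print_wrapper() stays the single funnel, but writes to a pluggable destination. The `sink` parameter and the default-to-console behaviour are assumptions for illustration, not the current BambooAI code:

```python
# Illustrative sketch only -- not the current BambooAI implementation.
from typing import Callable, Optional


class OutputManager:
    """Single funnel for LLM output; the sink decides where it goes."""

    def __init__(self, sink: Optional[Callable[[str], None]] = None):
        # `sink` is hypothetical: any callable that accepts a string.
        # Default to the console so existing behaviour is unchanged.
        self._sink = sink or print

    def print_wrapper(self, message: str) -> None:
        # All LLM responses (and some other output) pass through here,
        # so swapping the sink redirects everything to a front end.
        self._sink(message)


# Console usage (current behaviour):
OutputManager().print_wrapper("Hello from the model")

# Front-end usage: e.g. collect messages for a Flask/Streamlit view.
messages = []
OutputManager(sink=messages.append).print_wrapper("Hello from the model")
```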