Closed pnmartinez closed 2 months ago
My bad, had a similar issue with Groq. Will fix it and push shortly.
Should be fixed now :-)
Awesome! I will test it and close after that.
Let me take the opportunity to say that I am really open to collaborate in this project, allocating some weekly hours to it if you think that should help. I could tackle minor issues like this, but let me know if you are interested in coordinating efforts!
I would very much like that! The project might have some value for the community, but there is only so much one person can do. It has been stagnant for the last 6 months, due to me working on a different project, and I have only just resumed development. The output formatting really needs some work and could use some help, if that is something that would interest you. Even a basic Flask app or Streamlit front end would go a long way, as requested by some people.
With some smart logic we could possibly consolidate all LLM output in the `print_wrapper()` function of the `OutputManager` class and serve it conditionally to the front end. Currently all LLM responses (and some other stuff) are printed out via this function, so maybe not too much work would be required.
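The consolidation idea above could be sketched roughly like this. This is a minimal sketch, not the current BambooAI code: the `web_mode` flag, the buffer, and the `flush_to_frontend()` helper are all assumptions about how conditional serving might work, with only the `OutputManager` / `print_wrapper()` names taken from the discussion.

```python
class OutputManager:
    """Sketch: route all LLM output through one place, console or front end."""

    def __init__(self, web_mode=False):
        self.web_mode = web_mode  # True when a Flask/Streamlit front end is attached (assumption)
        self.buffer = []          # consolidated LLM output awaiting the front end

    def print_wrapper(self, message, end="\n"):
        # All LLM responses funnel through here, so one conditional
        # is enough to redirect everything to a front end.
        if self.web_mode:
            self.buffer.append(message)   # hold for the front end to fetch
        else:
            print(message, end=end)       # normal console behaviour

    def flush_to_frontend(self):
        # Hypothetical helper a web view could poll for new output.
        chunks, self.buffer = self.buffer, []
        return chunks
```

A front end would construct the manager with `web_mode=True` and poll `flush_to_frontend()`, while CLI use keeps printing exactly as it does today.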
Problem

Recent versions of the `openai` lib require the `OPENAI_API_KEY` on `openai.OpenAI()` init. This cascades down to `bambooai` on the following line (see traceback below): https://github.com/pgalko/BambooAI/blob/c292bda57d958431688c21cd62479e1ca56ec295/bambooai/google_search.py#L10

Suggested solution

Since `google_search.py` may not be needed, maybe we can do a conditional import. I will try to implement a fix on the weekend, following best practices (conditional imports usually are not one).