-
Hi,
How do I set the model to `gpt-3.5-turbo`, and how do I configure the OpenAI API key?
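A minimal sketch of the usual pattern, assuming the official `openai` Python package; the key value and message are placeholders:

```python
import os

# Assumption: the API key is supplied via an environment variable,
# which OpenAI-compatible clients typically read automatically.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

# The model is just a string parameter on the request:
request_kwargs = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# With the openai v1 client this would be sent as (requires network,
# so it is left commented out here):
# from openai import OpenAI
# client = OpenAI()  # picks up OPENAI_API_KEY from the environment
# response = client.chat.completions.create(**request_kwargs)
```

The environment-variable route is usually preferable to hard-coding the key, since it keeps secrets out of source control.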
-
```
Traceback (most recent call last):
  File "/home/Group/gf/src/tool_learning/codeinterpreter-api/examples/show_bitcoin_chart.py", line 28, in <module>
    asyncio.run(main())
  File "/root/miniconda3/envs/l…
```
-
There are a lot of different LLM deployment providers. How do I easily replace my OpenAI base URL with their URL as a proxy? - https://github.com/petals-infra/chat.petals.dev/issues/20, https://www.banana.d…
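A sketch of the two common ways to point an OpenAI-style client at another provider's endpoint; the proxy URL here is a hypothetical placeholder, and the actual network call is left commented out:

```python
import os

# Assumption: a hypothetical OpenAI-compatible proxy; substitute your
# provider's actual URL.
PROXY_BASE_URL = "https://my-llm-proxy.example.com/v1"

# Option 1: environment variable, read by the legacy openai SDK and by
# many wrapper libraries (e.g. LangChain):
os.environ["OPENAI_API_BASE"] = PROXY_BASE_URL

# Option 2: pass it explicitly to the openai v1 client (requires the
# openai package and network access, so commented out):
# from openai import OpenAI
# client = OpenAI(base_url=PROXY_BASE_URL,
#                 api_key="whatever-the-proxy-accepts")
```

Whether the proxy honors the same request schema (models, streaming, headers) depends on the provider, so this only covers the client-side switch.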
-
I cannot seem to save the images once they're generated. Only `show_image()` seems to be available. Is there any way to save the PNG locally once it has been generated?
Thanks, this is awesome work!!
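I'm not certain of codeinterpreterapi's exact File interface, but if the returned object exposes its raw bytes, saving locally is just a binary write; the `file.content` attribute mentioned in the comment is an assumption, and the demo payload below is a stand-in:

```python
import os
import tempfile

def save_png(image_bytes: bytes, path: str) -> str:
    """Write raw PNG bytes to `path` and return the path."""
    with open(path, "wb") as f:
        f.write(image_bytes)
    return path

# Demo with a tiny stand-in payload. In practice the bytes would come
# from the session output (e.g. something like `file.content`, an
# assumed attribute name):
demo_bytes = b"\x89PNG\r\n\x1a\n"  # PNG magic header only, for illustration
out_path = save_png(demo_bytes, os.path.join(tempfile.gettempdir(), "chart.png"))
```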
-
https://platform.openai.com/assistants
-
For example, could you pass the function in as a local variable? Below is more info on what I'm trying to do.
prompt:
```
get smiles of abemaciclib and calculate its molecular weight using …
-
The OpenAI.Assistants unit only works if the request headers include `OpenAI-Beta: assistants=v1`. In this case, you must modify the `GetHeader` method of the `TOpenAIAPI` class as follows:
```
function …
```
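For comparison, the same beta opt-in header can be built in Python; the header name and value come from the snippet above, the key is a placeholder, and the request itself is only sketched:

```python
# The Assistants beta endpoints require this extra header; name and
# value are quoted from the report above.
headers = {
    "Authorization": "Bearer sk-...",   # placeholder key
    "OpenAI-Beta": "assistants=v1",
}

# A request would then be sent with these headers, e.g. via urllib
# (network call omitted):
# import urllib.request
# req = urllib.request.Request(
#     "https://api.openai.com/v1/assistants", headers=headers)
```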
-
### Describe the bug
The interpreter throws an error about a network connection failure when running in local mode, disconnected from the internet. It throws errors twice: first, even before asking for mod…
-
I have tested this with a local LLM using Ollama, but I could not get it working. The other issue is that when I run `streamlit run main.py`, the Streamlit app opens in one tab and the React app…
-
**After installing all required packages, I'm running the following code:**
```python
from codeinterpreterapi import CodeInterpreterSession, File

async def main():
    # context manager for …
```