-
Hello everyone, I can run `python main.py --use_cached_query --visualize`, but I can't run `python main.py --visualize`. My api_key works. Here is my error:
Traceback (most recent call last):
File "C…
-
Hello! Trying to pull 5,000 responses of the same prompt from GPT-4o mini and repeatedly getting this error. The code worked fine previously; no network changes or disruptions on my end. Thanks!
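In case it helps narrow things down, here is a minimal retry sketch (assuming the official `openai` Python client and the `gpt-4o-mini` model id; the backoff value is arbitrary) for pulling many completions of the same prompt, since transient errors at this volume are most often rate limits:

```python
import time
from openai import OpenAI, APIError, RateLimitError

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def collect_responses(prompt: str, n_total: int = 5000) -> list[str]:
    """Request the same prompt repeatedly, backing off on transient errors."""
    responses: list[str] = []
    while len(responses) < n_total:
        try:
            completion = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[{"role": "user", "content": prompt}],
            )
            responses.append(completion.choices[0].message.content)
        except (RateLimitError, APIError):
            # Back off and retry; if this never recovers, the failure is not transient.
            time.sleep(5)
    return responses
```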
-
Hello,
great idea for the plugin!
But to make our lives easier, would it be possible to define the OpenAI API key in a file somewhere instead of an environment variable?
Thank you
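For context, a pattern that might work here (just a sketch, not the plugin's actual config; the file path is made up for illustration) is to prefer a key stored in a plain-text file and fall back to the environment variable:

```python
import os
from pathlib import Path

def load_openai_api_key(key_file: str = "~/.openai_api_key") -> str | None:
    """Prefer a key stored in a plain-text file, fall back to OPENAI_API_KEY."""
    path = Path(key_file).expanduser()
    if path.is_file():
        return path.read_text().strip()
    return os.environ.get("OPENAI_API_KEY")
```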
-
Using the `Embedding.New` function for input texts containing control characters (U+0000 - U+001F and U+007F - U+009F) results in the following error:
```
POST "https://api.openai.com/v1/embeddings"…
-
Issue is WIP and will be further refined.
LLM and embedding model sources are currently defined via GUCs, e.g. `vectorize.openai_service_url = https://api.openai.com/v1` contains the base URL for Ope…
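For reference, a minimal sketch of setting that GUC per session (assuming the `psycopg` driver; the connection string and target URL are placeholders) before calling any vectorize functions:

```python
import psycopg

with psycopg.connect("dbname=postgres") as conn:
    with conn.cursor() as cur:
        # set_config(name, value, is_local=false) sets the GUC for this session.
        cur.execute(
            "SELECT set_config('vectorize.openai_service_url', %s, false)",
            ("https://api.openai.com/v1",),
        )
```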
-
Could you please add a config option for a custom website / API path that is OpenAI-compatible?
Thank you.
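For what it's worth, with the official `openai` Python client this mostly comes down to making the base URL overridable (a sketch; the URL below is just an example of an OpenAI-compatible endpoint and the key is a placeholder):

```python
from openai import OpenAI

# Point the client at any OpenAI-compatible server by overriding the base URL.
client = OpenAI(
    base_url="https://my-proxy.example.com/v1",  # example custom endpoint
    api_key="sk-placeholder",
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```

Recent versions of the client also read an `OPENAI_BASE_URL` environment variable, which may already be enough to cover this request.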
-
LocalAI has an OpenAI-compatible API. Two ways to support this as a provider:
1. Add a provider for LocalAI.
2. Add the following options to the OpenAI provider:
1. Model name would have to be…
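Since LocalAI speaks the OpenAI wire protocol, the model names for option 2 could be discovered from the endpoint itself (a sketch using the `openai` Python client; `http://localhost:8080/v1` is LocalAI's default address and the key is a dummy value):

```python
from openai import OpenAI

# LocalAI exposes the OpenAI-compatible API, including the /v1/models listing.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

for model in client.models.list():
    print(model.id)
```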
-
### 🐛 Describe the bug
My current code:
```js
import { RAGApplicationBuilder, LocalPathLoader } from '@llm-tools/embedjs';
import { OpenAiEmbeddings } from '@llm-tools/embedjs-openai';
import { …
```
-
### Issue
I have installed aider (version 0.59.1) on Ubuntu.
CPython 3.10.12
I followed the installation manual and exported the following variables:
export AZURE_API_KEY=mykey
expo…
-
When trying to follow the introduction docs, using GenAIScript with any Ollama model fails.
This seems to stem from two issues:
1. Invalid calls are being made to the Ollama host; it is not providing the co…
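As a sanity check that the Ollama host itself responds in the expected shape, a direct call to its native REST API looks like this (a sketch; `llama3.2` is just an example model tag and `localhost:11434` is Ollama's default address):

```python
import requests

# Ollama's native generate endpoint; stream=False returns a single JSON object.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.2", "prompt": "Say hello", "stream": False},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Ollama also exposes an OpenAI-compatible endpoint under `http://localhost:11434/v1`, which is typically what OpenAI-style integrations point at.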