-
**Bug description**
The Tika Document Reader dependency causes a response type exception.
**Environment**
Spring Boot version: 3.3.4
Spring AI Version: 1.0.0-M2
Java Version: OpenJDK 22
**Steps t…
-
```python
import os
import yaml
from loguru import logger
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_community.chat_models import ChatLiteLLMRouter
import li…
-
Currently, the llm function in [ai/api/llms.py](https://github.com/briefercloud/briefer/blob/main/ai/api/llms.py) is created with ChatOpenAI from LangChain and uses a fixed OpenAI endpoint.
To ma…
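A minimal sketch of the kind of change being asked for, assuming the `langchain_openai` package: read the endpoint, model, and key from environment variables instead of hard-coding the OpenAI URL, so any OpenAI-compatible provider can be targeted. The `create_llm` helper and the `OPENAI_BASE_URL` / `MODEL_NAME` variable names are illustrative, not Briefer's actual settings.

```python
import os

from langchain_openai import ChatOpenAI


def create_llm() -> ChatOpenAI:
    # Hypothetical helper: pull the endpoint and model from the environment
    # so the same code can point at OpenAI or any OpenAI-compatible API.
    return ChatOpenAI(
        model=os.getenv("MODEL_NAME", "gpt-4o-mini"),
        base_url=os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1"),
        api_key=os.getenv("OPENAI_API_KEY"),
    )
```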
-
## Problem
I'm juggling a lot of model providers (Groq, SambaNova, Cerebras, ...), but when I have to choose the API provider in the Aider-composer parameters, there is only one place to set my en…
-
The OpenAI usage object includes the following information, but only the Anthropic services store cached token details:
```
"usage": {
"prompt_tokens": 2006,
"completion_tokens": 300,
"total_tokens"…
-
$c[The command must be in BDScript 2.]
$c[Replace APIKEY below with your OpenAI key. Watch this video to learn how to get your key: https://youtu.be/VFBzh4YPOpU]
$var[key;APIKEY]
$nomention
$…
-
### Describe the bug
Running the code at [A streaming example using openai](https://www.gradio.app/guides/creating-a-chatbot-fast#a-streaming-example-using-openai) and typing anything and pressing se…
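For reference, a minimal sketch of a streaming OpenAI chatbot with `gr.ChatInterface` (not the exact code from the guide), assuming the openai-style `type="messages"` history format and an illustrative model name:

```python
import gradio as gr
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def predict(message, history):
    # With type="messages", history is already a list of
    # {"role": ..., "content": ...} dicts in the chat-completions format.
    messages = history + [{"role": "user", "content": message}]
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=messages,
        stream=True,
    )
    partial = ""
    for chunk in stream:
        partial += chunk.choices[0].delta.content or ""
        yield partial  # yielding progressively streams tokens into the UI


gr.ChatInterface(predict, type="messages").launch()
```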
-
https://openai.com/index/mle-bench/
-
The core focus of the LLMWare project is building pipelines with small open-source models, but we strongly believe that the project should be open to and supportive of all model providers (small, large, …
-
## Use case
Ability to support Azure OpenAI in addition to the other AI models we already support.
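A minimal sketch of what Azure OpenAI support could look like with the official `openai` Python SDK, which already ships an `AzureOpenAI` client: use the Azure client when Azure settings are present and fall back to the standard client otherwise. The `build_client` helper and the environment variable names are illustrative, not settings this project defines.

```python
import os

from openai import AzureOpenAI, OpenAI


def build_client():
    # Hypothetical factory: use Azure OpenAI when an endpoint is configured,
    # otherwise fall back to the standard OpenAI client.
    endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
    if endpoint:
        return AzureOpenAI(
            azure_endpoint=endpoint,
            api_key=os.getenv("AZURE_OPENAI_API_KEY"),
            api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2024-06-01"),
        )
    return OpenAI()


# Note: with Azure, the `model` argument passed to chat.completions.create
# refers to the deployment name rather than the raw model name.
```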