-
Tried the previous solution:
`pip uninstall openui`
`pip install .`
The issue is still the same, and now even the Ollama integration is not working.
Also, `GROQ_BASE_URL` and `GROQ_API_KEY` make no sense, b…
-
```js
export const keys = {
  groq: '',
  ollama: 'http://localhost:11434/api/chat',
  openai: ''
};
```
-
**Describe the bug**
I am not 100% sure whether this qualifies as a bug report or should be a feature request. Currently, we cannot use a wildcard (`*`) import to import fragments of a query.
…
-
### Project Name
Chat your Medical PDFs
### Description
This project is a Streamlit-based web application designed to assist medical students in their research by providing an interactive way to qu…
-
**Is your feature request related to a problem? Please describe.**
The current implementation supports only Azure OpenAI as the LLM provider. It lacks the flexibility to support other LLM providers s…
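For illustration, a rough sketch of the kind of provider abstraction this would require (interface and registry names are made up here, not the project's actual design):

```ts
// Hypothetical sketch: a common chat interface with interchangeable backends.
// The application would depend only on ChatProvider and select a concrete
// implementation (Azure OpenAI, Ollama, Groq, ...) from configuration.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

interface ChatProvider {
  readonly name: string;
  chat(messages: ChatMessage[]): Promise<string>;
}

// Registry keyed by a config value such as LLM_PROVIDER=azure-openai.
const providers: Record<string, () => ChatProvider> = {};

export function registerProvider(key: string, factory: () => ChatProvider): void {
  providers[key] = factory;
}

export function createProvider(key: string): ChatProvider {
  const factory = providers[key];
  if (!factory) throw new Error(`Unknown LLM provider: ${key}`);
  return factory();
}
```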
-
Groq does not support some of the fields that are set in the request body (see https://console.groq.com/docs/openai).
The openai JS library automatically adds some of these fields, resulting in 400 errors.
…
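One possible workaround, sketched on the assumption that the offending fields can simply be stripped before the request leaves the client (the field names below are examples of what Groq's docs list as unsupported, not an exhaustive list):

```ts
import OpenAI from 'openai';

// Hypothetical sketch: point the openai client at Groq's OpenAI-compatible
// endpoint and delete fields Groq rejects from the outgoing request body.
const UNSUPPORTED_FIELDS = ['logprobs', 'logit_bias', 'top_logprobs'];

const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: 'https://api.groq.com/openai/v1',
  fetch: async (url, init) => {
    if (init?.body && typeof init.body === 'string') {
      const body = JSON.parse(init.body);
      for (const field of UNSUPPORTED_FIELDS) delete body[field];
      init = { ...init, body: JSON.stringify(body) };
    }
    return fetch(url, init);
  },
});
```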
-
## Goal
> Note: This Epic has changed multiple times as our architecture has also changed.
> Many of the early comments refer to a different context,
> e.g. "Provider Abstraction" in Jan.
…
-
Currently, the LLM uses function calling to render specific widgets. For some widgets, such as charting, a stock is specified; others, like the screener, take no input.
It would be great if the LLM c…
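To make the current behavior concrete, here is a hypothetical sketch of the two widgets as OpenAI-style function definitions: the charting widget takes a stock symbol, while the screener takes no input. Names and schemas are illustrative, not the project's actual definitions.

```ts
// Hypothetical tool definitions for the two widgets described above.
const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'render_chart',
      description: 'Render a price chart for a single stock.',
      parameters: {
        type: 'object',
        properties: {
          symbol: { type: 'string', description: 'Ticker symbol, e.g. AAPL' },
        },
        required: ['symbol'],
      },
    },
  },
  {
    type: 'function' as const,
    function: {
      name: 'render_screener',
      description: 'Render the stock screener widget (no input).',
      parameters: { type: 'object', properties: {} },
    },
  },
];
```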
-
I’d like to be able to quickly pre-configure BoltAI from:
1. My current environment: import common environment variables for the various providers and, if that succeeds, automatically configure the r…
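A minimal sketch of the "import from my current environment" idea, assuming the app looks for the widely used variable names below; which ones BoltAI would actually check is an open question.

```ts
// Hypothetical sketch: detect providers from commonly used env var names.
const PROVIDER_ENV_VARS: Record<string, string> = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  groq: 'GROQ_API_KEY',
  mistral: 'MISTRAL_API_KEY',
};

// Returns the providers whose key is present, so they can be pre-configured.
function detectProviders(env: NodeJS.ProcessEnv = process.env): Record<string, string> {
  const found: Record<string, string> = {};
  for (const [provider, varName] of Object.entries(PROVIDER_ENV_VARS)) {
    const value = env[varName];
    if (value) found[provider] = value;
  }
  return found;
}
```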
-
!Warning! Even if you pass incognito=true, it still uses Groq for text files! Actually, the incognito flag is not used at all; it just uses Groq for text files and llama for image files.
Shit code, a l…
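For what it's worth, a minimal sketch of what actually honoring the flag could look like; function and model names here are made up for illustration.

```ts
// Hypothetical sketch: route to a local model whenever incognito is set,
// instead of ignoring the flag. Model identifiers are illustrative only.
type FileKind = 'text' | 'image';

function pickModel(kind: FileKind, incognito: boolean): string {
  if (incognito) {
    // Nothing should leave the machine in incognito mode.
    return kind === 'image' ? 'ollama/llava' : 'ollama/llama3';
  }
  // Reported current behavior: Groq for text files, a llama model for images.
  return kind === 'image' ? 'llama-vision' : 'groq/llama3-70b';
}
```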