-
Add a feature in app.py where the LLM acts as a fashion stylist and decides whether to keep or return an item.
- Input: user-uploaded image
- LLM responds with a keep-or-return decision and justi…
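A minimal sketch of how this feature could look, assuming the OpenAI vision-capable Chat Completions API. The prompt wording and the helper names (`build_stylist_messages`, `parse_decision`) are illustrative, not from the original request:

```python
import base64
import re

def build_stylist_messages(image_bytes: bytes) -> list:
    """Build a chat payload asking the model for a keep/return verdict on an uploaded image."""
    b64 = base64.b64encode(image_bytes).decode()
    return [{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "You are a fashion stylist. Reply with KEEP or RETURN, then a short justification."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
        ],
    }]

def parse_decision(reply: str) -> str:
    """Extract the keep/return verdict from the model's free-text reply."""
    match = re.search(r"\b(keep|return)\b", reply, re.IGNORECASE)
    return match.group(1).lower() if match else "unclear"

# The messages would then be sent with something like:
# client.chat.completions.create(model="gpt-4-vision-preview",
#                                messages=build_stylist_messages(image_bytes))
print(parse_decision("KEEP - the fit and colour suit you well."))  # keep
```

Parsing the verdict out of free text keeps the Flask/Streamlit side of app.py simple; the justification can be shown to the user verbatim.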
-
In your opinion, can this approach be applied to the Montezuma's Revenge environment?
-
The `OpenAIChatSettings` type is missing an option for `response_format`.
Example:
```typescript
model: openai.ChatTextGenerator({
  model: 'gpt-4-vision-preview',
  maxGeneration…
```
-
The leaderboard says the systems use a model from the GPT-4-Turbo family. Can you please clarify?
-
Whose API is this project using? Is it using the gpt-4-vision-preview model?
-
This gpt-4-vision sample works with the sample image provided in the sample code: https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/KernelSyntaxExamples/Example68_GPTVision.cs
Howev…
-
I have a MacBook M1 Pro with 16 GB of RAM.
I used the following command:
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
and went to the frontend to try to chat or to generate a…
-
First of all, I greatly appreciate the work you've done on BetterChatGPT and its powerful capabilities. Today, I'd like to request two new features that I believe could significantly enhance…
-
When using GetChatMessageContentAsync, I want to change the model in the settings.
I am using DI, which registers the default model along with the token.
I have a case where, in one of the functiona…
-
### Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library
### Describe the bug
The response from ChatGPT unexpectedly cuts off if …
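One common cause of replies that cut off mid-sentence is the completion hitting the token limit rather than a library bug. A hedged sketch of how to detect this, assuming the standard Chat Completions response shape (the `is_truncated` helper name is illustrative):

```python
def is_truncated(response: dict) -> bool:
    """Return True when the API stopped generating because it hit the token limit.

    Assumes the standard Chat Completions response shape: `finish_reason`
    is "length" when the reply was cut off by max_tokens, and "stop" when
    the model finished naturally.
    """
    return any(c.get("finish_reason") == "length"
               for c in response.get("choices", []))

# Illustrative response fragment, not real API output.
cut_off = {"choices": [{"finish_reason": "length",
                        "message": {"role": "assistant",
                                    "content": "The answer is"}}]}
print(is_truncated(cut_off))  # True
```

When this returns True, raising `max_tokens` (or continuing the conversation with the partial reply) is usually the fix; if `finish_reason` is "stop" and the text still looks incomplete, a library-side issue is more plausible.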