-
```bash
2024-10-23 11:54:32,830 - _client.py[line:1038] - INFO: HTTP Request: GET http://127.0.0.1:7862/sdfiles/download?filename=&save_filename= "HTTP/1.1 200 OK"
2024-10-23 11:54:32.832 | DEBUG …
```
-
OpenAI added support for token stats in a streamed response. It would be great to have a similar feature in Ollama.
https://community.openai.com/t/usage-stats-now-available-when-using-streaming-with-the…
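For context, when a streamed OpenAI request sets `stream_options={"include_usage": True}`, the API appends a final chunk whose `choices` list is empty and whose `usage` object carries the token counts. A minimal stdlib-only sketch of pulling that usage out of the SSE `data:` payloads (the sample chunk and its token numbers are hand-written illustrations, not captured API output):

```python
import json

def extract_usage(sse_data_lines):
    """Pull token counts out of a stream of SSE `data:` payloads."""
    usage = None
    for line in sse_data_lines:
        if line.strip() == "data: [DONE]":
            break
        chunk = json.loads(line[len("data: "):])
        # With include_usage, the final chunk has empty choices and a usage object.
        if chunk.get("usage"):
            usage = chunk["usage"]
    return usage

# Illustrative final chunk shape (token numbers are made up):
final_chunk = "data: " + json.dumps({
    "choices": [],
    "usage": {"prompt_tokens": 9, "completion_tokens": 12, "total_tokens": 21},
})
print(extract_usage([final_chunk, "data: [DONE]"]))
```

An Ollama equivalent would presumably attach the same kind of trailing stats object to its stream.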
-
## Issue
In our logs service, we see many `StreamChat error code -1: QueryUsers failed with error: "",` errors that result from queryUsers().
Log example:
![image](ht…
-
# Livestream
Test what's already done.
Pending Tasks:
- [ ] UI for Event/Stream Creation: Develop a user interface to facilitate the creation of events or streams, aligning with the [NIP-…
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
I have an LLM model (specifically from Anthropic) and I want to use the native tools fro…
-
An error occurred on the server while streaming results.
`2024-11-16 00:44:25.918 [error] [Microsoft.Azure.CopilotForAzure.AIService.Orchestration.OrchestratorBase] Error while processing request.
Sy…
-
### Description of the bug:
```python
def run_auto_function_calling():
    """
    Function calls naturally fit in to [multi-turn chats](https://ai.google.dev/api/python/google/generativeai/Gener…
```
-
**Describe the bug:**
In the AI assistant, whether streaming in settings is true or false, the body of the request sent to the LLM does not have a "stream" field, so the AI assistant could not get a streaming respon…
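For reference, a minimal sketch (stdlib only; the field names follow the OpenAI-compatible chat-completions format, and the model name is a placeholder) of the request body such a client would need to send for the server to stream chunks:

```python
import json

def build_chat_request(prompt: str, stream: bool) -> dict:
    """Build an OpenAI-compatible chat-completions request body."""
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # must be present for the server to stream chunks
    }

body = build_chat_request("hello", stream=True)
print(json.dumps(body))
```

The reported bug is that this `"stream"` key is omitted entirely, so the backend falls back to a non-streamed response regardless of the setting.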
-
Something like this should be in the Next.js CSP config:
connect-src 'self' https://chat.stream-io-api.com wss://chat.stream-io-api.com ${process.env.NEXT_PUBLIC_API_URL || ''};
-
Thanks for sharing this amazing repo.
Is it possible to enable "stream" mode for GPT? I tried to add the parameter like "streaming: true", but it did not work. Any tips for that? Thanks in advance.…
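A minimal sketch of requesting streamed output with the OpenAI Python SDK; whether the repo in question forwards this flag (and under what name) is an assumption, but at the SDK level the parameter is `stream=True` rather than `streaming: true`:

```python
def stream_completion(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Print a chat completion as it streams in and return the full text."""
    # Deferred import so the sketch can be loaded without the SDK installed;
    # actually running it requires `pip install openai` and OPENAI_API_KEY.
    from openai import OpenAI

    client = OpenAI()
    parts = []
    for chunk in client.chat.completions.create(
        model=model,  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # the flag that turns on chunked streaming
    ):
        delta = chunk.choices[0].delta.content if chunk.choices else None
        if delta:
            parts.append(delta)
            print(delta, end="", flush=True)
    return "".join(parts)
```

If the repo wraps the SDK, the fix is usually making sure its own option is forwarded into this `stream` parameter.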