-
The common approach seems to be:
- do nothing if the exception is handled by the instrumented library (retries, etc.)
- for unhandled (by the client lib) exceptions:
  - set span status to error
  - se…
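
As a rough illustration of that convention (not any particular library's actual implementation), here is a minimal sketch using the OpenTelemetry Python API; the span name and wrapper function are placeholders:

```python
from opentelemetry import trace
from opentelemetry.trace import Status, StatusCode

tracer = trace.get_tracer(__name__)

def instrumented_call(do_request):
    # Hypothetical wrapper around a client-library call. The context
    # manager records exceptions by default; that is disabled here to
    # show the convention explicitly.
    with tracer.start_as_current_span(
        "client.request",
        record_exception=False,
        set_status_on_exception=False,
    ) as span:
        try:
            return do_request()
        except Exception as exc:
            # Unhandled by the client lib: mark the span as errored,
            # attach the exception event, and re-raise.
            span.set_status(Status(StatusCode.ERROR, str(exc)))
            span.record_exception(exc)
            raise
```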
-
I think it would be quite useful to have subcommands in the CLI that act as a client to the API, allowing you to, e.g. (a rough sketch follows the list below):
- [x] Install/setup models with the gallery
- [ ] Run inference quickly (testing)
- …
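
Purely as a sketch of what such a subcommand could look like, here is a minimal install command built with argparse and requests. The `lai` program name, the `--address` flag, and the `/models/apply` endpoint shape are assumptions for illustration, not the project's actual CLI or confirmed API:

```python
# Hypothetical "install" subcommand sketch; endpoint path and payload
# shape are assumptions about the gallery API, not confirmed.
import argparse
import requests

def main():
    parser = argparse.ArgumentParser(prog="lai")
    sub = parser.add_subparsers(dest="command", required=True)

    install = sub.add_parser("install", help="install a model from the gallery")
    install.add_argument("model_id", help="gallery model id, e.g. gallery@model-name")
    install.add_argument("--address", default="http://localhost:8080",
                         help="base URL of the running API server")

    args = parser.parse_args()
    if args.command == "install":
        # Assumed endpoint: POST /models/apply with the gallery id.
        resp = requests.post(f"{args.address}/models/apply",
                             json={"id": args.model_id}, timeout=30)
        resp.raise_for_status()
        print(resp.json())

if __name__ == "__main__":
    main()
```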
-
Considering that tips like "you shouldn't federate because some implementations ignore deletions" and "illustrations should be Local only because some implementations publish them without respecting AI-training opt-outs" have gained fairly wide support, not respecting this is essentially the same thing, so in the end I do think it should be supported (rather than having large servers retreat into Local only).
_Originally posted by @r-ca in https…
-
Got this error when trying to use the node.
![image](https://github.com/user-attachments/assets/bfdfb19e-074a-4f4c-ac69-df5ed064dd99)
Following someone's suggestion, I tried to run install_req.bat …
-
- [x] I have searched the existing issues
### Is your feature request related to a problem? Please describe it
I'm switching between OpenAI and local OpenAI-compatible endpoints a lo…
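
A minimal sketch of one way to make that switch painless, assuming the app talks to any OpenAI-compatible server through the official `openai` Python client; the environment-variable names below are illustrative, not from the original request:

```python
import os
from openai import OpenAI

# Point the same client at OpenAI or at a local OpenAI-compatible
# server purely via environment variables (names are illustrative).
client = OpenAI(
    base_url=os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1"),
    api_key=os.environ.get("LLM_API_KEY", "sk-..."),
)

resp = client.chat.completions.create(
    model=os.environ.get("LLM_MODEL", "gpt-4o-mini"),
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```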
-
👋 This dashboard summarizes my activity on the repository, including available improvement opportunities.
## Recommendations
_Last analysis: Sep 12 | Next scheduled analysis: Sep 19_
### Open
✅…
-
### Description
We hope to expand our AI chat capabilities by adding support for [GROQ](https://groq.com/) alongside our existing integrations with the OpenAI, Gemini, and Anthropic (Claude) APIs. This …
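
One detail that could make this integration cheap: Groq serves an OpenAI-compatible API, so an existing OpenAI code path can often be reused by swapping the base URL. A minimal sketch of a provider-dispatch approach; the provider table, env-var names, and model id are assumptions for illustration:

```python
import os
from openai import OpenAI

# Illustrative provider table. Groq exposes an OpenAI-compatible API,
# so it can share the OpenAI code path with a different base URL.
PROVIDERS = {
    "openai": ("https://api.openai.com/v1", "OPENAI_API_KEY"),
    "groq": ("https://api.groq.com/openai/v1", "GROQ_API_KEY"),
}

def chat(provider: str, model: str, prompt: str) -> str:
    base_url, key_var = PROVIDERS[provider]
    client = OpenAI(base_url=base_url, api_key=os.environ[key_var])
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# Example (model id is an assumption; check Groq's current model list):
# print(chat("groq", "llama-3.1-8b-instant", "Hello!"))
```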
-
**Is your feature request related to a problem? Please describe:**
Hi, I modified `.env.local` with:
```bash
# You only need this environment variable set if you want to use oLLAMA models
#EXAMPLE…
-
[Ollama](https://ollama.com/) is a native app with support for many local models. The [API](https://github.com/ollama/ollama/blob/main/docs/api.md) looks straightforward. I don't know the differences betw…
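
For a sense of how small that surface is, here is a minimal sketch of a non-streaming call to Ollama's documented `/api/generate` endpoint; the model name assumes you have already pulled it locally:

```python
import requests

# Minimal non-streaming call to a locally running Ollama server.
# Assumes `ollama serve` is running and the model has been pulled,
# e.g. with `ollama pull llama3`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```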
-
I need to use PyGwalker locally and don't want to use OpenAI for Q&A. However, I can also run LLM models locally. How can I combine that local model with PyGwalker Q&A?