-
* Frequent hallucinations: it states "facts" that aren't true in plausible-sounding ways. Always fact-check its answers.
* Suggestible: avoid leading questions, since they can frequently trigger hallucinations…
-
### Non-ChatGPT bug
- [X] This issue does not occur on chat.openai.com and only occurs on this app.
### Version
1.1.0
### Bug description
When I ask anything in the chat window, it shows an error…
-
https://chat.openai.com/g/g-Ej5zYQRIB-academic-writer-professional-version doesn't work.
-
I'd like to add [OpenAPI extensions](https://swagger.io/docs/specification/openapi-extensions/) to endpoints, especially for [GPTs Actions](https://platform.openai.com/docs/actions/consequential-flag)…
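As a concrete illustration, GPT Actions already consume one such extension, `x-openai-isConsequential`, which controls whether the GPT must ask the user before each call. A minimal sketch of a spec carrying it (the endpoints and operation names here are made up; only the extension key itself is documented by OpenAI):

```yaml
# Hypothetical endpoints for illustration; x-openai-isConsequential is the
# real extension described in the GPTs Actions docs linked above.
paths:
  /todos/{id}:
    delete:
      operationId: deleteTodo
      # Destructive call: the GPT asks the user before every invocation
      x-openai-isConsequential: true
    get:
      operationId: getTodo
      # Read-only call: the user may choose "always allow"
      x-openai-isConsequential: false
```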
-
I have not inspected the code, but here is why I suspect this.
I created an assistant that should explicitly search for "University of Florida":
![image](https://github.com/huggingface/chat-ui/…
-
It would be amazing if there were APIs that exposed OpenRecall content for use as a RAG source by another LLM (e.g., Ollama, Dify, or ChatGPT GPTs using functions), enabling questions like "what's the last email i se…
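To make the idea concrete, here is a sketch of how such an API could be surfaced to a function-calling LLM. OpenRecall exposes no such endpoint today; the tool name, schema, and helper below are all hypothetical:

```python
# Hypothetical tool definition an LLM client could register to query an
# OpenRecall search endpoint. Nothing here exists in OpenRecall yet;
# the names and schema are illustrative only.
RECALL_SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "search_recall",
        "description": "Search captured screen/OCR history in OpenRecall.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "Free-text query, e.g. 'last email I sent'",
                },
                "limit": {"type": "integer", "default": 5},
            },
            "required": ["query"],
        },
    },
}

def build_rag_prompt(question: str, snippets: list[str]) -> str:
    """Fold snippets returned by the (hypothetical) search endpoint
    into a grounded prompt for the downstream LLM."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The LLM would call `search_recall`, the client would forward the query to OpenRecall, and the results would be folded back in via `build_rag_prompt`.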
-
Hi,
One of the things we did not put into the grant application was the idea of having a chatbot for GlyGen.org, mainly because it was too complicated at the time.
With the recently introduced GPTs…
-
For now, writing an LLM application YAML is somewhat complex; there are things we can do to simplify it.
1. Create default dependent resources (CRDs) via the controller rather than the user, such as Prompt, LLMChain, Re…
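A simplified user-facing custom resource might then look like the sketch below, with the controller fanning out the dependent Prompt, LLMChain, and related resources. The `kind`, API group, and field names are hypothetical:

```yaml
# Hypothetical top-level CR; the controller would derive the dependent
# Prompt and LLMChain CRDs from these fields instead of the user
# authoring each one by hand.
apiVersion: example.com/v1alpha1   # illustrative group/version
kind: Application
metadata:
  name: chat-demo
spec:
  llm: gpt-3.5-turbo                        # controller creates the LLM + LLMChain resources
  prompt: "You are a helpful assistant."    # controller wraps this into a Prompt CRD
```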
-
### Search for answers in existing issues
- [X] I have searched the existing issues and found no similar issue.
### Feature description
https://github.com/linexjlin/GPTs
This project provides many interesting prompts.
### Motiv…
ic4y updated 7 months ago
-