-
Refer to the openai-translator/openai-translator project.
Use the ollama/qwen:7b model to do the translation.
![image](https://github.com/SvenZhao/var-translation/assets/15955617/96b15c2c-6c84-4cb8-a109-ab567f28e597)
-
I was following this guide to implement a custom local LLM provider, https://www.promptfoo.dev/docs/providers/python, and have run into some issues that I would love to debug.
My implementation:
…
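For context, the guide linked above expects the provider file to expose a `call_api` function that returns a dict with an `output` key (or an `error` key on failure). A minimal sketch of that shape, with a placeholder body instead of a real model call:

```python
# custom_provider.py - minimal sketch of a promptfoo Python provider.
# The call_api signature follows the promptfoo docs; the body here is a
# stand-in that echoes the prompt rather than calling an actual model.

def call_api(prompt, options=None, context=None):
    # options/context are supplied by promptfoo; unused in this stub
    try:
        output = f"echo: {prompt}"  # replace with a real model call
        return {"output": output}
    except Exception as exc:
        # promptfoo surfaces the "error" key in its results table
        return {"error": str(exc)}
```

A stub like this is useful for isolating whether a problem is in the provider wiring or in the model call itself.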
-
# URL
- https://arxiv.org/abs/2311.18805
# Affiliations
- Qi Cao, N/A
- Takeshi Kojima, N/A
- Yutaka Matsuo, N/A
- Yusuke Iwasawa, N/A
# Abstract
- While Large Language Models (LLMs) have ac…
-
See:
https://github.com/gocodebox/lifterlms/pull/928#discussion_r325180248
Basically, we can avoid counting the found rows when executing DB queries if we don't need pagination information.
Rema…
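The idea above can be sketched outside WordPress: issue the extra `COUNT(*)` query only when the caller actually needs pagination info. This is an illustrative sqlite3 stand-in, not the plugin's actual query code:

```python
import sqlite3

def query_posts(conn, limit, offset=0, need_pagination=False):
    """Fetch one page of rows; count the total only when pagination
    info is requested (mirrors skipping the found-rows count)."""
    rows = conn.execute(
        "SELECT id, title FROM posts ORDER BY id LIMIT ? OFFSET ?",
        (limit, offset),
    ).fetchall()
    total = None
    if need_pagination:
        # the extra, avoidable query
        total = conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0]
    return rows, total
```

When `need_pagination` is false, the second query never runs, which is the whole saving.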
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
How can I retrieve relevant knowledge for LLMs? Would you kindly provide the corresponding code, if available?
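One common pattern for this is similarity-based retrieval: embed the documents and the query, then pass the top-scoring documents to the LLM as context. A minimal self-contained sketch, using a toy bag-of-words "embedding" where a real setup would use an embedding model (that substitution is an assumption for illustration):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words vector; a real pipeline would use a
    # sentence-embedding model here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]
```

The retrieved passages would then be prepended to the LLM prompt as grounding context.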
-
Mamba would be a powerful library for LLMs. It also has potential for other applications such as recsys. Will Mamba provide a TensorFlow wrapper version in the future, for more applications and easier use?
-
# URL
- https://arxiv.org/abs/2310.13127
# Affiliations
- Zhihan Zhang, N/A
- Shuohang Wang, N/A
- Wenhao Yu, N/A
- Yichong Xu, N/A
- Dan Iter, N/A
- Qingkai Zeng, N/A
- Yang Liu, N/A
…
-
### Version
v1.12.0
### Describe the bug
Platform: MacOS
**Steps to reproduce:**
1. Have Cody plugin installed in your VS Code editor
2. Login to Cody with Pro user
3. Open a source code file and…
-
Experiment 1
Prompt A: Token wise but _____ foolish.
Results for gpt-3.5-turbo-0613 at temperature=1.
1. knowledgeable
2. knowledgeable
3. Token wise but cash foolish.
4. skillful
5. Resourceful but t…
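An experiment like the one above amounts to sampling the same prompt repeatedly and tallying the completions. A small sketch; the `sample` callable is a stand-in for an actual chat-completion request at temperature=1:

```python
from collections import Counter

def run_experiment(sample, prompt, n=5):
    """Draw n completions for one prompt and tally how often each
    string appears. `sample` is any callable prompt -> completion,
    e.g. a wrapper around a chat-completion API call."""
    completions = [sample(prompt) for _ in range(n)]
    return Counter(completions)
```

Tallying makes repeated completions (like "knowledgeable" above) stand out immediately.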