-
- [ ] [paper-qa/README.md at main · Future-House/paper-qa](https://github.com/Future-House/paper-qa/blob/main/README.md?plain=1)
# PaperQA2
[![GitHub](https://img.shields.io/badge/github-%23121011.s…
-
### Your current environment
```text
PyTorch version: 2.3.0a0+ebedce2
Is debug build: False
CUDA used to build PyTorch: 12.3
ROCM used to build PyTorch: N/A
OS: Ubuntu 22.04.3 LTS (x86_64)
GC…
-
**Is your feature request related to a problem? Please describe.**
We are exploring the use of LaVague for web automation, but the limitation is relying on public-facing models. Can we supp…
-
### System Info
NVIDIA A100-SXM4-80GB
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An official…
-
Hello, thanks for your solution for local LLM deployment in AutoGPT. When I installed it following the README, some steps were unclear and I could not test it successfully. Is there a need to provide the openai …
-
Hi,
I am unable to import LlamaCpp in IPEX
CODE : from ipex_llm.langchain.llms import LlamaCpp
ERROR:
Cell In[5], line 1
----> 1 …
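Until the missing dependency is sorted out, one way to make an import failure like this actionable is to guard it. This is a minimal, generic sketch: `safe_import` is a hypothetical helper, not part of ipex-llm, and the install hint in the comment is the usual ipex-llm instruction, not something from this issue.

```python
import importlib


def safe_import(module: str, name: str):
    """Return attribute `name` from `module`, or None if it cannot be imported."""
    try:
        return getattr(importlib.import_module(module), name)
    except (ImportError, AttributeError):
        return None


# Hypothetical usage for the issue above; ipex-llm itself typically needs
# `pip install --pre ipex-llm[all]` before this import can succeed.
LlamaCpp = safe_import("ipex_llm.langchain.llms", "LlamaCpp")
if LlamaCpp is None:
    print("ipex_llm not importable; check the ipex-llm installation")
```

This keeps the notebook from dying with a raw `ImportError` and surfaces a hint instead.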
-
Hi all,
I am facing the following issue when using HuggingFaceEndpoint with my custom fine-tuned model in my repository "Nithish-2001/RAG-29520hd0-1-chat-finetune", which is public and served with Gradio.
llm_…
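For reference, here is a minimal sketch of the request shape the Hugging Face Inference API expects for a hosted model like the one above. The helper name and its defaults are assumptions for illustration, not LangChain's internals; the `{"inputs": ..., "parameters": ...}` payload shape is the documented Inference API format.

```python
import json


def build_hf_request(repo_id: str, prompt: str, max_new_tokens: int = 128):
    """Build the URL and JSON body for a text-generation call to the HF Inference API.

    Hypothetical helper: shows what a HuggingFaceEndpoint-style client sends.
    """
    url = f"https://api-inference.huggingface.co/models/{repo_id}"
    body = json.dumps(
        {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}
    )
    return url, body


# Repo id taken from the issue above.
url, body = build_hf_request("Nithish-2001/RAG-29520hd0-1-chat-finetune", "Hello")
```

Inspecting the URL and body this way can help confirm the repo id and payload before blaming the endpoint itself.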
-
### System Info
- CPU x86_64
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the…
-
res = guardrails.invoke({"input": "How do I cook meat"})
I'm defining a chain, not using it! The LLM is local, while the LLM in the YAML file is OpenAI.
chain = print_func | (guardrails | llm) | …
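The distinction between defining a chain and invoking it can be sketched with a minimal stand-in for LCEL's pipe operator. This is not LangChain code: `Runnable` here is a toy re-implementation, and `print_func`, `guardrails`, and `llm` are hypothetical stand-ins for the objects in the issue.

```python
class Runnable:
    """Toy version of the LCEL pipe pattern: `|` composes, `invoke` executes."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Composing with `|` only builds a new Runnable; nothing runs here.
        return Runnable(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        # Execution happens only when invoke() is called.
        return self.fn(x)


print_func = Runnable(lambda x: x)            # stand-in for the issue's print_func
guardrails = Runnable(lambda x: x["input"])   # stand-in for the guardrails step
llm = Runnable(str.upper)                     # stand-in for the local LLM

chain = print_func | (guardrails | llm)       # definition only: no call happens
result = chain.invoke({"input": "How do I cook meat"})  # → "HOW DO I COOK MEAT"
```

This is why `chain = ...` alone never touches the OpenAI model named in the YAML file: no component runs until `invoke` is called on the composed chain.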
-
### System Info
tensorrt 10.0.1
tensorrt-cu12 10.0.1
tensorrt-cu12-bindings 10.0.1
tensorrt-cu12-libs 10.0.1
tensorrt-llm 0.10.0.dev2024050700
…