Hi
I self hosted a NIM of Llama3 8B and would like to use this config to test out guardrails:
```yaml
models:
  - type: main
    engine: nim
    model: meta/llama3-8b-instruct
    parameters:
      base_url: http://xxx.us-east-1.elb:8000/v1
```
Since this configuration requires langchain-nvidia-ai-endpoints, I installed it by following the instructions here.
After running pip install -U langchain-nvidia-ai-endpoints, I got this error:
langchain-nvidia-ai-endpoints 0.3.0 requires langchain-core<0.4,>=0.3.0, but you have langchain-core 0.2.40 which is incompatible.
With the latest version of guardrails (0.10.0), the langchain-core range it requires cannot be satisfied together with the one required by langchain-nvidia-ai-endpoints; upgrading langchain-core to 0.3.0 instead produces:
nemoguardrails 0.10.0 requires langchain-core!=0.1.26,<0.3.0,>=0.2.14, but you have langchain-core 0.3.0 which is incompatible.
I cannot find any langchain-core version that is compatible with both langchain-nvidia-ai-endpoints 0.3.0 and nemoguardrails 0.10.0. Does anyone have an idea how to address this issue?
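To confirm the two ranges really are mutually exclusive, here is a small check using the packaging library; the specifier strings are copied directly from the two pip error messages above:

```python
from packaging.specifiers import SpecifierSet

# Requirement on langchain-core from langchain-nvidia-ai-endpoints 0.3.0
nim_req = SpecifierSet(">=0.3.0,<0.4")
# Requirement on langchain-core from nemoguardrails 0.10.0
guardrails_req = SpecifierSet(">=0.2.14,<0.3.0,!=0.1.26")

# Intersection of the two specifier sets
combined = nim_req & guardrails_req

# No version can satisfy both: one side demands >=0.3.0, the other <0.3.0
print(combined.contains("0.2.40"))  # False (rejected by nim_req)
print(combined.contains("0.3.0"))   # False (rejected by guardrails_req)
```

So no single langchain-core release can satisfy both constraints; one of the two packages has to move its pin (or be held back to an older release) for the environment to resolve.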