austinmw opened 1 year ago

Hi, is there any way to use a SageMaker LLM endpoint instead of OpenAI for the engine?

Hi @austinmw! Apologies for the delay in responding. In version 0.2.0 you should be able to use `sagemaker_endpoint` as the LLM engine (https://github.com/NVIDIA/NeMo-Guardrails/blob/main/docs/user_guide/configuration-guide.md#supported-llm-models). We haven't tested this path ourselves yet, though, and the prompts will most likely need some tweaking to work well. Version 0.3.0, due for release at the end of this month, will add more options for customizing the prompts for any type of LLM. Happy to assist you in getting this to work.
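For reference, a `config.yml` model entry along these lines might work, based on the configuration guide linked above. This is a hedged sketch, not a tested configuration: the parameter names (`endpoint_name`, `region_name`) are assumptions borrowed from LangChain's SageMaker endpoint wrapper, and the endpoint name is hypothetical.

```yaml
models:
  - type: main
    engine: sagemaker_endpoint
    parameters:
      endpoint_name: my-llm-endpoint  # hypothetical SageMaker endpoint name
      region_name: us-east-1          # assumed region; match your deployment
```

You would still need the appropriate AWS credentials available in the environment, and, as noted above, prompt tweaks for the specific model behind the endpoint.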