Closed d-shree closed 2 weeks ago
@d-shree: can you provide more details (how are you creating the LLMRails instance)?

@drazvan This is how I create the LLMRails instance:
from nemoguardrails import LLMRails, RailsConfig
from langchain_openai import ChatOpenAI  # import paths assumed

client = ChatOpenAI(
    api_key=OPEN_AI_KEY,
)
config = RailsConfig.from_content(
    colang_content=colang_flow,
    yaml_content=config_content,
)
rails = LLMRails(config=config, llm=client)
The custom action is defined in actions.py with the @action decorator.
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8009
CMD ["uvicorn", "src.endpoints:app", "--host", "0.0.0.0", "--port", "8076"]
@d-shree: actions are loaded automatically from the actions.py file when the configuration is loaded from a path. If you're creating the RailsConfig instance with from_content, you need to manually register the actions using rails.register_action. Let me know if this solves the issue with the action not found.
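The behavior described above can be sketched with a toy dispatcher. This is a hypothetical simplification to illustrate why an unregistered action yields an "Action ... not found" message — it is not NeMo-Guardrails' actual internals, and the extract_masked_text body here is a made-up stand-in:

```python
class ActionDispatcher:
    """Toy registry mapping action names to callables."""

    def __init__(self):
        self._actions = {}

    def register_action(self, fn, name=None):
        # Same idea as rails.register_action: store the callable under a name.
        self._actions[name or fn.__name__] = fn

    def execute(self, name, *args, **kwargs):
        if name not in self._actions:
            return f"Action '{name}' not found."
        return self._actions[name](*args, **kwargs)


def extract_masked_text(text):
    # Hypothetical stand-in for the real custom action in actions.py.
    return text.replace("***", "[MASKED]")


dispatcher = ActionDispatcher()

# Before registration: same symptom as in the Docker container.
print(dispatcher.execute("extract_masked_text", "ssn: ***"))
# Action 'extract_masked_text' not found.

dispatcher.register_action(extract_masked_text)
print(dispatcher.execute("extract_masked_text", "ssn: ***"))
# ssn: [MASKED]
```

When the config comes from a path, the library performs the registration step itself by scanning actions.py; with from_content there is no path to scan, so the registry stays empty until register_action is called explicitly.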
This worked, but it's strange that this error did not show up while testing locally, which is why I had a hard time figuring out the exact cause of the issue.
I have created a FastAPI-based application that uses NeMo-Guardrails for handling the conversation flow. This works fine locally; however, when I run the application as a Docker container, I see the following error:

Bot message: "Action 'extract_masked_text' not found."
where extract_masked_text is a custom action I have defined in the actions.py file.

I have also noticed that the prompts used for generate_user_intent pick up very different examples for the input, which is a zip code.

The examples picked up in the prompt on my local machine:

The examples picked up in the Docker container:
Can somebody help me figure out what is going wrong here?