NVIDIA / NeMo-Guardrails

NeMo Guardrails is an open-source toolkit for easily adding programmable guardrails to LLM-based conversational systems.

Not able to add Actions with Custom LLM Engine with nemo guardRails #532

Open alwaid-biswapriya opened 2 months ago

alwaid-biswapriya commented 2 months ago

I am trying to integrate NeMo Guardrails with our hosted LLM model. For that, I have created a custom LLM class:

    import uuid

    import httpx
    from langchain_core.language_models import BaseLanguageModel
    from langchain_core.outputs import Generation, LLMResult, RunInfo
    from openai import OpenAI

    from nemoguardrails.llm.providers import register_llm_provider


    class CustomLLM(BaseLanguageModel):

        async def agenerate_prompt(self, prompts, stop=None, callbacks=None, **kwargs):
            print("Function agenerate_prompt")
            print("----------------------------")
            print(str(prompts[0].text))
            print("----------------------------")

            http_client = httpx.Client(verify=False)
            client = OpenAI(
                base_url="<Our URL>",
                http_client=http_client,
                api_key="<Our key>",
            )

            # Call our hosted model with the rendered prompt.
            completion = client.completions.create(
                model="llamaguard-7b",
                max_tokens=50,
                prompt=prompts[0].text,
                stream=False,
            )

            print(" LLM Response ")
            print("------------------------------")
            print(completion.choices[0].text)
            print("------------------------------")

            # llm_output must be a dict, and RunInfo accepts only run_id.
            return LLMResult(
                generations=[[Generation(text=completion.choices[0].text)]],
                llm_output={"model_name": "llamaguard-7b"},
                run=[RunInfo(run_id=uuid.uuid4())],
            )


    register_llm_provider("customeLLM", CustomLLM)

And my config.yaml is as given below:

models:
  - type: main
    engine: customeLLM
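To illustrate how the `engine: customeLLM` entry in config.yaml gets resolved, here is a simplified sketch of the name-to-class registry mechanism that `register_llm_provider` participates in. This is an illustrative stand-in, not NeMo Guardrails' actual internals; the names `resolve_engine` and `_providers` are hypothetical:

```python
# Illustrative registry mapping an engine name (as used in config.yaml)
# to an LLM class. Not the actual NeMo Guardrails implementation.
_providers: dict = {}


def register_llm_provider(name, cls):
    """Register an LLM class under an engine name."""
    _providers[name] = cls


def resolve_engine(name):
    """Look up the class for the `engine:` value from config.yaml."""
    try:
        return _providers[name]
    except KeyError:
        raise ValueError(f"Unknown LLM engine: {name!r}") from None


class CustomLLM:  # stand-in for the BaseLanguageModel subclass above
    pass


register_llm_provider("customeLLM", CustomLLM)
print(resolve_engine("customeLLM").__name__)
```

The point of the indirection is that config.yaml only ever holds a string; the registration call must run before the config is loaded so the lookup succeeds.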

I am trying to add an action rail that invokes an action once the intent is matched.

define user greeting
  "Hey there!"
  "How are you?"
  "What's up?"

define user general question
  "who is the president of America"

define flow
  user general question
  $answer = execute bot_response(inputs=$last_user_message)
  bot $answer
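For `execute bot_response(...)` to work, the action must also be registered with the runtime (for example via an `actions.py` file next to config.yml, or by passing the function to `LLMRails.register_action`). A minimal sketch of the action body, assuming a hypothetical implementation that simply echoes the user message:

```python
import asyncio


async def bot_response(inputs: str) -> str:
    """Hypothetical action body for `execute bot_response(...)` in the flow.

    Real logic would query a backend; this sketch just echoes the input.
    """
    return f"(demo answer for: {inputs})"


# Direct invocation, outside the guardrails runtime, for illustration.
print(asyncio.run(bot_response("who is the president of America")))
```

With an `LLMRails` instance, the same function would be wired in with `rails.register_action(bot_response, name="bot_response")` before generating a response.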

But this action is not getting invoked. Is it because of the custom LLM engine which we have specified?

Pouyanpi commented 1 month ago

Hi @alwaid-biswapriya, it seems your CustomLLM implementation has some issues. Have you been able to resolve them, or are you still facing this problem?