FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Conversational Agent example but with ChatHuggingface failed #851

Open · predoctech opened this issue 1 year ago

predoctech commented 1 year ago

Describe the bug
The current Conversational Agent in the Marketplace is built with ChatOpenAI. I tried to replace that component with ChatHuggingface and use the hosted HF model "microsoft/DialoGPT-large". The chain failed with the message:

The following model_kwargs are not used by the model: ['return_full_text'] 
(note: typos in the generate arguments will also show up in this list)

To Reproduce

  1. Go to 'Marketplace - Conversational Agent'
  2. Replace the "ChatOpenAI" component with "ChatHuggingface"
  3. Specify an HF chat model
  4. Run the chain in the message window

Expected behavior
A response from the HF model.

Setup

Additional context
Using "ChatHuggingface" in other chain examples (e.g., the Retrieval chain) causes similar errors.

chungyau97 commented 1 year ago

Hi @predoctech,

I tested the model with pure LangChain JS and I get the same error you're describing:

    LLM run errored with error: "The following model_kwargs are not used by the model: ['return_full_text']
    (note: typos in the generate arguments will also show up in this list)"

    // Imports (LangChain JS)
    import { HuggingFaceInference } from 'langchain/llms/hf'
    import { SerpAPI } from 'langchain/tools'
    import { Calculator } from 'langchain/tools/calculator'
    import { initializeAgentExecutorWithOptions } from 'langchain/agents'

    const run = async (question) => {
        // Model
        const model = new HuggingFaceInference({
            model: 'microsoft/DialoGPT-medium',
            apiKey: process.env.HUGGINGFACEHUB_API_KEY
        })

        // Tools
        const tools = [
            new SerpAPI(process.env.SERPAPI_API_KEY, {
                location: 'Austin,Texas,United States',
                hl: 'en',
                gl: 'us',
            }),
            new Calculator(),
        ]

        // Conversational agent over the tools and model
        const executor = await initializeAgentExecutorWithOptions(tools, model, {
            agentType: 'chat-conversational-react-description',
            verbose: true,
        })

        return await executor.call({ input: question })
    }

References: LangChain JS Conversational, LangChain JS HuggingFaceInference

LangChainJS Open Issue: https://github.com/hwchase17/langchainjs/issues/2177
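
For context, the error appears to come from how LangChain JS builds the Hugging Face text-generation request: it always injects a `return_full_text` parameter, which some hosted models (including DialoGPT) reject. A paraphrased sketch of the relevant call in `HuggingFaceInference._call` (not an exact copy of the library source):

    // inside HuggingFaceInference._call (paraphrased)
    // hf is an HfInference client created from the configured API key
    const res = await hf.textGeneration({
        model: this.model,
        inputs: prompt,
        parameters: {
            // hard-coded to mimic OpenAI-style output; models that don't
            // accept this kwarg fail with the error reported above
            return_full_text: false,
            temperature: this.temperature,
            max_new_tokens: this.maxTokens
        }
    })
    return res.generated_text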

HenryHengZJ commented 1 year ago

The fix for this is to extend the HuggingFace class and add a new JSON field that lets users pass new/custom model kwargs.
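
A minimal sketch of that idea, assuming the `@huggingface/inference` client and LangChain JS's `LLM` base class; the class name, the `modelKwargs` field, and the way kwargs are forwarded are illustrative, not Flowise's actual implementation:

    import { LLM } from 'langchain/llms/base'
    import { HfInference } from '@huggingface/inference'

    // Hypothetical wrapper: forwards only user-supplied kwargs, so models that
    // reject 'return_full_text' (e.g. microsoft/DialoGPT-large) still work.
    class CustomHuggingFaceInference extends LLM {
        constructor(fields) {
            super(fields ?? {})
            this.model = fields.model
            this.apiKey = fields.apiKey ?? process.env.HUGGINGFACEHUB_API_KEY
            this.modelKwargs = fields.modelKwargs ?? {}
        }

        _llmType() {
            return 'custom_huggingface_inference'
        }

        async _call(prompt) {
            const hf = new HfInference(this.apiKey)
            const res = await hf.textGeneration({
                model: this.model,
                inputs: prompt,
                // only what the user put in the JSON field is sent
                parameters: { ...this.modelKwargs }
            })
            return res.generated_text
        }
    }

    // Usage: the extra JSON field lets users add or omit kwargs per model
    const model = new CustomHuggingFaceInference({
        model: 'microsoft/DialoGPT-large',
        modelKwargs: { max_new_tokens: 128 }
    })

Exposing `modelKwargs` as a JSON input on the node would let users add or drop parameters per model instead of relying on hard-coded defaults.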

predoctech commented 1 year ago

I just wish to add that when I tested the HF Inference LLM component, the same error shows for the same underlying model. I suppose it all links back to the same HuggingFace base class that @HenryHengZJ mentioned, so a change there would apply equally to ChatHuggingface and HF Inference.

BadrLaajali commented 11 months ago

Did you find a solution to this problem?

DebajitKumarPhukan commented 10 months ago

I am facing the same issue. Hugging Face models don't work with conversational agents. This makes Flowise behave a lot like Langflow, i.e., dependent on OpenAI.

shrutifiske commented 10 months ago

I am facing the same issue with the LocalChatAI model and the Conversational Agent, but it works fine with the LLM Chain. Has anyone found a solution? I am using llama-2-7b-chat.ggmlv3.q4_0.bin with 16 GB RAM and 4 CPUs.

Getting rpc error: code = Unknown desc = inference failed.