Eladlev / AutoPrompt

A framework for prompt tuning using Intent-based Prompt Calibration
Apache License 2.0
1.86k stars 149 forks

Source LLMs #40

Closed YongLD closed 3 months ago

YongLD commented 4 months ago

Can this project use open-source LLMs, such as XComposer or LLaMA? Have you tested these LLMs in the paper?

Eladlev commented 4 months ago

The system supports running local LLMs through LangChain's Hugging Face pipelines.

We didn't test them in the paper; we definitely intend to expand the paper and provide more experiments.
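
For reference, here is a minimal sketch of loading a local model through LangChain's Hugging Face pipeline wrapper (the model id and generation settings are illustrative, not taken from this repo):

    # Minimal sketch, assuming langchain-community and transformers are installed.
    # "gpt2" is only a placeholder; substitute any Hugging Face model you have locally.
    from langchain_community.llms import HuggingFacePipeline

    llm = HuggingFacePipeline.from_model_id(
        model_id="gpt2",                         # any Hugging Face model id
        task="text-generation",
        pipeline_kwargs={"max_new_tokens": 64},  # cap on generated tokens
    )

    print(llm.invoke("Rewrite this instruction to be more specific:"))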

jzyxn commented 3 months ago

How can I use local LLMs?

Eladlev commented 3 months ago

You should change the following in the relevant config file:

    llm:
        type: 'HuggingFacePipeline'
        name: <The name of the model>
        max_new_tokens: <max tokens>

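For example, a filled-in llm section might look like this (the model id below is purely illustrative, not a recommendation):

    llm:
        type: 'HuggingFacePipeline'
        name: 'mistralai/Mistral-7B-Instruct-v0.2'  # any Hugging Face model id
        max_new_tokens: 512
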
If you also want the optimizer to run locally (highly not recommended), then you should change the meta-prompts folder to:

    meta_prompts:
        folder: 'prompts/meta_prompts_completion'

jzyxn commented 3 months ago

Thanks for the reply, I will try to see if it works.

IdoAmit198 commented 2 weeks ago

> Thanks for the reply, I will try to see if it works.

Hi, do you have any updates regarding performance with open-source LLMs?