preset-io / promptimize

Promptimize is a prompt engineering evaluation and testing toolkit.
Apache License 2.0

Using promptimize with other LLMs #24

Open MNIKIEMA opened 10 months ago

MNIKIEMA commented 10 months ago

Hello,

Firstly, I'd like to express my appreciation for the amazing work on this library. It's been instrumental in my projects.

I'm currently exploring how to adapt it for use with other Large Language Models (LLMs) in conjunction with LangChain. Specifically, I'm looking into inheriting from PromptCase to create a new class for SQL query handling.

Here is my current attempt:

class PromptSqlCase(PromptCase):
    def __init__(self, model, user_input, *args, **kwargs):
        # Store the LangChain model before calling super().__init__,
        # since the base class may resolve the executor during init.
        self.model = model
        super().__init__(user_input, *args, **kwargs)

    def get_prompt_executor(self):
        # Hand the LangChain model back as the prompt executor.
        return self.model

    def execute_prompt(self, prompt_str):
        # Delegate the call to the model and keep the raw response.
        self.response = self.prompt_executor(prompt_str)
        return self.response
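To sanity-check the contract this subclass assumes (the base class calling get_prompt_executor() to populate self.prompt_executor), here is a self-contained sketch that needs neither promptimize nor LangChain installed. PromptCaseStub and fake_llm are hypothetical stand-ins I made up for illustration; they are not part of either library's API.

```python
# Hypothetical sketch: exercising the PromptSqlCase contract with stubs,
# so no network call or real LangChain model is required.

class PromptCaseStub:
    """Minimal stand-in for promptimize's PromptCase, under the ASSUMPTION
    that the base class resolves its executor via get_prompt_executor()."""

    def __init__(self, user_input, *args, **kwargs):
        self.user_input = user_input
        # Assumed contract: the base class wires up the executor here.
        self.prompt_executor = self.get_prompt_executor()

    def get_prompt_executor(self):
        raise NotImplementedError


class PromptSqlCase(PromptCaseStub):
    def __init__(self, model, user_input, *args, **kwargs):
        # Set the model before super().__init__, which may call
        # get_prompt_executor() during initialization.
        self.model = model
        super().__init__(user_input, *args, **kwargs)

    def get_prompt_executor(self):
        return self.model

    def execute_prompt(self, prompt_str):
        self.response = self.prompt_executor(prompt_str)
        return self.response


# A plain callable standing in for a LangChain LLM invocation.
fake_llm = lambda prompt: f"SELECT 1; -- echoed: {prompt}"

case = PromptSqlCase(fake_llm, "count users")
print(case.execute_prompt("count users"))
```

The point of the stub is only to show why self.model must be assigned before super().__init__: if the base class resolves the executor inside its constructor, a later assignment would raise an AttributeError.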

However, I'm encountering some challenges:

Integration with Other LLMs: Are there specific considerations or best practices I should be aware of when integrating other LLMs with LangChain using this approach?

Class Implementation: Does my implementation of PromptSqlCase align with the library's design principles? If there are suggested improvements or alternative approaches, I would be grateful to learn about them.

Any guidance or resources you could provide would be greatly appreciated. I'm eager to contribute to the library's versatility and would love to share my findings with the community once I've successfully integrated these features.

Thank you for your time and support.