microsoft / promptflow

Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
https://microsoft.github.io/promptflow/
MIT License
9.15k stars 829 forks

[Contribution Request] Integrate Unify into Promptflow #3560

Closed KatoStevenMubiru closed 3 weeks ago

KatoStevenMubiru commented 1 month ago

Hello Promptflow team,

I'm Kato Steven Mubiru, an engineer and contributor from Unify. I'll be leading the integration of Unify into Promptflow.

As the lead on this integration effort, my plan is to:

  1. Evaluate the compatibility between Unify and Promptflow
  2. Identify any potential conflicts or dependencies
  3. Determine the best method for integration
  4. Create and implement a detailed integration plan
  5. Ensure all documentation and tests are updated accordingly

I'm starting work on this today and will provide regular updates on the progress. If there are any specific Promptflow requirements or considerations you'd like me to be aware of, please let me know.

I look forward to collaborating with the Promptflow team to make this integration successful.

0mza987 commented 1 month ago

Hi @KatoStevenMubiru

Thank you for your contribution request. I have a few questions:

  1. Could you please provide more details about the "Unify" you mentioned? Do you have any documentation or a website that we can refer to for more information?
  2. When you mention "integration", are you referring to contributing a flow example? If so, please refer to this folder and ensure that your example follows a similar structure and content.

KatoStevenMubiru commented 1 month ago

Hi @0mza987,

Thanks for your questions. Here's some clarification:

  1. Unify (https://unify.ai/docs/index.html) is an open-source framework for unified access to various LLMs across providers, offering optimization for quality, speed, and cost-efficiency.

  2. I'm proposing either a community integration or a package to integrate Unify's capabilities with Promptflow, not a specific flow example. This could involve custom components utilizing Unify's features in Promptflow workflows.

Which approach - community integration or package - would better align with Promptflow's architecture and plans?

Let me know if you need any further information.
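To give a feel for the "unified access" idea, here is a minimal pure-Python sketch. The backend functions and the dispatch table are stand-ins invented for illustration, not the real `unify` client API; only the "model@provider" endpoint naming is taken from Unify's docs:

```python
def _openai_backend(prompt: str) -> str:
    # Stand-in for a real OpenAI call.
    return f"[openai] {prompt}"

def _anthropic_backend(prompt: str) -> str:
    # Stand-in for a real Anthropic call.
    return f"[anthropic] {prompt}"

# Endpoints use "model@provider" strings, as in Unify's documentation.
BACKENDS = {
    "gpt-4@openai": _openai_backend,
    "claude-3@anthropic": _anthropic_backend,
}

def generate(prompt: str, endpoint: str) -> str:
    """Route one prompt through a single interface to any provider."""
    if endpoint not in BACKENDS:
        raise ValueError(f"Unknown endpoint: {endpoint}")
    return BACKENDS[endpoint](prompt)
```

The point of the sketch is the single `generate()` entry point: callers switch providers by changing a string, not by changing code.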

0mza987 commented 1 month ago

@KatoStevenMubiru Thanks for the clarification.

Could you please provide a pseudo-code example to demonstrate how users would utilize Promptflow along with the integrated Unify features? This would help us better understand the targeted user scenario.

My initial thought is that a community integration (which, if I understand correctly, involves adding Unify to Promptflow's dependencies) may not be feasible unless it addresses a significant user need and Unify proves to be the most suitable tool for this purpose. At this stage, developing a new package might be a more practical solution.

cc @eniac871 @wangchao1230

KatoStevenMubiru commented 1 month ago

Thank you for your feedback and questions, @0mza987. I understand your concerns about community integration. Let me provide a pseudo-code example that demonstrates how users could leverage Promptflow with integrated Unify features, addressing significant user needs:

# Pseudo-code: the `flow` decorator and the Unify client methods shown
# here are illustrative, not final APIs.
from promptflow import flow, tool
from unify import Unify

class UnifyTool:
    def __init__(self):
        self.unify_client = Unify(api_key="YOUR_API_KEY")

    @tool
    def optimize_llm(self, prompt: str, constraints: dict):
        optimal_model = self.unify_client.select_optimal_model(constraints)
        response = self.unify_client.generate(prompt, model=optimal_model)
        return response

    @tool
    def benchmark_models(self, prompt_set: list, models: list):
        results = self.unify_client.benchmark(prompt_set, models)
        return results

@flow
def adaptive_translation_flow(text: str, target_language: str):
    unify_tool = UnifyTool()

    # Step 1: Determine text complexity
    complexity_prompt = f"Analyze the complexity of this text: {text}"
    complexity_result = unify_tool.optimize_llm(
        complexity_prompt,
        constraints={"max_latency": 1000, "min_quality": 0.7}
    )

    # Step 2: Choose appropriate model based on complexity
    if "high complexity" in complexity_result.lower():
        translation_constraints = {"min_quality": 0.9, "max_cost": 0.05}
    else:
        translation_constraints = {"min_quality": 0.7, "max_cost": 0.02}

    # Step 3: Perform translation
    translation_prompt = f"Translate the following text to {target_language}: {text}"
    translation = unify_tool.optimize_llm(translation_prompt, constraints=translation_constraints)

    # Step 4: Quality check
    quality_check_prompt = f"Evaluate the quality of this translation: Original: {text}, Translation: {translation}"
    quality_score = unify_tool.optimize_llm(
        quality_check_prompt,
        constraints={"max_latency": 1500, "min_quality": 0.8}
    )

    return {
        "original_text": text,
        "translated_text": translation,
        "quality_score": quality_score
    }

# Usage
result = adaptive_translation_flow("Hello, how are you?", "French")
print(result)

# Benchmarking
benchmark_results = UnifyTool().benchmark_models(
    ["Translate 'Hello' to Spanish", "Translate 'Goodbye' to German"],
    ["gpt-3.5-turbo", "gpt-4", "claude-2"]
)
print(benchmark_results)

This example demonstrates how Unify could be integrated into Promptflow to address significant user needs:

  1. Adaptive Model Selection: The flow dynamically selects the most appropriate LLM based on text complexity and user-defined constraints (latency, quality, cost).

  2. Multi-step Workflows: It showcases a multi-step translation process, leveraging different models for complexity analysis, translation, and quality checking.

  3. Performance Optimization: By using Unify's optimize_llm tool, the flow ensures each step uses the best-performing model within given constraints.

  4. Benchmarking: Users can easily benchmark multiple models for specific tasks, helping them make informed decisions about which models to use in their flows.

  5. Cost Management: The flow demonstrates how to adjust model selection based on task requirements, potentially reducing costs for simpler tasks.

  6. Quality Assurance: It includes a built-in quality check step, ensuring the output meets the required standards.
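The constraint-driven selection in point 1 can be sketched without any external dependencies. The model table and its numbers below are invented for illustration; the real `select_optimal_model` would be backed by Unify's live benchmarks:

```python
# Hypothetical model table: latency in ms, quality score in [0, 1],
# cost per call in dollars.
MODELS = {
    "small-model": {"latency": 300, "quality": 0.75, "cost": 0.01},
    "large-model": {"latency": 900, "quality": 0.92, "cost": 0.05},
}

def select_optimal_model(constraints: dict) -> str:
    """Return the cheapest model meeting the given constraints."""
    candidates = [
        name for name, m in MODELS.items()
        if m["quality"] >= constraints.get("min_quality", 0.0)
        and m["latency"] <= constraints.get("max_latency", float("inf"))
        and m["cost"] <= constraints.get("max_cost", float("inf"))
    ]
    if not candidates:
        raise ValueError("No model satisfies the constraints")
    return min(candidates, key=lambda name: MODELS[name]["cost"])
```

With this shape, the flow's two constraint sets (`min_quality: 0.9` for complex text, `min_quality: 0.7` otherwise) naturally route to different models, which is where the cost savings come from.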

Benefits for Promptflow users:

  1. Improved performance and cost-efficiency through intelligent model selection.
  2. Easier management of complex, multi-step LLM workflows.
  3. Built-in benchmarking capabilities for continuous improvement.
  4. Flexibility to adjust model selection as requirements evolve.

KatoStevenMubiru commented 1 month ago

I'm happy to provide additional examples showcasing other Unify features that could benefit Promptflow users. Some areas we could explore further include:

  1. Optimizing large-scale batch processing
  2. Improving performance with caching for repetitive tasks
  3. Using analytics for workflow insights and optimization
  4. Integrating fine-tuning management

If any of these or other aspects interest the Promptflow team, I can prepare more detailed pseudo-code examples. Our aim is to demonstrate how Unify can enhance Promptflow's capabilities and address user needs.

0mza987 commented 1 month ago

Hi @KatoStevenMubiru,

Thank you for the detailed explanation and code example. Based on the sample code, it seems that the Unify features can be seamlessly integrated into our flex flow. Please review the following guidelines and examples and see if they meet your needs:

KatoStevenMubiru commented 1 month ago

Hi @0mza987 ,

Following suggestions from my superiors and our previous discussions, we propose adding a unify.py module to the tools directory of Promptflow. This file will integrate Unify’s functionalities, enabling Promptflow to access a diverse range of LLMs and optimize model selection based on latency, quality, and cost criteria. Our goal is to enhance the flexibility and efficiency of Promptflow through this integration.

Looking forward to your thoughts and any additional requirements you might have.

KatoStevenMubiru commented 1 month ago

Additionally, we aim to add the Unify integration to the Connections section of the Promptflow documentation. This will enhance Promptflow's capabilities by allowing access to a diverse range of LLMs with optimized model selection based on latency, quality, and cost.

ChenJieting commented 1 month ago

Hi @KatoStevenMubiru, thanks for reaching out! In addition to flex flow, you can also consider making it a custom tool following this guidance. Then you can submit a PR to this repo to contribute a tool doc and contribute a sample flow. For example: a tool doc, a sample flow for tool. Please feel free to contact me if you need any help with the contribution process. :)

KatoStevenMubiru commented 1 month ago

Okay, thank you both @ChenJieting and @0mza987 for the help. Let's make an initial implementation and submit a pull request.

0mza987 commented 1 month ago

@KatoStevenMubiru Just to clarify, there are now 3 different options to integrate Unify features according to the above discussion:

  1. Add a flex flow sample: This is the approach I recommended; it requires much less work, since it is just a usage sample.
  2. Add a custom tool: Mentioned by Jieting; this requires creating a separate tool package, and guidance is already provided.
  3. Add a unify.py module in the tools directory: This is not recommended. The tools directory only contains built-in tools; all 3rd-party tools need to be custom tools. Not to mention this would add the unify package as a dependency of promptflow.

You can choose implementation 1 or 2. Please make sure you are not implementing option 3; otherwise we may not be able to merge it.

Thanks.
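For a sense of the work each option involves, option 1 can be as small as a plain callable class, which is the shape a flex flow takes. This is a rough sketch; `FakeUnifyClient` is a stub standing in for the real client (which needs an API key), and the default endpoint string is only an example:

```python
class FakeUnifyClient:
    """Stub; the real client would call Unify's hosted API."""
    def generate(self, prompt: str, endpoint: str) -> str:
        return f"[{endpoint}] {prompt}"

class UnifyFlow:
    """A flex-flow-style entry point: just a callable Python class."""
    def __init__(self, client=None):
        # Inject a client so tests can run without network access.
        self.client = client or FakeUnifyClient()

    def __call__(self, text: str, endpoint: str = "gpt-4o@openai") -> dict:
        answer = self.client.generate(text, endpoint)
        return {"answer": answer, "endpoint": endpoint}
```

Because the class has no hard promptflow dependency at definition time, it stays easy to unit-test, and promptflow can still drive it as a flow entry.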

KatoStevenMubiru commented 1 month ago

Okay @0mza987, the last unify.py option is out; we will focus on the first two options.

Thank you

KatoStevenMubiru commented 1 month ago

Hi @0mza987 and @ChenJieting,

I wanted to share a suggested integration plan I've outlined for bringing Unify into Promptflow, complete with specific file names. Could you please confirm if this aligns with your expectations?

  1. Develop Integration: We'll generate the unify_llm_tool package template and implement functions in unify_llm_tool.py.
  2. Add Tests: We'll update test_unify_llm_tool.py to ensure comprehensive testing.
  3. Documentation: We're planning to enhance the README.md and add detailed docstrings in unify_llm_tool.py.
  4. Create Example Flow: We'll set up the unify_flow_example directory, including files like flow.dag.yaml and optimize_llm.py.
  5. Development Scripts: We'll follow the setup instructions closely and ensure all linting and formatting checks pass with tools like flake8 and black.
  6. PR and Feedback: We'll submit a PR, actively engage with feedback, and make the necessary adjustments.
  7. Final Checks: We'll ensure that all CI checks are passing and that we've addressed all the feedback.

Project structure:

unify-promptflow-integration/
├── src/
│   └── unify_llm_tool/
│       ├── __init__.py
│       ├── unify_llm_tool.py
│       ├── utils.py
│       └── yamls/
│           └── unify_llm_tool.yaml
├── tests/
│   ├── __init__.py
│   └── test_unify_llm_tool.py
├── examples/
│   └── unify_flow_example/
│       ├── flow.dag.yaml
│       └── optimize_llm.py
├── docs/
│   └── README.md
├── .github/
│   └── workflows/
│       └── ci.yml
├── .gitignore
├── setup.py
├── requirements.txt
├── MANIFEST.in
├── LICENSE
└── README.md

Do these steps look good to you, or is there anything else we should consider?

Thanks
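As a concrete starting point for step 2 of the plan above, the planned test module could begin with something like this. Since unify_llm_tool.py does not exist yet, `optimize_llm` here is a placeholder written for illustration; the real tests would import it instead:

```python
import unittest

def optimize_llm(prompt: str, constraints: dict) -> str:
    # Placeholder standing in for the planned unify_llm_tool.optimize_llm;
    # the real implementation would call the Unify client.
    if not prompt:
        raise ValueError("prompt must be non-empty")
    return f"response under {sorted(constraints)}"

class TestUnifyLlmTool(unittest.TestCase):
    def test_returns_string(self):
        self.assertIsInstance(optimize_llm("hi", {"min_quality": 0.7}), str)

    def test_rejects_empty_prompt(self):
        with self.assertRaises(ValueError):
            optimize_llm("", {})
```

Starting with input-validation and return-type tests keeps the suite runnable before any live Unify calls are wired in.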

0mza987 commented 1 month ago

Hi @KatoStevenMubiru,

For the tool source code, you can create a new repo on github and develop it like any other project (public or private, both are fine).

Then you can submit a PR to our repo to add tool document and flow examples, please refer to:

ChenJieting commented 1 month ago

Hi @KatoStevenMubiru, I agree with @0mza987. For the tool contribution, we'd suggest submitting a PR to add/update the documentation following the structure we have built:

  1. Add a new page for your tool documentation to this directory, for example: azure-ai-language-tool document.
  2. Update the custom tool index table by adding your tool package meta info.
  3. Add the sample flow for your tool to this directory, following the Readme guidance. For example: azure-ai-language-tool sample flows.

Thank you!

KatoStevenMubiru commented 1 month ago

Okay, thank you @ChenJieting and @0mza987, I have made an implementation. I will submit our repo once everything is set.

Thank you, I truly appreciate your advice and time.

RiddhiJivani commented 3 weeks ago

I've addressed your comments in the PR. Please review at your convenience. @0mza987 @ChenJieting

0mza987 commented 3 weeks ago

@RiddhiJivani I have merged your PR, thanks for your effort.