google-ai-edge / mediapipe

Cross-platform, customizable ML solutions for live and streaming media.
https://ai.google.dev/edge/mediapipe
Apache License 2.0

How to set a system prompt for a RAG implementation using LLM Inference with Gemma 2B on iOS? #5277

Open omkar806 opened 7 months ago

omkar806 commented 7 months ago

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

None

OS Platform and Distribution

iOS

MediaPipe Tasks SDK version

No response

Task name (e.g. Image classification, Gesture recognition etc.)

LLM inference

Programming Language and version (e.g. C++, Python, Java)

SwiftUI

Describe the actual behavior

Currently we can load Gemma 2B on iOS and chat with it in general. But if we want to set a system prompt, for example "You will act as this agent or bot and your name is X", how can the user set this? There is only a function that generates a response from the user's query.

Describe the expected behaviour

We should be able to set a system prompt, for example "You will act as this agent or bot and your name is X", in addition to passing the user's query.

Standalone code/steps you may have used to try to get what you need

.

Other info / Complete Logs

No response

kuaashish commented 7 months ago

Hi @omkar806,

At present, this feature is unavailable. Should you wish to incorporate it, we kindly ask you to submit a feature request outlining the potential benefits for the community upon its implementation. Subsequently, we will review and forward your request to the appropriate team. Based on the ensuing discussion and demand, we can consider implementing it in the near future.

Thank you!!

omkar806 commented 7 months ago

Okay, I will add a feature request for this.

schmidt-sebastian commented 7 months ago

We are working on making it easier to build more advanced use cases on top of our LLM Inference API. That being said, you can already tell the model to act like an agent by simply including this instruction in the prompt you are using. We will have more examples for this in the coming weeks.
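For anyone looking for a concrete starting point, a minimal sketch of that workaround with the iOS LLM Inference API (LlmInference from MediaPipeTasksGenAI) could look like the following; the model file name, persona text, and query are placeholders rather than anything shipped with MediaPipe:

```swift
import MediaPipeTasksGenAI

do {
    // Placeholder model file; use whichever Gemma 2B variant is bundled with the app.
    guard let modelPath = Bundle.main.path(forResource: "gemma-2b-it-gpu-int4",
                                           ofType: "bin") else {
        fatalError("Model file not found in the app bundle")
    }

    let options = LlmInference.Options(modelPath: modelPath)
    let llmInference = try LlmInference(options: options)

    // There is no dedicated system-prompt parameter yet, so the instruction is
    // simply prepended to the user's query in the single prompt string.
    let systemInstruction = """
    You are a support agent named Hermes. Answer politely and stay in character.
    """
    let userQuery = "Hi, who are you?"

    let prompt = systemInstruction + "\n\n" + userQuery
    let response = try llmInference.generateResponse(inputText: prompt)
    print(response)
} catch {
    print("LLM inference failed: \(error)")
}
```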

ignoramous commented 3 months ago

That being said, you can already tell the model to act like an agent by simply including this instruction in the prompt you are using. We will have more examples for this in the coming weeks.

Hi: Has this guide been published?

talumbau commented 2 weeks ago

Hi, sorry, we don't have any new guides published on RAG yet, but we do plan to launch an example of a RAG pipeline with a Gemma model soon. I noticed that this issue specifically asks about system prompts, but obviously a full RAG example involves more than just prompting the model in a particular way. We hope to launch an example that shows information retrieval being fed to the model for inference, as is typical in RAG workflows.
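In the meantime, for anyone who wants to experiment before the official example lands, a rough sketch of such a pipeline on iOS could look like the code below. The keyword-overlap retriever, the documents, and the model path are purely illustrative placeholders; a real RAG setup would typically use an embedding model and a vector index for retrieval:

```swift
import MediaPipeTasksGenAI

// Toy in-memory "knowledge base"; stands in for a real document store.
let documents = [
    "Gemma 2B can run on-device through the MediaPipe LLM Inference API.",
    "On iOS the LLM Inference API ships as the MediaPipeTasksGenAI pod.",
    "RAG retrieves relevant passages and feeds them to the model alongside the query."
]

// Naive retrieval: rank documents by word overlap with the query.
// A production pipeline would use embeddings and a vector index instead.
func retrieve(query: String, from docs: [String], topK: Int = 2) -> [String] {
    let queryWords = Set(query.lowercased().split(separator: " ").map(String.init))
    return docs
        .map { doc in
            (doc, Set(doc.lowercased().split(separator: " ").map(String.init))
                .intersection(queryWords).count)
        }
        .sorted { $0.1 > $1.1 }
        .prefix(topK)
        .map { $0.0 }
}

do {
    let question = "How do I run Gemma 2B on iOS?"
    let context = retrieve(query: question, from: documents).joined(separator: "\n")

    // Retrieved context and the instruction are folded into one prompt string,
    // because generateResponse(inputText:) takes a single input.
    let prompt = """
    You are a documentation assistant. Answer using only the context below.

    Context:
    \(context)

    Question: \(question)
    """

    // Placeholder path; point this at the Gemma model bundled with your app.
    let options = LlmInference.Options(modelPath: "/path/to/gemma-2b-it.bin")
    let llmInference = try LlmInference(options: options)
    print(try llmInference.generateResponse(inputText: prompt))
} catch {
    print("RAG pipeline failed: \(error)")
}
```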