chienhuikuo opened 1 week ago
Hi @chienhuikuo,
Unfortunately, the documentation you are following is not applicable to Phi-3.5-mini, as we only support and have tested Phi-2. As a result, we may have limited ability to assist with this issue. However, we will mark this as a feature request and share it with our team to see if we can offer any support.
Thank you!!
Hi @kuaashish,
Thank you for your reply.
Since you mentioned that only Phi-2 is supported and tested, I tried to convert the Phi-2 TFLite model to a `.task` file following the sample code. However, I ran into an issue: I couldn't obtain the `tokenizer.model` file from Hugging Face, as the microsoft/phi-2 repository doesn't provide it.
I understand that we can use the `genai.converter` library to convert Phi-2 and obtain `phi2.bin`, but I would like to go through the process of creating the `.task` file myself. Could you please advise me on how to obtain `tokenizer.model`?
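For context, the flow I'm attempting looks roughly like this, a minimal sketch based on the conversion and bundling steps in the LLM Inference docs. The paths, the `<|endoftext|>` start/stop tokens, and the helper function names are my own placeholders, not tested values:

```python
# Sketch of the two-step Phi-2 workflow: (1) convert the Hugging Face
# checkpoint to a TFLite model, (2) bundle it with a tokenizer into a .task.
# The mediapipe import is guarded so this file can be inspected without the
# package installed; all paths/tokens below are placeholder assumptions.
try:
    from mediapipe.tasks.python.genai import bundler, converter
except ImportError:  # mediapipe may not be available in every environment
    bundler = converter = None


def convert_phi2_checkpoint(ckpt_dir: str, out_tflite: str) -> None:
    """Convert a local Phi-2 safetensors checkpoint to a TFLite model."""
    config = converter.ConversionConfig(
        input_ckpt=ckpt_dir,             # directory holding the checkpoint
        ckpt_format="safetensors",
        model_type="PHI_2",
        backend="gpu",                   # or "cpu"
        output_dir="/tmp/phi2_intermediate/",
        combine_file_only=False,
        vocab_model_file=ckpt_dir,       # tokenizer files read from here
        output_tflite_file=out_tflite,   # e.g. "phi2.bin"
    )
    converter.convert_checkpoint(config)


def bundle_task(tflite_model: str, tokenizer_model: str, out_task: str) -> None:
    """Pack the TFLite model and a SentencePiece tokenizer.model into a .task."""
    config = bundler.BundleConfig(
        tflite_model=tflite_model,
        tokenizer_model=tokenizer_model,      # the missing tokenizer.model file
        start_token="<|endoftext|>",          # assumed Phi-2 special token
        stop_tokens=["<|endoftext|>"],
        output_filename=out_task,             # e.g. "phi2.task"
        enable_bytes_to_unicode_mapping=True,
    )
    bundler.create_bundle(config)
```

As far as I can tell, the bundler's `tokenizer_model` argument expects a SentencePiece `tokenizer.model` file, while the microsoft/phi-2 repo ships a BPE tokenizer (`tokenizer.json`/`vocab.json`) instead, which may be why the file seems to be missing.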
MediaPipe Solution (you are using): LLM Inference
Programming language: Android Java/Kotlin
Are you willing to contribute it: None
Describe the feature and the current behaviour/state: Would it be possible to provide the `phi-3.task` or `phi-3.5.task` file for download directly on Kaggle?
Will this change the current API? How?: No response
Who will benefit from this feature?: No response
Please specify the use cases for this feature: To use phi-3/3.5 in an easy way.
Any other info: I followed the documentation to create the `phi-3.5-mini.task` file, but when I built the app and entered a prompt on my Android phone, I encountered the following error:

I sought assistance in the ai-edge-torch repo, but it didn't resolve the issue: https://github.com/google-ai-edge/ai-edge-torch/issues/293