google-ai-edge / mediapipe

Cross-platform, customizable ML solutions for live and streaming media.
https://ai.google.dev/edge/mediapipe
Apache License 2.0

Using a local model file fails with "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin" #5597

Open akau16 opened 2 months ago

akau16 commented 2 months ago

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

None

OS Platform and Distribution

Firebase Hosting

MediaPipe Tasks SDK version

No response

Task name (e.g. Image classification, Gesture recognition etc.)

LLM Inference (JavaScript)

Programming Language and version (e.g. C++, Python, Java)

HTML, JavaScript

Describe the actual behavior

Cannot access the local model file

Describe the expected behaviour

Can access the model file

Standalone code/steps you may have used to try to get what you need

When I run llm_inference on localhost, it can access a model file such as "gemma-2b-it-gpu-int4.bin" placed in the project folder. But when I run llm_inference on Firebase Hosting, it cannot access the on-device model file and shows "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin".
When I looked this up, I found the following explanation: "In standard HTML and JavaScript, it is not possible to directly read files at a specific path on the local machine. This is due to browser security restrictions designed to protect user privacy and prevent malicious websites from automatically accessing the local file system."

However, when I tried your sample in MediaPipe Studio (https://mediapipe-studio.webapps.google.com/studio/demo/llm_inference), I could click "Choose a model file", select a model file from my device, and run it successfully. How is it able to do that? Thank you!
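For context: a page cannot open a hard-coded local path, but the File API does let it read a file the user explicitly picks through an `<input type="file">` element, without the page ever seeing the file's path. This is a pattern consistent with what Studio's "Choose a model file" button does, though the sketch below is only an illustration, not Studio's actual code. It assumes `modelAssetPath` also accepts a `blob:` object URL (fetch can read those); the `#model-file` input id and the CDN wasm path are illustrative:

```js
import { FilesetResolver, LlmInference } from '@mediapipe/tasks-genai';

// The File API exposes the bytes of a user-picked file; no local path is
// ever revealed to the page, which is why the browser allows it.
const input = document.querySelector('#model-file'); // <input type="file" id="model-file">
input.addEventListener('change', async () => {
  const file = input.files[0];
  if (!file) return;

  // Wrap the picked file in a blob: URL that the task's loader can fetch.
  const modelUrl = URL.createObjectURL(file);

  const genaiFileset = await FilesetResolver.forGenAiTasks(
    'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai/wasm'
  );
  const llm = await LlmInference.createFromOptions(genaiFileset, {
    baseOptions: { modelAssetPath: modelUrl },
  });
  console.log(await llm.generateResponse('Hello!'));
});
```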

Other info / Complete Logs

No response

kuaashish commented 2 months ago

Hi @akau16,

Could you please review the Stack Overflow thread https://stackoverflow.com/questions/5074680/chrome-safari-errornot-allowed-to-load-local-resource-file-d-css-style and try the suggested solution? Let us know if you still need further assistance.
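The takeaway from that thread is that a deployed page can only fetch resources over http(s), never from a visitor's local disk, so the model needs to be uploaded somewhere the page can reach, for example alongside the site itself. A minimal sketch, assuming the model file was copied into the site's deployed assets (the assets/ path is illustrative):

```js
// Sketch: reference the deployed copy of the model by URL. A relative path
// is resolved against the site's origin, so once deployed this fetches
// https://<your-site>/assets/gemma-2b-it-gpu-int4.bin.
const llm = await LlmInference.createFromOptions(genaiFileset, {
  baseOptions: { modelAssetPath: 'assets/gemma-2b-it-gpu-int4.bin' },
});
```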

Thank you!!

akau16 commented 2 months ago

Hi @kuaashish,

Thanks for your kind reply. I think my problem is a little different from that one. Below is my code:

```js
LlmInference.createFromOptions(genaiFileset, {
  baseOptions: { modelAssetPath: 'D:/model/gemma-2b-it-gpu-int4.bin' },
  // ... (remaining options elided)
});
```

and it shows the error "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin". How can I resolve this problem? Thank you!
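For reference, a route closer to the Studio-style file picker shown earlier is to skip paths and URLs entirely and hand the SDK the model bytes from the user-picked file. A minimal sketch, assuming the installed @mediapipe/tasks-genai version's baseOptions accepts modelAssetBuffer (verify against the package typings); the #model-file input is the illustrative one from the earlier sketch:

```js
// Sketch: read the user-picked file into memory and pass the raw bytes.
// Assumes baseOptions.modelAssetBuffer (Uint8Array) exists in the installed
// @mediapipe/tasks-genai version; check its typings first.
// (Run inside an async function.)
const file = document.querySelector('#model-file').files[0];
const bytes = new Uint8Array(await file.arrayBuffer());

const llm = await LlmInference.createFromOptions(genaiFileset, {
  baseOptions: { modelAssetBuffer: bytes },
});
```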