Closed Ana0112 closed 2 weeks ago
Hi, @Ana0112! Thanks for your positive feedback!
MiniSearch used transformers.js (v2) in its early days, but when Wllama appeared, it proved more performant and easier to maintain. And on the GPU side, I noticed that the WebLLM format consumes less memory than ONNX on transformers.js (v3).
For this reason, there are currently no plans to bring transformers.js back as an inference engine, but I liked the suggestion of using it for other purposes, such as speech-to-text searches.
Could you share more about other usages you thought of for transformers.js in this project?
Thank you!
Maybe use image models to let users do image search?
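To make the image-search idea concrete, here is a minimal hedged sketch of how it might look with transformers.js. The package name, model, and function are illustrative assumptions, not anything MiniSearch ships today:

```javascript
// Hypothetical sketch: matching an image against text labels with
// transformers.js CLIP. Not part of MiniSearch; names are assumptions.
async function classifyImage(imageUrl, candidateLabels) {
  // Dynamic import so the dependency only loads when the feature is used.
  const { pipeline } = await import('@huggingface/transformers');

  // CLIP-based zero-shot classifier; runs in the browser via ONNX.
  const classifier = await pipeline(
    'zero-shot-image-classification',
    'Xenova/clip-vit-base-patch32'
  );

  // Returns the candidate labels ranked by similarity to the image.
  return classifier(imageUrl, candidateLabels);
}
```

The ranked labels could then be used as search terms, leaving the existing inference engines untouched.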
Is your feature request related to a problem? Please describe.
Hey, I tried your platform and was amazed to see that I could run Wllama and WebLLM models easily on my basic laptop. When I try running WebLLM through their original GitHub project, my browser usually becomes unresponsive.
It would be good to have integration with transformers.js. If, say, I want to search using voice, there are speech-to-text models in transformers.js.
Describe the solution you'd like
Integrate models from transformers.js, with or without WebGPU support.
Thank you.
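A rough sketch of what the voice-search piece could look like with transformers.js, assuming a Whisper checkpoint; the model name and helper function are illustrative, not an existing MiniSearch API:

```javascript
// Hypothetical voice-search helper built on transformers.js.
// The model and function names are assumptions for illustration.
async function transcribeSearchQuery(audio) {
  // Dynamic import keeps transformers.js out of the critical path.
  const { pipeline } = await import('@huggingface/transformers');

  // Whisper-based speech-to-text; the model is downloaded on first
  // use and cached by the browser.
  const transcriber = await pipeline(
    'automatic-speech-recognition',
    'Xenova/whisper-tiny.en'
  );

  const { text } = await transcriber(audio);
  return text.trim();
}
```

The transcript would simply be fed into the existing search box, so Wllama and WebLLM would remain the only inference engines.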
Describe alternatives you've considered
No response
Additional context
No response