Open · bc opened this issue 8 months ago
Thanks for your awesome work here. I have a more architectural question: as deployment is a challenge for non-technical users, is it possible to wrap a llama model directly into the Chrome extension JS code? Or are there limitations on Chrome extension memory/storage/CPU that would make that difficult or impossible? Thanks!

Thank you for your contribution. We will check and reply to you as soon as possible.

@bc Thank you for checking out the project. There are a couple of limitations to bundling a llama model in the Chrome extension, memory/storage being the primary one. I also wanted to keep the extension light and working with the various models that Ollama supports. I did experiment with embedding models using https://github.com/xenova/transformers.js but ran into GPU/CPU/memory issues, so I relied on the Ollama API instead. Hope the explanation helps.
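For anyone landing here later, a minimal sketch contrasting the two approaches discussed above: fully in-browser inference with transformers.js versus delegating to a locally running Ollama server. The model names and the localhost port are assumptions (11434 is Ollama's default); this is illustrative, not the extension's actual code.

```typescript
import { pipeline } from "@xenova/transformers";

// Option A (assumed setup): fully in-browser inference with transformers.js.
// The ONNX model weights are downloaded and cached locally, and inference
// runs on the user's CPU/GPU -- the memory/compute pressure described above.
async function embedInBrowser(text: string): Promise<Float32Array> {
  const extractor = await pipeline(
    "feature-extraction",
    "Xenova/all-MiniLM-L6-v2" // a small embedding model, chosen for illustration
  );
  const output = await extractor(text, { pooling: "mean", normalize: true });
  return output.data as Float32Array;
}

// Option B (assumed setup): delegate to a locally running Ollama server via
// its REST API. The extension stays light; the heavy lifting happens in the
// Ollama process outside the browser.
async function embedWithOllama(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const { embedding } = await res.json();
  return embedding;
}
```

The trade-off follows from where the work happens: Option A keeps everything inside the extension but hits the memory/storage/CPU limits mentioned above, while Option B requires the user to install and run Ollama but keeps the extension small and model-agnostic.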