-
Would you consider supporting [Transformers.js](https://github.com/huggingface/transformers.js) as an additional provider? This library provides access to text generation and content classification fu…
-
Hey folks,
this package is absolutely awesome! I'm always on the lookout for performant small models, so this is a goldmine for me.
I have some questions/possible feature ideas for getting static …
— do-me, updated 4 weeks ago
-
Can this model be memory-optimized via ONNX quantization?
Great improvements! 👍
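For a back-of-the-envelope view of what ONNX quantization buys in memory, the weight footprint scales with bytes per parameter. The helper below is a hypothetical illustration (not part of transformers.js); the byte widths are the standard ones for fp32/fp16/int8/4-bit exports.

```javascript
// Rough weight-memory estimate at different ONNX export dtypes.
// Hypothetical helper for illustration; byte widths per parameter:
// fp32 = 4, fp16 = 2, q8 = 1, q4 = 0.5.
const BYTES_PER_PARAM = { fp32: 4, fp16: 2, q8: 1, q4: 0.5 };

function weightMemoryMB(numParams, dtype) {
  const bytes = BYTES_PER_PARAM[dtype];
  if (bytes === undefined) throw new Error(`unknown dtype: ${dtype}`);
  return (numParams * bytes) / (1024 * 1024);
}
```

For a BERT-base-sized model (~110M parameters), this puts fp32 weights around 420 MB and q8 around 105 MB, i.e. roughly a 4x reduction before accounting for activations and runtime overhead.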
-
**Is your feature request related to a problem? Please describe.**
Yes, the library needs `sharp` support.
**Describe the solution you'd like**
The possibility of using:
```javascript
import { pipeline …
-
**What problem did you run into without this feature (improvement)?**
Setting up a locally running translation model.
**How would you like to solve it?**
Configure it directly via the transformers.js or Ollama API.
**If the desired solution can't be implemented, what alternative would you accept?**
**Additional information**
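The request above asks to configure a locally running translation backend via transformers.js or the Ollama API. A minimal resolver sketch is below; the provider names, default model ids, and endpoint are assumptions for illustration, not this app's actual config schema (the Ollama default of `http://localhost:11434/api/generate` is its documented local endpoint).

```javascript
// Sketch: resolve a user-facing provider setting to a concrete backend.
// Defaults (model ids, endpoint) are placeholders for illustration.
function resolveTranslationBackend(config) {
  switch (config.provider) {
    case 'transformers.js':
      // Runs fully in-process; a Hub model id is all that's needed.
      return { kind: 'local', model: config.model ?? 'Xenova/opus-mt-zh-en' };
    case 'ollama':
      // Talks to a locally running Ollama server over its HTTP API.
      return {
        kind: 'http',
        endpoint: config.endpoint ?? 'http://localhost:11434/api/generate',
        model: config.model ?? 'qwen2.5',
      };
    default:
      throw new Error(`unsupported provider: ${config.provider}`);
  }
}
```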
-
### System Info
When using `AutoModel.from_pretrained(modelName)`, it fails with the error message `Error: Could not locate file: "https://huggingface.co/google-bert/bert-base-uncased/resolve/main/onnx/model.onnx"`,…
-
Has anyone been able to achieve this successfully?
I've tried saving my model in GGUF format with:
```
model.save_pretrained_gguf("model_gguf", tokenizer)
```
and then converting it to onn…
-
The app currently uses only the OpenAI API to generate the storybook stories, so users must add an OpenAI API key to use it.
It would be useful to have a free option like Transformers.js or WebLLM to generat…
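One way to layer this in is a simple provider fallback: use OpenAI when a key is present, otherwise a free in-browser backend. The sketch below is an assumption about how the app could route requests, not its actual code; the provider names are placeholders for whatever transformers.js / WebLLM wrappers it would ship.

```javascript
// Sketch: pick a story-generation provider. OpenAI when a key exists,
// otherwise a free local backend. WebLLM requires WebGPU; transformers.js
// can fall back to WASM on CPU, so it is the last resort.
function pickStoryProvider({ openaiKey, webgpuAvailable } = {}) {
  if (openaiKey) return 'openai';
  return webgpuAvailable ? 'webllm' : 'transformers.js';
}
```

This keeps the paid path unchanged while giving key-less users a working, if slower, local option.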
-
### Feature request
Integrate `ExecuTorch` as a new backend to transformers.js, enabling an new ["Export to ExecuTorch"](https://github.com/huggingface/transformers/issues/3225…
-
`@huggingface/transformers` has been announced as available on `NPM`; we might try to upgrade to it, since we are currently using `transformers-branch:v3-one_commit` to leverage `webgpu` support.
#### Some no…
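The WebGPU support mentioned in this upgrade can be feature-detected before choosing a device. A minimal sketch, assuming the v3-style `device` option on `pipeline(...)`; the `navigatorLike` parameter is injected so the logic is testable outside a browser.

```javascript
// Sketch: prefer WebGPU when the browser exposes navigator.gpu,
// otherwise fall back to the WASM backend.
function pickDevice(navigatorLike) {
  return navigatorLike && 'gpu' in navigatorLike ? 'webgpu' : 'wasm';
}

// In the app this would feed the pipeline options, e.g.:
// const extractor = await pipeline('feature-extraction', modelId, {
//   device: pickDevice(navigator),
// });
```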