-
Is it possible to convert quantized models? I tried, and got this error: ```WARNING: Error converting initializer: Unsupported tensor data type uint8 for operator None```
-
I've been able to use the TensorRT node for a few things, and it works really well for speeding up the upscaling process. Would it be possible to make it compatible with the DAT and HAT formats as well? C…
-
### What happened?
Converting the following models https://github.com/iree-gd/iree.zoo/blob/main/.github/workflows/multiple-models.yml#L10 (31 models so far) from **kaggle** with version `20240812.…
-
# Summary
Review the following model files to determine what needs to be converted into valkyrie resources. When you pick up a model, please convert it in its entirety.
~`rails generate hyrax:work_reso…
-
I want to convert Hugging Face models to a GGUF file, but I kept running into errors when following the README.md. Then I found that the files in the llama.cpp repo have been updated:
python download_huggingface.py --mod…
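For reference, the conversion scripts in llama.cpp have been renamed over time; on recent checkouts the HF-to-GGUF flow is roughly the following (the model directory and output names are placeholders):

```shell
# Recent llama.cpp checkouts ship convert_hf_to_gguf.py (underscores),
# replacing the older convert-hf-to-gguf.py / convert.py scripts.
pip install -r llama.cpp/requirements.txt
python llama.cpp/convert_hf_to_gguf.py ./my-hf-model \
    --outfile my-model.gguf --outtype f16
```

If the README still references the old script names, following the script's own `--help` output is the safest guide.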
-
How can I convert models to the bmodel format? Could you provide a reference example? Thanks.
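For Sophgo chips, bmodel files are typically produced with the tpu-mlir toolchain. A rough sketch of its two-step flow follows; the script names come from tpu-mlir, but every path, model name, and flag value here is illustrative and should be checked against the tpu-mlir documentation:

```shell
# Sketch of the tpu-mlir flow: ONNX -> MLIR -> bmodel.
# "demo", "model.onnx", and "bm1684x" are placeholder values.
model_transform.py \
    --model_name demo \
    --model_def model.onnx \
    --mlir demo.mlir
model_deploy.py \
    --mlir demo.mlir \
    --quantize F16 \
    --chip bm1684x \
    --model demo.bmodel
```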
-
The following is my code; the issue occurs in the model-loading section:
from rknn.api import RKNN

if __name__ == '__main__':
    # Select the target device
    target = 'rv1126'
    # Create the RKNN object
    …
-
Package Version
------------------------ ------------------
certifi 2024.7.4
charset-normalizer 3.3.2
filelock 3.15.4
fsspec …
-
Now that we are using SQLAlchemy 2.0, we should begin migrating away from the deprecated SQLAlchemy 1.x patterns and toward the new SQLAlchemy 2.x style. To start, the `…