-
System Info
GPU: NVIDIA RTX 4090
TensorRT-LLM 0.13
Question 1: How can I use the OpenAI-compatible API to perform inference on a TensorRT engine model?
root@docker-desktop:/llm/tensorrt-llm-0.13.0/examples/apps# pyt…
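For the OpenAI-style API question above, a minimal client sketch. It assumes an OpenAI-compatible server (such as the one shipped in `examples/apps`) is already serving the engine; the host, port, and model name are placeholders, not values from the issue.

```python
# Hedged sketch: query a TensorRT-LLM engine through an OpenAI-compatible
# HTTP server. The base URL and model name below are placeholders -- use
# whatever the server in examples/apps reports when it starts.
import json
import urllib.request


def build_chat_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def post_chat(base_url: str, payload: dict) -> dict:
    """POST the payload to the server's /v1/chat/completions endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires the server to be running):
# print(post_chat("http://localhost:8000",
#                 build_chat_request("tensorrt_llm_engine", "Hello!")))
```

The generated text is then read from `response["choices"][0]["message"]["content"]`, the standard OpenAI response shape.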
-
It looks like a file is missing.
Unrecognized model in D:\LIUGEGE\ComfyUI\models\Joy_caption_alpha\text_model. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: albert, a…
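The error above means the checkpoint's `config.json` lacks a `model_type` key, so `transformers` cannot infer the architecture. A small helper that adds the key is sketched below; the correct value depends on the checkpoint's actual architecture, so `"llama"` in the usage comment is only a placeholder — check the original model card before writing it.

```python
# Hedged sketch: add a missing "model_type" key to a checkpoint's
# config.json. The value to write is NOT guessed here -- the caller must
# supply the architecture name from the original model card.
import json
from pathlib import Path


def ensure_model_type(config_path: str, model_type: str) -> dict:
    """Add "model_type" to config.json if it is missing, then save."""
    path = Path(config_path)
    cfg = json.loads(path.read_text(encoding="utf-8"))
    cfg.setdefault("model_type", model_type)  # keeps an existing value intact
    path.write_text(json.dumps(cfg, indent=2), encoding="utf-8")
    return cfg


# Example (placeholder path and architecture):
# ensure_model_type(r"D:\...\text_model\config.json", "llama")
```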
-
### System Info
```shell
optimum==1.23.1
transformers==4.43.4
onnxruntime-gpu==1.19.2
sentence-transformers==3.2.0
Windows
Python 3.11.6
```
### Who can help?
@michaelbenayoun
…
-
This issue will be used to track compilation failures for migraphx models on CPU and GPU. Compile failures for each model should have a link to an issue with a smaller reproducer in the notes column.
…
-
Hi, I was wondering how I can use AraBERT [bert-large-arabertv02-twitter] for simple dialect classification. I've already downloaded the model.
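A minimal sketch of one way to set this up: the checkpoint is a base encoder, so dialect classification needs a classification head attached and then fine-tuning on labelled data. The dialect label set below is illustrative, not from the issue, and the default model path assumes the standard Hub id.

```python
# Hedged sketch: put a sequence-classification head on top of the AraBERT
# encoder. DIALECTS is an example label set, not an established taxonomy.
DIALECTS = ["egyptian", "gulf", "levantine", "maghrebi", "msa"]


def label_maps(labels):
    """Build the id2label / label2id maps a classification head expects."""
    id2label = dict(enumerate(labels))
    label2id = {name: i for i, name in id2label.items()}
    return id2label, label2id


def load_classifier(model_dir="aubmindlab/bert-large-arabertv02-twitter"):
    """Load tokenizer + encoder and attach a fresh (untrained) head."""
    # Heavy import kept local so the helpers above stay dependency-free.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    id2label, label2id = label_maps(DIALECTS)
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_dir,
        num_labels=len(DIALECTS),
        id2label=id2label,
        label2id=label2id,
    )
    return tokenizer, model
```

The head starts with random weights, so the model must still be fine-tuned on dialect-labelled examples before its predictions mean anything.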
-
Hi, I have recently raised the issue [here](https://github.com/wietsedv/bertje/issues/38#issue-1929534625), which you can refer to. I was basically trying to use an external model to try and get the bert_…
-
Enable support for the BERT-Tiny model using TTNN ops and port the functionality to the n300 card (single device).
-
Document how to fine-tune a pre-trained BERT model for text classification.
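An outline of the recipe such a document would cover, sketched with the `transformers` Trainer API: tokenize a labelled dataset, attach a classification head, train, evaluate. The checkpoint and dataset names are placeholders for whatever the doc settles on.

```python
# Hedged outline of BERT fine-tuning for text classification.
# "bert-base-uncased" and "imdb" are placeholder names.
def compute_metrics(eval_pred):
    """Accuracy in plain Python (no numpy needed for the sketch)."""
    logits, labels = eval_pred
    preds = [max(range(len(row)), key=row.__getitem__) for row in logits]
    return {"accuracy": sum(p == y for p, y in zip(preds, labels)) / len(labels)}


def finetune():
    # Heavy imports kept local so compute_metrics stays dependency-free.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)
    ds = load_dataset("imdb").map(  # placeholder binary-label dataset
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
        batched=True)
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="bert-clf",
                               num_train_epochs=1,
                               per_device_train_batch_size=16),
        train_dataset=ds["train"],
        eval_dataset=ds["test"],
        compute_metrics=compute_metrics,
    )
    trainer.train()
    return trainer.evaluate()
```

The doc would still need to cover the practical choices this sketch hard-codes: sequence length, batch size, number of epochs, and how the label columns are named in the dataset.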
-
First of all, thanks for sharing your code. I have successfully reproduced your work on ResNet and DenseNet, but I have run into some trouble reproducing BERT.
1. I created the conda environment based on requirmen…
-
To train an MM-Grounding-DINO, we need to load two pre-trained models: BERT and Swin.
To fine-tune an MM-Grounding-DINO on my dataset, I need to load a pre-trained MM_Grounding_DINO and the con…
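One way this usually looks in an MMDetection-style config is sketched below. The base-config name, checkpoint paths, and file names are all placeholders for local copies, and the exact keys should be checked against the mm_grounding_dino configs shipped with mmdetection.

```python
# Hedged sketch of a fine-tuning config override, assuming MMDetection's
# mm_grounding_dino config layout. Every path below is a placeholder.
_base_ = 'grounding_dino_swin-t_pretrain_obj365.py'  # placeholder base config

# Full pre-trained MM-Grounding-DINO weights to fine-tune from:
load_from = 'checkpoints/mm_grounding_dino_swin-t_pretrain.pth'

model = dict(
    # Point the text branch at a locally downloaded BERT checkpoint:
    language_model=dict(name='checkpoints/bert-base-uncased'),
    # Point the image backbone at locally downloaded Swin weights:
    backbone=dict(
        init_cfg=dict(
            type='Pretrained',
            checkpoint='checkpoints/swin_tiny_patch4_window7_224.pth')),
)
```

Note that `load_from` restores the whole detector, so the BERT/Swin init paths matter mainly when training from scratch or when the full checkpoint is not available locally.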