-
I am currently working on fine-tuning a LiLT model using the tutorial notebook titled "[LiLT/Fine_tune_LiLT_on_a_custom_dataset,_in_any_language.ipynb](https://github.com/NielsRogge/Transformers-Tutor…
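As background for that tutorial: LiLT, like the LayoutLM family, expects each word's bounding box normalized into a 0–1000 coordinate space. A minimal sketch of that preprocessing step is below; the helper name is hypothetical and not taken from the notebook itself.

```python
def normalize_bbox(bbox, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box into the 0-1000 space
    that LiLT/LayoutLM-style models expect.  Hypothetical helper,
    not code from the tutorial notebook."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# Example: a word box on an 850x1100-pixel page
print(normalize_bbox((85, 110, 170, 220), 850, 1100))  # → [100, 100, 200, 200]
```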
-
Hey!
I am trying to follow this guide: https://huggingface.co/docs/optimum-neuron/tutorials/fine_tune_bert and fine-tune BERT on a trn1.2xlarge instance. I set up the datasets as mentioned in the bl…
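For context, the dataset-preparation step that guide describes (tokenize, then pad every example to a fixed length with a matching attention mask) boils down to something like the plain-Python sketch below. The function name and pad id are placeholders, not optimum-neuron API.

```python
def pad_batch(token_id_seqs, max_length, pad_id=0):
    """Truncate/pad each token-id sequence to max_length and build the
    matching attention masks (1 = real token, 0 = padding).
    Illustrative only; real pipelines use the tokenizer's padding."""
    input_ids, attention_masks = [], []
    for seq in token_id_seqs:
        seq = seq[:max_length]
        pad_len = max_length - len(seq)
        input_ids.append(seq + [pad_id] * pad_len)
        attention_masks.append([1] * len(seq) + [0] * pad_len)
    return input_ids, attention_masks

ids, masks = pad_batch([[101, 2023, 102], [101, 102]], max_length=4)
print(ids)    # → [[101, 2023, 102, 0], [101, 102, 0, 0]]
print(masks)  # → [[1, 1, 1, 0], [1, 1, 0, 0]]
```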
-
1. Take notes.
2. Search and learn some terminologies.
3. List questions that I want to ask.
-
Hi friends! 👋
There are a lot of cool existing resources for how to do *x* with *x* model, and we’d like to showcase and aggregate these resources on a model’s documentation. This’ll help users see…
-
Hi,
Thanks for the codebase.
I have a question about coCondenser-marco fine-tuning command [here](https://github.com/texttron/tevatron/tree/main/examples/coCondenser-marco#fine-tuning-stage-1)…
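For readers unfamiliar with what that fine-tuning stage produces: coCondenser-style dense retrievers score a passage by the similarity (typically a dot product) between the query embedding and the passage embedding. A toy numpy illustration of that scoring, not tevatron code:

```python
import numpy as np

def rank_passages(query_emb, passage_embs):
    """Rank passages by dot-product similarity with the query embedding,
    as dense retrievers like coCondenser do at inference time."""
    scores = passage_embs @ query_emb
    order = np.argsort(-scores)  # highest score first
    return order, scores

query = np.array([1.0, 0.0, 1.0])
passages = np.array([[0.9, 0.1, 0.8],    # topically close to the query
                     [0.0, 1.0, 0.1]])   # off-topic
order, scores = rank_passages(query, passages)
print(order.tolist())  # → [0, 1]
```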
-
Does TorchSharp have any examples of fine-tuning BERT? How can I load the PyTorch version of BERT?
-
# Stable Diffusion
>Stable Diffusion is a very practical AI image-generation tool; being free, open source, and efficient, it opens up more possibilities for users.
>[Stability-AI/stablediffusion: High-Resolution Image Synthesis with Latent Diffusion Models (github.com)](https://github…
-
Is this training pipeline fine-tuning pretrained SwinT weights or training from scratch? Testing the new weights reveals issues: they fail to recognize general objects and have incomplete detection o…
-
Are models trained on Kraken 4.x compatible with Kraken 5.x? Does this have anything to do with the new line extractor?
```
(kraken-5.2.0) incognito@DESKTOP-NHKR7QL:~$ kraken -i 159.jpg lines.json segmen…
```
-
This is not a proper feature request; rather, I need guidance on building our customized model using BertSentenceEmbedding, which would be built on top of a pretrained model (for example, small_bert_L2_128); I…
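As background for building on top of sentence embeddings: a common way to turn per-token BERT outputs into a single sentence vector is mean pooling over the non-padding tokens. A numpy sketch under that assumption (not the actual BertSentenceEmbedding implementation):

```python
import numpy as np

def mean_pool(token_embs, attention_mask):
    """Average the token embeddings where attention_mask == 1,
    yielding one fixed-size sentence-level embedding."""
    mask = np.asarray(attention_mask, dtype=float)[:, None]
    return (np.asarray(token_embs) * mask).sum(axis=0) / mask.sum()

tokens = [[1.0, 2.0], [3.0, 4.0], [0.0, 0.0]]  # last row is padding
print(mean_pool(tokens, [1, 1, 0]).tolist())   # → [2.0, 3.0]
```

A downstream classifier can then be trained on these pooled vectors instead of the raw token outputs.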