-
Should I fine-tune the whole TDBN or just the CRBM+LogReg?
At the moment (13 April 2016), I am trying to put together all my models.
In other words, until now I have trained RBMs and used them to generate a new dataset (train,…
-
I'd like to ask: for code such as pruning and fine-tuning, the current scripts do not report the model's ASR after the defense is applied, correct?
Or is there a tool I can use for that?
![image](https://github.com/user-attachments/assets/8f551b7f-048a-403b-bf4b-9f04ea738f25)
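(For context: ASR here presumably stands for attack success rate, i.e. the fraction of trigger-stamped inputs that the defended model still classifies as the attacker's target label. A minimal sketch of that metric, with all names and numbers invented for illustration and not taken from this repository:)

```python
# Hypothetical sketch of computing the attack success rate (ASR) after a
# backdoor defense such as pruning or fine-tuning: the fraction of
# trigger-stamped inputs the defended model still classifies as the
# attacker's target label. All names and numbers are invented.

def attack_success_rate(predictions, target_label):
    """ASR = (# triggered inputs predicted as target) / (# triggered inputs)."""
    hits = sum(1 for p in predictions if p == target_label)
    return hits / len(predictions)

# Predictions of a defended model on 5 trigger-stamped inputs,
# where the backdoor's target label is 7.
preds = [7, 3, 7, 1, 0]
print(attack_success_rate(preds, target_label=7))  # 0.4
```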
-
Trying to use the `generate_qa_embedding_pairs` method to create synthetic data.
`from llama_index.finetuning import generate_qa_embedding_pairs`
I run into an error:
```
---------------------…
-
Hello everyone, below is my code for fine-tuning XTTS for a new language. It works well in my case with over 100 hours of audio.
https://github.com/nguyenhoanganh2002/XTTSv2-Finetuning-for-New-Lang…
-
I get the following error when fine-tuning DocOwl1.5-Omni. It always raises an error when the index is 10. Please help!
```
File "/opt/conda/envs/mplug_owl2/lib/python3.10/site-packages/deepspeed/run…
-
Hi, thanks for sharing this wonderful work. Since you use multi-frame multi-view inputs during the pretraining stage, I want to know whether you still use the temporal multi-frame inputs during fi…
-
Hi there,
First of all, thanks for such awesome work. I tried it on my custom use case and it gave me awesome results.
But only 2 classes are missing; I also tried using different custom labels, as t…
-
# Title of the Talk: No Code SLM Finetuning with MonsterAPI
## Abstract of the Talk:
Dive into the world of no-code large language model (LLM) finetuning in this informative talk presented by Mons…
-
Good job! Do you have plans to support LoRA or other PEFT methods?
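(For readers unfamiliar with LoRA: it freezes the base weight matrix and learns a low-rank additive update, so only a small number of parameters are trained. The following is a minimal, dependency-free sketch of the idea; the names, shapes, and values are my own for illustration, not this project's API:)

```python
# Minimal LoRA sketch: the frozen weight W gets a trainable low-rank
# update A @ B scaled by alpha / r, so the effective weight is
# W + (alpha / r) * (A @ B). Only A and B would be trained.

def matmul(X, Y):
    """Multiply two matrices given as lists of lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def add(X, Y):
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def scale(X, s):
    return [[s * a for a in row] for row in X]

def lora_forward(x, W, A, B, alpha, r):
    """y = x @ (W + (alpha / r) * A @ B); W stays frozen."""
    delta = scale(matmul(A, B), alpha / r)
    return matmul(x, add(W, delta))

# Toy shapes: input dim 2, output dim 2, rank r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight (identity here)
A = [[0.5], [0.5]]             # down-projection, d x r
B = [[0.0, 0.0]]               # up-projection, r x d, initialized to zero
x = [[2.0, 3.0]]

# With B = 0 the adapter is a no-op, so the output matches the base layer;
# this is why LoRA conventionally initializes the up-projection to zero.
print(lora_forward(x, W, A, B, alpha=8, r=1))  # [[2.0, 3.0]]
```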
-
In the fine-tuning code, part of the input is also included in the loss computation. May I ask what particular benefit this has compared to a conditional language-modeling loss?
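(For context on the distinction being asked about: a conditional LM loss masks the prompt/condition tokens out of the loss so only the response is supervised, while including them trains the model on the condition as well. A minimal sketch of the two label layouts, using the common convention of marking ignored positions with -100, as in Hugging Face Transformers; the token ids below are invented:)

```python
# Sketch of full-sequence loss vs. conditional LM loss labels for
# next-token prediction over a [prompt; response] sequence.
# Positions labeled IGNORE_INDEX contribute nothing to the loss.

IGNORE_INDEX = -100  # Hugging Face convention for masked-out positions

def build_labels(prompt_ids, response_ids, mask_prompt):
    """Return labels for the concatenated sequence.

    mask_prompt=True  -> conditional LM loss: prompt tokens are ignored.
    mask_prompt=False -> loss over the full sequence, prompt included.
    """
    if mask_prompt:
        prompt_labels = [IGNORE_INDEX] * len(prompt_ids)
    else:
        prompt_labels = list(prompt_ids)
    return prompt_labels + list(response_ids)

prompt = [101, 102, 103]   # hypothetical prompt token ids
response = [201, 202]      # hypothetical response token ids

print(build_labels(prompt, response, mask_prompt=True))   # [-100, -100, -100, 201, 202]
print(build_labels(prompt, response, mask_prompt=False))  # [101, 102, 103, 201, 202]
```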