Closed WingCH closed 5 months ago
same issue using v1.3.0
You can use this to reproduce
https://www.youtube.com/watch?v=CmFIcAn0PeQ
The problem is that when the original line is long, the translation may drop into the next slot, which makes every subsequent translation slip out of alignment.
Version 1.3.2 still has this problem; tested with the same YouTube link above.
Version 1.3.2 still has this problem for videos over 40 minutes.
https://github.com/Makememo/MemoAI/releases/tag/v1.3.3
- When transcribing, select CUDA for GPU mode
- Go to Settings - Lab - turn on the flash attention switch
I just tested the provided sample https://www.youtube.com/watch?v=CmFIcAn0PeQ and the translation is off from the beginning.
The problem seems to be a missing translation in the red square above, after which everything slips.
Then I tried Large (v3) and translated again, and something weird happened: all the translations were grouped into one line...
If you try translating that YouTube link a few times, you will see many missing translations, or translations that slip to the next line.
I understand that you group several lines together for translation to save cost, which makes sense. You might try something like the picture above in your code, marking the number of each line more accurately.
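The suggestion above can be sketched as explicit index tagging: number each source line before sending the batch to the model, then map the reply back by index so a dropped or merged line leaves a gap instead of shifting everything after it. This is a hypothetical illustration, not MemoAI's actual code; the function names and prompt format are assumptions.

```python
import re

def build_batch(lines):
    # Tag each subtitle line with an explicit index so the model's
    # reply can be mapped back even if it merges or drops lines.
    return "\n".join(f"[{i}] {line}" for i, line in enumerate(lines))

def parse_batch(reply, n):
    # Recover translations by index; a missing index stays None
    # instead of shifting every following line into the wrong slot.
    out = [None] * n
    for m in re.finditer(r"\[(\d+)\]\s*(.*)", reply):
        i = int(m.group(1))
        if 0 <= i < n:
            out[i] = m.group(2).strip()
    return out

lines = ["Hello world", "How are you?"]
prompt = build_batch(lines)          # "[0] Hello world\n[1] How are you?"
# Simulated model reply that skipped line 0 entirely:
reply = "[1] 你好嗎?"
print(parse_batch(reply, len(lines)))  # [None, '你好嗎?']
```

With this scheme a skipped line shows up as a `None` slot that can be retried individually, rather than misaligning the rest of the subtitle file.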
BTW, please reopen this case; after several tries I am pretty sure it is not fixed yet.
@Makememo
His problem is a transcription problem. Your translation misalignment is an AI problem, and frankly there is no 100% solution. We tried your approach, JSON output, and various other methods, but none had the desired effect. So we provide single-sentence and continued translation to alleviate this misalignment.