Closed: derek3131 closed this issue 1 month ago
I have the same issue, but only when using local Ollama models. Which model are you using?
Also local Ollama models:
codellama:13b, llama3.1:8b, mistral-nemo:12b, mixtral:8x7b (47b)
I just updated to the latest version, and this seems to fix the double output. See https://github.com/danielmiessler/fabric/commit/0ef4e465e4590211b78b0c78fb8c7d4f38ce224a
Could you try running go install github.com/danielmiessler/fabric@latest and let me know if this fixes the issue for you too?
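For reference, a minimal update-and-retest sequence (a sketch, assuming Go is on your PATH and fabric was installed via go install; the commands simply repeat the non-stream run that previously produced the doubled output):

$ go install github.com/danielmiessler/fabric@latest   # pull the latest build containing the fix
$ yt --transcript https://youtu.be/xhjmgRH91Go\?si\=MlMAUHpTfkRiVdf3 | fabric --pattern summarize   # non-stream run; output should now appear only once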
Yes, should be fixed, please retest.
I am closing this; please reopen if the issue is still there.
What happened?
I noticed today that I was getting double output when I do NOT use the --stream flag.
$ yt --transcript https://youtu.be/xhjmgRH91Go\?si\=MlMAUHpTfkRiVdf3 | fabric --stream --pattern summarize
versus
$ yt --transcript https://youtu.be/xhjmgRH91Go\?si\=MlMAUHpTfkRiVdf3 | fabric --pattern summarize
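One way to make the duplication easy to see (a sketch, assuming you save the transcript once so both runs get identical input; the file names transcript.txt, out-stream.txt, and out-nostream.txt are only illustrative):

$ yt --transcript https://youtu.be/xhjmgRH91Go\?si\=MlMAUHpTfkRiVdf3 > transcript.txt   # fetch the transcript once
$ fabric --stream --pattern summarize < transcript.txt > out-stream.txt                 # streaming run
$ fabric --pattern summarize < transcript.txt > out-nostream.txt                        # non-streaming run
$ wc -l out-stream.txt out-nostream.txt                                                 # the non-stream file is roughly twice as long when the bug is present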