danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
https://danielmiessler.com/p/fabric-origin-story
MIT License

[Question]: broken pipe #567

Closed. iplayfast closed this issue 1 month ago

iplayfast commented 3 months ago

What is your question?

I tried out a sample: yt --transcript https://youtu.be/zdbVtZIn9IM | fabric -m llama3:latest -sp extract_wisdom, which gave the error:

Traceback (most recent call last):
  File "/home/chris/.local/bin/yt", line 8, in <module>
    sys.exit(main_yt())
  File "/home/chris/.local/pipx/venvs/fabric/lib/python3.10/site-packages/installer/client/cli/yt.py", line 148, in main
    main_function(args.url, args)
  File "/home/chris/.local/pipx/venvs/fabric/lib/python3.10/site-packages/installer/client/cli/yt.py", line 113, in main_function
    print(transcript_text.encode('utf-8').decode('unicode-escape'))
BrokenPipeError: [Errno 32] Broken pipe

Investigating further, I tried yt --transcript https://youtu.be/zdbVtZIn9IM, which seemed to work. I then tried yt --transcript https://youtu.be/zdbVtZIn9IM | ollama run llama3 "extract wisdom from this", which also worked. So apparently I'm not using ollama right?
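
For background, BrokenPipeError: [Errno 32] is raised in the producer (yt here) when the consumer side of the pipe (fabric) exits or closes stdin before the producer has finished writing, so the traceback above usually points at the reading end of the pipe rather than at yt itself. Below is a minimal sketch of the standard way a Python CLI can tolerate a reader that closes early; it is illustrative only and not fabric's actual code:

import os
import sys

def safe_print(text):
    # Write to stdout, but exit quietly if the downstream reader has
    # already closed its end of the pipe.
    try:
        print(text)
        sys.stdout.flush()
    except BrokenPipeError:
        # Point stdout at /dev/null so the interpreter's final flush at
        # shutdown does not raise a second BrokenPipeError.
        devnull = os.open(os.devnull, os.O_WRONLY)
        os.dup2(devnull, sys.stdout.fileno())
        sys.exit(1)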

iplayfast commented 3 months ago

Found the problem. I had originally installed fabric a few months back; the new install wasn't being seen and the old one was corrupted. I cleaned everything out, started fresh, and now it seems to be working.

primed-lan-laucirica commented 3 months ago

I noticed this isn't working today. Is this fixed in a dev branch? Tested today with yt on Ubuntu 22 with gpt4o; the pipe does not work. My workaround is shown below:

(fabric) primed@3Demon:~/localcode/fabric$ TRANSCRIPT=$(yt --transcript "https://www.youtube.com/watch?v=IYz6tE3XmHM")
(fabric) primed@3Demon:~/localcode/fabric$ echo $TRANSCRIPT | summarize
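
This works because the command substitution lets yt run to completion before anything reads the transcript, so there is no half-closed pipe for it to write into. A rough Python equivalent of the same workaround, assuming the summarize helper is equivalent to fabric -sp summarize (an assumption based on the commands in this thread, not confirmed here):

import subprocess

url = "https://www.youtube.com/watch?v=IYz6tE3XmHM"

# Step 1: capture the whole transcript before any pattern runs.
transcript = subprocess.run(
    ["yt", "--transcript", url],
    capture_output=True, text=True, check=True,
).stdout

# Step 2: feed the captured text to the pattern on stdin.
subprocess.run(
    ["fabric", "-sp", "summarize"],
    input=transcript, text=True, check=True,
)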