Closed: sujoykb closed this issue 3 months ago
I ran into the same issue, and for me the following change did the trick:
```diff
diff --git whisper.cpp/BUILD.mk whisper.cpp/BUILD.mk
index 7aa96f1..8daab0a 100644
--- whisper.cpp/BUILD.mk
+++ whisper.cpp/BUILD.mk
@@ -43,4 +43,5 @@ $(WHISPER_CPP_OBJS): whisper.cpp/BUILD.mk
 .PHONY: o/$(MODE)/whisper.cpp
 o/$(MODE)/whisper.cpp: \
+	o/$(MODE)/whisper.cpp/main \
 	o/$(MODE)/whisper.cpp/server
```
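For context, the phony aggregate target `o/$(MODE)/whisper.cpp` previously listed only the server binary as a prerequisite, so building that target never produced `main`. A quick way to check the fix after applying the patch (assuming the default build `MODE`, so the output directory is `o//`):

```sh
# Rebuild the whisper.cpp aggregate target and confirm the main binary now exists.
make -j o//whisper.cpp
ls -l o//whisper.cpp/main
```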
Correct, this works! Thank you so much @artyom
BTW, have you been able to run the whisperfile with CUDA? Adding `--gpu auto` did not work for me (tried on Colab).
I've committed that change to the repository. Thanks for the report. For the record, you could also say `make -j o//whisper.cpp/main`. I've also updated things so that `make -j && sudo make install` will create an executable on your PATH called `whisperfile`. Enjoy!
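For anyone following along, here is a minimal sketch of that updated flow. The model filename and the whisper.cpp-style `-m`/`-f` flags are illustrative assumptions, not taken from this thread:

```sh
# Build everything and install the whisperfile executable onto the PATH.
make -j
sudo make install

# Hypothetical invocation, assuming whisper.cpp-style flags:
# -m selects the Whisper model, -f the input audio file.
whisperfile -m ggml-tiny.en.bin -f sample.wav
```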
What happened?
I cloned the latest llamafile repo and tried to run whisperfile following this README:
`make` did not show any error, but the second command failed with this message:
`o//whisper.cpp/main: No such file or directory`
I tried this on Google Colab (with and without GPU) as well as on my local machine (no GPU). Nowhere was I able to find any executable `main` inside `o//whisper.cpp`. The following files are there inside `o//whisper.cpp` (showing the output of Google Colab below):
Version
llamafile 0.8.12 (latest)
What operating system are you seeing the problem on?
Linux
Relevant log output
No response