collabora / WhisperLive

A nearly-live implementation of OpenAI's Whisper.

M1 Mac crashes when running client #233

Open nreHieW opened 5 months ago

nreHieW commented 5 months ago

Hello, I managed to run the server successfully and have the following client code:

# main.py
from whisper_live.client import TranscriptionClient

client = TranscriptionClient(
    "localhost",
    9000,
)
print("Client started")
client("test.mp3")

However, when I run python main.py, I get the following error output:

[INFO]: * recording
Client started
[INFO]: Waiting for server ready ...
[INFO]: Server Ready!
libc++abi: terminating
zsh: abort      python main.py

I have tried a fresh install in a fresh conda environment and hit the same issue. I still have around 8 GB of memory free, so it shouldn't be an out-of-memory problem.

Any help is appreciated, thanks!

Kishlay-notabot commented 3 months ago

Could this be caused by not specifying a model type in the client code? Try passing model="tiny" (or any model you prefer), as in the sketch below.
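
A minimal sketch of the suggested change, assuming TranscriptionClient in your installed whisper_live version accepts a model keyword argument (the parameter name may differ across releases, so check the client's signature if it errors):

# main.py (modified) - sketch only; verify the "model" keyword against your whisper_live version
from whisper_live.client import TranscriptionClient

client = TranscriptionClient(
    "localhost",
    9000,
    model="tiny",  # explicitly select a Whisper model size instead of relying on the default
)
print("Client started")
client("test.mp3")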