Please let me start by saying thank you; I love the thought that went into this concept project. I think this type of project is critical for long-term societal protection, ensuring that LLM localization empowers people and not only States or major corporations.
The error seems like a simple one, but working the problem has not yet produced a solution. Any ideas are most welcome!
Below is the system information:
```
System Software Overview:
System Version: macOS 14.4.1 (23E224)
Kernel Version: Darwin 23.4.0
Boot Volume: Macintosh HD
Boot Mode: Normal
Computer Name: N/A
User Name: N/A
Secure Virtual Memory: Enabled
System Integrity Protection: Enabled
Time since boot: 25 days, 18 hours, 9 minutes
```
```
Hardware Overview:
Model Name: MacBook Pro
Model Identifier: Mac14,6
Model Number: Z179000H4LL/A
Chip: Apple M2 Max
Total Number of Cores: 12 (8 performance and 4 efficiency)
Memory: 96 GB
System Firmware Version: 10151.101.3
OS Loader Version: 10151.101.3
Serial Number (system): N/A
Hardware UUID: 9A28DB48-8F29-5124-996B-7A6D8507642F
Provisioning UDID: 00006021-0018690C14F0C01E
Activation Lock Status: Enabled
```
This is a clean install as per the default guidance. As per the other Issue, `requirements.txt` had to be adjusted just to get as far as an error on the run command `python -v assistant.py`; the applicable section of the failure at the point of error is shown below:
```
~/projects/ollama-voice-mac | trunk !7 python assistant.py ok
A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.0.2 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.
If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.
Traceback (most recent call last):
  File "/Users/scott.mackenzie/projects/ollama-voice-mac/assistant.py", line 6, in <module>
    import torch
  File "/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/__init__.py", line 1382, in <module>
    from .functional import *  # noqa: F403
  File "/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/functional.py", line 7, in <module>
    import torch.nn.functional as F
  File "/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/nn/__init__.py", line 1, in <module>
    from .modules import *  # noqa: F403
  File "/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/__init__.py", line 35, in <module>
    from .transformer import TransformerEncoder, TransformerDecoder, \
  File "/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/transformer.py", line 20, in <module>
    device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/transformer.py:20: UserWarning: Failed to initialize NumPy: _ARRAY_API not found (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/utils/tensor_numpy.cpp:84.)
  device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
/Users/scott.mackenzie/Library/Python/3.9/lib/python/site-packages/urllib3/__init__.py:35: NotOpenSSLWarning: urllib3 v2 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with 'LibreSSL 2.8.3'. See: https://github.com/urllib3/urllib3/issues/3020
  warnings.warn(
pygame 2.5.2 (SDL 2.28.3, Python 3.9.6)
Hello from the pygame community. https://www.pygame.org/contribute.html
2024-10-30 18:03:10,383 - INFO - Starting Assistant
2024-10-30 18:03:10,839 - INFO - Initializing Assistant
2024-10-30 18:03:10,839 - INFO - Initializing configuration
2024-10-30 18:03:12,273 - INFO - Displaying message: Loading...
2024-10-30 18:03:12,712 - INFO - Converting text to speech: Hi, how can I help you?
AI:
Hi, how can I help you?
2024-10-30 18:03:12,712 - INFO - Initializing TTS engine
2024-10-30 18:03:13,213 - INFO - Displaying message: Press and hold space to speak
2024-10-30 18:03:13.281 Python[20325:1053897] +[IMKClient subclass]: chose IMKClient_Modern
2024-10-30 18:03:13,285 - INFO - Converting text to speech
2024-10-30 18:03:13,290 - INFO - Speech playback completed
2024-10-30 18:03:17,798 - INFO - Push-to-talk key pressed
2024-10-30 18:03:17,799 - INFO - Capturing waveform from microphone
2024-10-30 18:03:17,799 - INFO - Displaying recording start
2024-10-30 18:03:20,319 - INFO - Converting speech to text
2024-10-30 18:03:20,320 - INFO - Starting transcription
2024-10-30 18:03:20,324 - ERROR - An error occurred during transcription: Numpy is not available
2024-10-30 18:03:20,324 - INFO - Asking OLLaMa with prompt:
2024-10-30 18:03:20,331 - DEBUG - Starting new HTTP connection (1): localhost:11434
2024-10-30 18:03:20,343 - DEBUG - http://localhost:11434 "POST /api/generate HTTP/1.1" 200 109
2024-10-30 18:03:20,343 - INFO - Converting text to speech:
```
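In case it helps frame suggestions: the NumPy warning above offers two routes, downgrading to `numpy<2` or upgrading the affected module. A minimal sketch of the downgrade route, assuming the installed torch wheel was built against NumPy 1.x (which the `_ARRAY_API not found` message implies) and targeting the user-level Python 3.9 site-packages that the traceback points at, would be:

```shell
# Hedged sketch, not a confirmed fix: pin NumPy below 2.x, per the warning's
# own suggestion, in the same user-level Python 3.9 environment the traceback
# shows (~/Library/Python/3.9/...).
python3 -m pip install --user "numpy<2"

# Re-run to see whether the "Numpy is not available" transcription error clears.
python3 assistant.py
```

I have not confirmed that pinning `numpy<2` in `requirements.txt` is the same adjustment the other Issue intended, so please treat the exact pin as an assumption rather than the project's documented fix.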
Has anyone gotten ollama-voice-mac to work on an M2 MacBook?
Any ideas on this error / obstacle?