suno-ai / bark

🔊 Text-Prompted Generative Audio Model
MIT License

Apple MPS support feedback #266

Open leejr72 opened 1 year ago

leejr72 commented 1 year ago

I've seen the older issues on this topic (issue #22). Understanding that MPS support is experimental, are the results below expected performance? Please let me know if I'm not implementing it correctly.

Happy to refactor and rerun tests in supporting dev of MPS feature.

Attachments: bark_MPS_True_1, bark_MPS_True_2, bark_MPS_False_1, bark_MPS_False_2

gkucsko commented 1 year ago

hmm, i think when i tried i saw something around 1.5-2x speedup vs CPU on an M1 Max

leejr72 commented 1 year ago

Any special process for enabling it, besides adding the os import and MPS = true? I'll try to find a tool to measure MPS usage and run some longer samples. Thank you for confirming I should be seeing increased performance.

gkucsko commented 1 year ago

much appreciated!

leejr72 commented 1 year ago

All testing was done with the default Jupyter long-form generation example, except I added MPS = True/False. Is it supposed to turn off MPS when set to False? Does the very first execution take longer?

Love the project and wish to help in any manner. Let me know if I can help test further or run different tests for the project.

Screenshots: first run with MPS True; second run with MPS False; third run with MPS True; fourth run with MPS False; CPU with MPS True and False; GPU with MPS True and False.
gkucsko commented 1 year ago

so you are seeing the same speed? did you restart the kernel inbetween and set the env var before doing anything else?

gkucsko commented 1 year ago

will try to run on my m1 later when i have better internet, but when i checked last i saw

gkucsko commented 1 year ago
(screenshot of timing results)
gkucsko commented 1 year ago

yeah so after some warmup i'm seeing like 2x faster on MPS (this is M1 Max)
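For anyone wanting to quantify the warmup effect described above, a minimal timing helper can separate the first (warmup) call from steady-state calls. This is a sketch; the `generate_audio`/`text_prompt` usage shown in the comments is illustrative and not run here:

```python
import time

def time_run(fn, *args, **kwargs):
    """Return wall-clock seconds for a single call to fn.

    MPS kernels are compiled/cached on first use, so time the first
    call separately from later calls to see steady-state speed.
    """
    start = time.perf_counter()
    fn(*args, **kwargs)
    return time.perf_counter() - start

# Hypothetical usage against bark (not executed in this sketch):
# warmup = time_run(generate_audio, text_prompt)  # includes MPS warmup cost
# steady = time_run(generate_audio, text_prompt)  # representative speed
```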

leejr72 commented 1 year ago

Trying to repro, however something happened where my code no longer uses MPS. Reinstalled with the following commands:

```shell
conda create -n bark python=3.10
conda activate bark
git clone https://github.com/suno-ai/bark
cd bark
pip install .
pip uninstall -y torch torchvision torchaudio
pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu
```

```python
from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav
from IPython.display import Audio

import os
os.environ["SUNO_ENABLE_MPS"] = "True"
torch.device("mps")

# download and load all models
preload_models()

# check environment
print(f"Python Platform: {platform.platform()}")
print(f"Python {sys.version}")
print(f"PyTorch Version: {torch.version}")
print("GPU is", "available" if has_gpu else "NOT AVAILABLE")
print("MPS (Apple Metal) is", "AVAILABLE" if has_mps else "NOT AVAILABLE")
print(f"Target device is {device}")
print(f"Torch Device: {torch.device()}")
print(os.environ["SUNO_ENABLE_MPS"])

# generate audio from text
text_prompt = """
Hello, my name is Suno. And, uh — and I like pizza. [laughs]
But I also have other interests such as playing tic tac toe.
"""
audio_array = generate_audio(text_prompt)

# save audio to disk
write_wav("bark_generation.wav", SAMPLE_RATE, audio_array)

# play text in notebook
Audio(audio_array, rate=SAMPLE_RATE)
```

Am I missing a required command? Screenshot 2023-05-12 at 1 49 39 PM

Screenshot 2023-05-12 at 2 28 23 PM

Screenshot 2023-05-12 at 5 21 27 PM

Screenshot 2023-05-12 at 5 22 22 PM
gkucsko commented 1 year ago

you have to set the environment variable before anything else gets imported, could that be the reason?
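To make the ordering concrete: bark reads its `SUNO_*` variables when its modules are first imported, so the assignment has to run before any bark import anywhere in the process. A minimal sketch (the bark import is shown commented out so the snippet stands alone):

```python
# Set bark's environment variables FIRST, before any `import bark`
# statement runs, since bark reads them at import time.
import os

os.environ["SUNO_ENABLE_MPS"] = "True"

# Only after the variables are set:
# from bark import SAMPLE_RATE, generate_audio, preload_models
```

In a notebook this means restarting the kernel and putting the `os.environ` cell at the very top, before the cell that imports bark.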

leejr72 commented 1 year ago

Thank you for your help, I'm new to Python. Got everything running; results are below. It seems we are seeing similar performance: the first MPS run is a little slow, but it picks up. Overall seeing a 2x improvement. Please let me know if I can help with any other testing.

Including audio samples from the testing. There seems to be a difference in quality between CPU and MPS output.

bark_cpu_1.wav vs bark_mps_4.wav

Bark_MPS_VS_CPU.zip

Quick question: when executing `pip install .`, how do I also install the dev section?

Also, I received the following when I tried to use conda to install the nightly version of PyTorch:

```
File "/opt/homebrew/Caskroom/miniconda/base/envs/bark-infinity/lib/python3.10/site-packages/torch/nn/functional.py", line 2210, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
IndexError: index out of range in self
```

I wiped out the conda environment and rebuilt it following your instructions using pip.

Install method:

```shell
git clone https://github.com/suno-ai/bark
cd bark
conda create -n bark python=3.10
conda activate bark
pip install .
pip uninstall -y torch torchvision torchaudio
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu
```

I also had to `pip install nltk` to split up the sentences.

Python code used for testing:

```python
import os
os.environ["SUNO_ENABLE_MPS"] = "True"
os.environ["SUNO_OFFLOAD_CPU"] = "False"
os.environ["SUNO_USE_SMALL_MODELS"] = "False"

from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav
from IPython.display import Audio
from datetime import datetime as dt
import numpy as np
import nltk  # we'll use this to split into sentences
nltk.download('punkt')

preload_models()

text_prompt = """
Hello, I am called BARK and am a new text to audio model made by SUNO!
Let me read an excerpt from War of the Worlds to you. [clears throat]
"""

audio_array = generate_audio(text_prompt, history_prompt="v2/en_speaker_6")
```

Screenshot 2023-05-13 at 7 31 33 PM

Screenshot 2023-05-13 at 7 05 22 PM Screenshot 2023-05-13 at 7 27 35 PM
keldenl commented 1 year ago

On M1 pro 32gb ram, Bark works on CPU.

When trying MPS, I'm getting the self index out of bounds error when not using PyTorch/other dependencies nightly, but when I install nightly it just hangs for at least 5 minutes.

I'm on python3.11 – any tips on getting MPS working?

gkucsko commented 1 year ago

this should be a super simple error to fix (1-liner), but i can't do it since i can't reproduce it :/ would love help here from anyone who can

japanvik commented 1 year ago

@keldenl It was also hanging for me, but it worked for me if I set SUNO_OFFLOAD_CPU = False

fiq commented 1 year ago

Haven't tried nightly; I can have a look when I have time. I've been using bark for a while on an M2 Pro with the following. There are CPU fallback scenarios, but that hasn't prevented generation in reasonable time (unmeasured ;) ):

SUNO_ENABLE_MPS=True
PYTORCH_ENABLE_MPS_FALLBACK=1
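Pulling the thread together, a sketch of the environment combination reported to work here (fiq's two variables plus japanvik's `SUNO_OFFLOAD_CPU` fix), set in Python before torch or bark is imported; whether all three are needed on a given machine is an assumption:

```python
import os

# Set before importing torch/bark, since both read these at import time.
os.environ["SUNO_ENABLE_MPS"] = "True"            # route bark's models to the MPS device
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"   # fall back to CPU for ops without an MPS kernel
os.environ["SUNO_OFFLOAD_CPU"] = "False"          # avoid the hang japanvik reported with offload on
```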