HalitTalha opened 1 year ago
I've found that running the following cell and restarting the runtime once it completes (before running an updated install command) at least gets you some of the way.
First run:
!pip uninstall -y keras tensorflow tensorflow-probability absl-py astunparse flatbuffers gast google-pasta grpcio h5py keras-preprocessing libclang numpy opt-einsum protobuf setuptools six tensorboard tensorflow-io-gcs-filesystem termcolor tf-estimator-nightly typing-extensions wrapt
!pip install --disable-pip-version-check --no-cache-dir tensorflow==2.11.0
!pip install tensorflow-probability==0.15.0
!pip install keras==2.11.0
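As an aside, besides clicking the button, the restart can also be triggered from code by killing the kernel process, which Colab then restarts automatically. This is a common Colab trick, not part of the original instructions:

```python
# Programmatic alternative to the 'restart runtime' button: kill the
# kernel process; Colab restarts it automatically. Use with care.
import os

def restart_runtime():
    """Terminate the current kernel process so Colab restarts it."""
    os.kill(os.getpid(), 9)

# restart_runtime()  # uncomment to restart the runtime
```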
Then click on the 'restart runtime' button that appears in the output from the above cell. Then run:
# !pip install -qU ddsp[data_preparation]==1.6.3
!sudo apt-get install libportaudio2 &> /dev/null
!pip install -U ddsp[data_preparation] &> /dev/null
# Initialize global path for using google drive.
DRIVE_DIR = ''
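After the restart and the ddsp install, a quick sanity check (my own helper, not part of the notebook) confirms the pinned versions actually took effect in the fresh runtime:

```python
# Verify the pinned versions survived the restart (hypothetical helper,
# not part of the notebook). Run in a fresh cell after restarting.
from importlib import metadata

EXPECTED = {
    "tensorflow": "2.11.0",
    "tensorflow-probability": "0.15.0",
    "keras": "2.11.0",
}

def check_pins(expected):
    """Map each package to (installed_version, matches_expected)."""
    results = {}
    for pkg, want in expected.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            have = None
        results[pkg] = (have, have == want)
    return results

for pkg, (have, ok) in check_pins(EXPECTED).items():
    print(f"{pkg}: {have} ({'OK' if ok else 'mismatch'})")
```

If any package prints a mismatch, the restart probably didn't happen before the ddsp install pulled in different versions.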
This way at least the base packages are installed. However, the train_autoencoder notebook then fails further down when computing the dataset statistics; several others have commented on that issue and no one seems to have a good fix for it.
The above fix works for the train VST notebook (https://github.com/magenta/ddsp-vst) and that notebook does allow you to train a model and export a plugin.
In case someone from Magenta is checking this, it'd be nice if the notebooks actually worked!
Hi there, sorry in advance if this sounds silly or has already been addressed; I'm still getting used to Colab, dependency conflicts, etc. When I run the first install step in the Colab demo, I get a "metadata generation failed" error. I'm not changing the runtime or any other configuration. By any chance, could you point me in the right direction?
It's the same when I try it locally. I figure this might be related to the runtime and pip versions, which are the latest in my case.