p-lambda / jukemir

Perform transfer learning for MIR using Jukebox!

Errors when running Colab notebook #14

Open · Rsalganik1123 opened 11 months ago

Rsalganik1123 commented 11 months ago

Hello, I am experiencing a series of errors when trying to run the Colab notebook provided with the code.

First, installing jukebox throws the following error:

```
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
```

However, this can be worked around by installing a different fork of jukebox:

```
!pip install --upgrade git+https://github.com/craftmine1000/jukebox-saveopt.git
```
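For reference, a quick sanity check that this install is the one being picked up (this assumes the fork keeps the same package layout as upstream openai/jukebox, which it appears to):

```python
# Confirm the fork installed correctly and exposes the usual entry points.
import jukebox
from jukebox.make_models import make_vqvae, make_prior, MODELS

print(jukebox.__file__)      # which install is actually on the path
print(list(MODELS.keys()))   # available model configs, e.g. "5b", "5b_lyrics", "1b_lyrics"
```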

However, the initialization block then emits the following warnings:


```
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.0.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.0.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:2025: UserWarning: for encoders.0.level_blocks.0.model.0.1.model.0.model.1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
...
```

These warnings are raised in the line: `top_prior = make_prior(hparams, vqvae, device)#device)`
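For context, in case it helps with debugging: the warning appears to come from newer PyTorch behaviour where modules are built with parameters on the meta device, and `load_state_dict()` silently skips the copy unless `assign=True` is passed. Below is a minimal plain-PyTorch sketch that reproduces the same message; the tiny `nn.Linear` is just a stand-in for the Jukebox VQ-VAE/prior, not the notebook's actual loading code:

```python
import torch
import torch.nn as nn

# Build a module under the meta device, as newer PyTorch loaders may do;
# its parameters are placeholders with no real storage.
with torch.device("meta"):
    model = nn.Linear(4, 4)

# An ordinary checkpoint with real CPU tensors.
checkpoint = nn.Linear(4, 4).state_dict()

# Copying into meta parameters is a no-op and triggers the same UserWarning
# seen above ("copying from a non-meta parameter ... to a meta parameter").
model.load_state_dict(checkpoint)
print(model.weight.device)  # meta -> the weights were NOT loaded

# Passing assign=True (PyTorch >= 2.1) replaces the meta parameters with the
# checkpoint tensors instead of copying them in place.
model.load_state_dict(checkpoint, assign=True)
print(model.weight.device)  # cpu -> the weights are now real tensors
```

If the jukebox loading path inside `make_vqvae` / `make_prior` is hitting this case, the checkpoint weights are probably never actually loaded, which would explain downstream failures even though only warnings are printed here.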

Please advise on how to resolve this. Thanks in advance!