juberti opened this issue 3 months ago
Thought this would be straightforward, but I'm struggling to get this to work with ultravox-v0.2: the model fails to load when quantized. Loading blows up with the following error:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/root/workspace/ultravox/ultravox/tools/infer_tool.py", line 238, in <module>
main(simple_parsing.parse(InferArgs))
File "/root/workspace/ultravox/ultravox/tools/infer_tool.py", line 223, in main
inference = ultravox_infer.UltravoxInference(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/workspace/ultravox/ultravox/inference/ultravox_infer.py", line 51, in __init__
model = ultravox_model.UltravoxModel.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3820, in from_pretrained
dispatch_model(model, **device_map_kwargs)
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/accelerate/big_modeling.py", line 419, in dispatch_model
attach_align_device_hook_on_blocks(
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 608, in attach_align_device_hook_on_blocks
add_hook_to_module(module, hook)
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 157, in add_hook_to_module
module = hook.init_hook(module)
^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 275, in init_hook
set_module_tensor_to_device(module, name, self.execution_device, tied_params_map=self.tied_params_map)
File "/root/.cache/pypoetry/virtualenvs/ultravox-x4uhSn6m-py3.11/lib/python3.11/site-packages/accelerate/utils/modeling.py", line 354, in set_module_tensor_to_device
raise ValueError(f"{tensor_name} is on the meta device, we need a `value` to put in on {device}.")
ValueError: weight is on the meta device, we need a `value` to put in on 0.
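For context, the traceback goes through the standard transformers quantized-load path: `from_pretrained` with a quantization config hands the model to accelerate's `dispatch_model`, which then finds a parameter still on the meta device. A minimal sketch of that flow (the `AutoModel` entry point and exact kwargs are assumptions here; `infer_tool` actually calls `UltravoxModel.from_pretrained`):

```python
from transformers import AutoModel, BitsAndBytesConfig

def load_quantized(model_id: str):
    """Sketch: load a checkpoint in 8-bit. device_map="auto" is what routes
    loading through accelerate's dispatch_model, the frame in the traceback
    where the meta-device ValueError is raised."""
    quant_config = BitsAndBytesConfig(load_in_8bit=True)
    return AutoModel.from_pretrained(
        model_id,
        quantization_config=quant_config,
        device_map="auto",
    )
```

The error means some weight was never materialized from the checkpoint before dispatch tried to move it to GPU 0, which usually points at a key mismatch or a custom-module loading path that skipped those tensors.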
There are some other weight-loading hiccups; @farzadab is looking into those, then I'll take another run at this.
Still hitting these issues with v0.3:
# just infer --text_only --prompt hi -q 8 -m fixie-ai/ultravox-v0_3
poetry run python -m ultravox.tools.infer_tool --text_only --prompt hi -q 8 -m fixie-ai/ultravox-v0_3
config.json: 100%|██████████| 3.85k/3.85k [00:00<00:00, 47.1MB/s]
`low_cpu_mem_usage` was None, now set to True since model is quantized.
model.safetensors.index.json: 100%|██████████| 28.6k/28.6k [00:00<00:00, 142MB/s]
model-00001-of-00004.safetensors: 100%|██████████| 4.93G/4.93G [00:36<00:00, 134MB/s]
model-00002-of-00004.safetensors: 100%|██████████| 5.00G/5.00G [00:37<00:00, 134MB/s]
model-00003-of-00004.safetensors: 100%|██████████| 4.92G/4.92G [00:34<00:00, 142MB/s]
model-00004-of-00004.safetensors: 100%|██████████| 1.29G/1.29G [00:14<00:00, 91.5MB/s]
Downloading shards: 100%|██████████| 4/4 [02:03<00:00, 30.88s/it]
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.conv1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
[... the same UserWarning repeats for every remaining encoder parameter: conv1/conv2 biases, embed_positions, and each layer's self_attn q/k/v/out projections, layer norms, and fc1/fc2 weights and biases, for model.encoder.layers.0 through 3 and onward ...]
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.self_attn.out_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.self_attn_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.self_attn_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.fc1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.fc1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.fc2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.final_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.3.final_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.k_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.v_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.v_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.q_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.q_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.out_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn.out_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.self_attn_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.fc1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.fc1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.fc2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.final_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.4.final_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.k_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.v_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.v_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.q_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.q_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.out_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn.out_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.self_attn_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.fc1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.fc1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.fc2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.final_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.5.final_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.k_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.v_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.v_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.q_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.q_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.out_proj.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn.out_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.self_attn_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.fc1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.fc1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.fc2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.6.final_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
[... the same UserWarning repeats for every remaining parameter of encoder layers 6 through 11: self_attn k_proj/v_proj/q_proj/out_proj weights and biases, self_attn_layer_norm, fc1, fc2, and final_layer_norm ...]
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.self_attn.out_proj.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.self_attn_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.self_attn_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.fc1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.fc1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.fc2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.fc2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.final_layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layers.11.final_layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layer_norm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/torch/nn/modules/module.py:2068: UserWarning: for model.encoder.layer_norm.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:01<00:00, 3.95it/s]
Some weights of the model checkpoint at fixie-ai/ultravox-v0_3 were not used when initializing UltravoxModel: ['multi_modal_projector.linear_1.weight', 'multi_modal_projector.linear_2.weight']
- This IS expected if you are initializing UltravoxModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing UltravoxModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
generation_config.json: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 181/181 [00:00<00:00, 2.01MB/s]
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home/mosaicml/ultravox/ultravox/tools/infer_tool.py", line 258, in <module>
main(simple_parsing.parse(InferArgs))
File "/home/mosaicml/ultravox/ultravox/tools/infer_tool.py", line 243, in main
inference = ultravox_infer.UltravoxInference(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/mosaicml/ultravox/ultravox/inference/ultravox_infer.py", line 52, in __init__
model = ultravox_model.UltravoxModel.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/transformers/modeling_utils.py", line 4015, in from_pretrained
dispatch_model(model, **device_map_kwargs)
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/accelerate/big_modeling.py", line 420, in dispatch_model
attach_align_device_hook_on_blocks(
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 615, in attach_align_device_hook_on_blocks
add_hook_to_module(module, hook)
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 160, in add_hook_to_module
module = hook.init_hook(module)
^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/accelerate/hooks.py", line 282, in init_hook
set_module_tensor_to_device(module, name, self.execution_device, tied_params_map=self.tied_params_map)
File "/root/.cache/pypoetry/virtualenvs/ultravox-AwYeYm8r-py3.11/lib/python3.11/site-packages/accelerate/utils/modeling.py", line 364, in set_module_tensor_to_device
raise ValueError(f"{tensor_name} is on the meta device, we need a `value` to put in on {device}.")
ValueError: weight is on the meta device, we need a `value` to put in on 0.
error: Recipe `infer` failed on line 43 with exit code 1
Notes from @farzadab:
The "weight is on the meta device" error means you're likely running out of memory somewhere during quantization. The "meta" device is an imaginary device that, as far as I understand, lets PyTorch work with large models by propagating the shapes (not the data). I was able to get past that point by switching the order of operations a bit (to the best of my understanding of what was happening), but I still get a different error further down the line. Here are the changes I made: https://github.com/fixie-ai/ultravox/commit/bd72d29ead02d6ed7d8ec5572e8d2acedb60362b
Relevant to #8