I am getting this error:
File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/modeling_llama.py", line 246, in forward
cos, sin = position_embeddings
^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object
I tried the script with facebook/opt-125m and it ran perfectly, and the pruned model worked correctly, but when using Llama 3 models I get this error. I have tried on Google Colab and on my personal laptop as well, with the same result.
This is the full error traceback for reference:
Traceback (most recent call last):
File "/content/sparsegpt/./llama.py", line 323, in <module>
llama_sequential(model, dataloader, DEV)
File "/usr/local/lib/python3.11/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/content/sparsegpt/./llama.py", line 118, in llama_sequential
outs[j] = layer(inps[j].unsqueeze(0), attention_mask=attention_mask)[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_layers.py", line 48, in __call__
return super().__call__(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/modeling_llama.py", line 308, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/models/llama/modeling_llama.py", line 246, in forward
cos, sin = position_embeddings
^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object
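For context on what is failing to unpack: in recent transformers versions, the rotary position embeddings are computed once at the model level and passed into every decoder layer as a `(cos, sin)` tuple via the `position_embeddings` argument. The sparsegpt `llama.py` script calls each layer directly with only `attention_mask`, so `position_embeddings` defaults to `None` and the unpack on line 246 fails. Below is a minimal, dependency-free sketch (not the library's actual rotary module; the function name and defaults are assumptions) of the kind of 2-tuple the layer expects:

```python
import math

def rotary_position_embeddings(seq_len, head_dim, base=10000.0):
    """Hypothetical stand-in for the rotary embedding module in transformers.

    Returns the (cos, sin) pair that a Llama attention layer unpacks.
    Each table has one row per position and one column per frequency,
    following the RoPE formulation (this is a sketch, not the exact API).
    """
    # Inverse frequencies, one per pair of head dimensions.
    inv_freq = [base ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]
    cos = [[math.cos(pos * f) for f in inv_freq] for pos in range(seq_len)]
    sin = [[math.sin(pos * f) for f in inv_freq] for pos in range(seq_len)]
    return cos, sin

# The layer does `cos, sin = position_embeddings`; this succeeds only
# when the argument is a real 2-tuple, not the default None.
position_embeddings = rotary_position_embeddings(seq_len=4, head_dim=8)
cos, sin = position_embeddings
```

If this is the cause, the fix would presumably be for `llama.py` to compute these embeddings once and forward them in each `layer(...)` call, matching the newer layer signature.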
jsvir