System Info
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- transformers version: 4.52.0.dev0
- Platform: macOS-15.4.1-arm64-arm-64bit
- Python version: 3.11.11
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.2
- Accelerate version: 1.4.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.8.0.dev20250325 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
Who can help?
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
Qwen models (including Qwen2 and Qwen3) fail to export on the latest trunk. This appears to be a regression since the latest release.
I've verified that with transformers==4.51.3 these models export fine.
I've also verified that the same tests run and pass on other models, e.g. Llama and Gemma, so the failure is Qwen-specific.
How to reproduce?
RUN_SLOW=1 pytest tests/models/qwen2/test_modeling_qwen2.py -v -s -k test_export
The failure and stacktrace:
FAILED tests/models/qwen2/test_modeling_qwen2.py::Qwen2IntegrationTest::test_export_static_cache - torch._dynamo.exc.Unsupported: Unexpected type in sourceless builder builtins.method
from user code:
File "/Users/guangyang/transformers/src/transformers/integrations/executorch.py", line 312, in forward
outs = self.model(
File "/Users/guangyang/miniconda3/envs/executorch/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/Users/guangyang/transformers/src/transformers/utils/generic.py", line 969, in wrapper
output = func(self, *args, **kwargs)
File "/Users/guangyang/transformers/src/transformers/models/qwen2/modeling_qwen2.py", line 823, in forward
outputs: BaseModelOutputWithPast = self.model(
File "/Users/guangyang/miniconda3/envs/executorch/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/Users/guangyang/transformers/src/transformers/utils/generic.py", line 969, in wrapper
output = func(self, *args, **kwargs)
File "/Users/guangyang/transformers/src/transformers/models/qwen2/modeling_qwen2.py", line 531, in forward
causal_mask = self._update_causal_mask(
File "/Users/guangyang/transformers/src/transformers/models/qwen2/modeling_qwen2.py", line 640, in _update_causal_mask
causal_mask = self._prepare_4d_causal_attention_mask_with_cache_position(
File "/Users/guangyang/transformers/src/transformers/models/qwen2/modeling_qwen2.py", line 708, in _prepare_4d_causal_attention_mask_with_cache_position
if config.get_text_config().sliding_window is not None:
File "/Users/guangyang/transformers/src/transformers/configuration_utils.py", line 211, in __getattribute__
return super().__getattribute__(key)
Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"
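The Dynamo error names `builtins.method`: the trace appears to die when `_prepare_4d_causal_attention_mask_with_cache_position` calls `config.get_text_config()`, a bound method reached through `PretrainedConfig`'s custom `__getattribute__`. A torch-free sketch of the object shape involved (class and attribute names here are invented for illustration, not the real transformers code):

```python
# Minimal sketch (no torch needed) of the pattern Dynamo trips on:
# the config overrides __getattribute__, and get_text_config is a
# *bound method* object, the "builtins.method" type in the error.
class FakeConfig:
    sliding_window = None  # stands in for Qwen2Config.sliding_window

    def __getattribute__(self, key):
        # PretrainedConfig routes attribute access through a custom
        # __getattribute__ (configuration_utils.py:211 in the trace)
        return super().__getattribute__(key)

    def get_text_config(self):
        # for a text-only model this just returns the config itself
        return self


cfg = FakeConfig()
# Looking up the attribute yields a bound method object...
print(type(cfg.get_text_config).__name__)  # -> "method"
# ...and calling it works fine in eager mode:
print(cfg.get_text_config().sliding_window is None)  # -> True
```

In eager mode this is unremarkable; the failure only shows up when torch.export/Dynamo has to build a value for the bound method while tracing.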
Expected behavior
The test should pass, as it does on transformers==4.51.3.