Fix bug of _prepare_4d_attention_mask #27847
Conversation
ArthurZucker
left a comment
The failing CI is related to styling; `make style` should help.
| "Calling `transformers.models.llama.modeling_llama._prepare_4d_attention_mask` is deprecated and will be removed in v4.37. Use `transformers.modeling_attn_mask_utils.AttentionMaskConverter._prepare_4d_attention_mask" | ||
| ) | ||
| return AttentionMaskConverter._prepare_4d_attention_mask(mask=mask, dtype=dtype, tgt_len=tgt_len) | ||
| return _prepare_4d_attention_mask(mask=mask, dtype=dtype, tgt_len=tgt_len) |
This is deprecated anyway, but good catch.
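For readers following along, here is a minimal sketch of the shape of the fix. The import alias is an illustrative assumption to avoid shadowing the deprecated wrapper of the same name; the PR's actual diff may organize the import differently:

```python
import warnings

# Module-level helper in `transformers.modeling_attn_mask_utils`; the alias is
# illustrative, used here to avoid shadowing the deprecated wrapper below.
from transformers.modeling_attn_mask_utils import (
    _prepare_4d_attention_mask as _prepare_4d_attention_mask_impl,
)


def _prepare_4d_attention_mask(mask, dtype, tgt_len=None):
    # Deprecated wrapper kept in `modeling_llama` for backward compatibility.
    warnings.warn(
        "Calling `transformers.models.llama.modeling_llama._prepare_4d_attention_mask` "
        "is deprecated and will be removed in v4.37. Use "
        "`transformers.modeling_attn_mask_utils._prepare_4d_attention_mask`"
    )
    # The fix: call the module-level function rather than a (nonexistent)
    # method on `AttentionMaskConverter`.
    return _prepare_4d_attention_mask_impl(mask=mask, dtype=dtype, tgt_len=tgt_len)
```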
younesbelkada
left a comment
Nice catch! I left one comment. For the failing CI, can you try merging with main and make sure you have `ruff==0.1.5` installed when running `make fixup`?
Hi @ArthurZucker @younesbelkada. All CIs are green; could you please review and merge it? Thanks!
younesbelkada
left a comment
Thanks!
Hi @ArthurZucker @younesbelkada. Since `_prepare_4d_attention_mask` is no longer a member function of `AttentionMaskConverter`, I directly import `_prepare_4d_attention_mask` from `modeling_attn_mask_utils`. Would you please help review it? Thanks! BTW, the failing CIs are not related to my changes.
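For context, a quick usage sketch of the module-level helper the PR switches to, assuming a transformers version where `_prepare_4d_attention_mask` lives in `modeling_attn_mask_utils` (v4.35 or later):

```python
import torch
from transformers.modeling_attn_mask_utils import _prepare_4d_attention_mask

# 2D padding mask: 1 = attend, 0 = padded (batch of 1, sequence length 4).
mask = torch.tensor([[1, 1, 1, 0]])

# Expands to a 4D additive mask of shape [batch, 1, tgt_len, src_len];
# padded positions are set to the minimum value of `dtype`.
mask_4d = _prepare_4d_attention_mask(mask, dtype=torch.float32, tgt_len=4)
print(mask_4d.shape)  # torch.Size([1, 1, 4, 4])
```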