
Conversation

@ydshieh (Collaborator) commented on Sep 28, 2023

What does this PR do?

The test helper random_attention_mask makes sure the last token of each mask is non-zero, so every row attends to at least one token. However, this guarantee no longer holds once a causal mask is applied on top: under causal masking, a query position can only attend to keys at or before it, so forcing the last position to 1 does not help earlier query positions.
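For context, a minimal self-contained sketch of what such a helper looks like (the real helper in the test utilities builds the 0/1 tensor differently, so treat the `randint` call as an assumption; the forced last position is the property described above):

```python
import torch

def random_attention_mask(shape):
    # Sketch of the test helper: random 0/1 entries, with the last
    # position forced to 1 so that, without a causal mask, every row
    # attends to at least one token.
    attn_mask = torch.randint(0, 2, shape)
    attn_mask[:, -1] = 1
    return attn_mask
```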

This causes some issues in CI; see the reported issue:

pytorch/pytorch#110213
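A minimal sketch of the failure mode (hypothetical shapes and values; the point is that a padding mask which only attends to the last token, which random_attention_mask can legitimately produce, combined with a causal mask leaves earlier query rows with nothing to attend to, and SDPA turns those rows into NaNs):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
q = torch.randn(1, 1, 4, 8)  # (batch, heads, seq_len, head_dim)
k = torch.randn(1, 1, 4, 8)
v = torch.randn(1, 1, 4, 8)

# Padding mask that only attends to the last token.
padding = torch.tensor([[0, 0, 0, 1]], dtype=torch.bool)  # (batch, seq_len)
causal = torch.tril(torch.ones(4, 4, dtype=torch.bool))   # (seq_len, seq_len)
mask = causal & padding[:, None, None, :]                 # (batch, 1, seq, seq)

out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
print(out[0, 0, 0])  # query position 0 has an all-False mask row -> NaNs
```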

In general, a sequence whose attention mask is all zeros is problematic. Let's avoid testing with such a case.

(However, we will probably still need some handling in the modeling code if torch decides this is undefined behavior and won't restore the previous behavior.)
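One possible mitigation on the modeling side (an illustration of the kind of handling meant above, not necessarily what this PR or transformers ends up doing): detect fully masked query rows and unmask them entirely so softmax stays finite; those rows belong to padded positions and their outputs are discarded downstream anyway.

```python
import torch

def unmask_fully_masked_rows(mask: torch.Tensor) -> torch.Tensor:
    # mask: boolean, shape (..., query_len, key_len); True = attend.
    # A row with no True entry would make softmax produce NaNs, so
    # unmask such rows completely instead.
    fully_masked = ~mask.any(dim=-1, keepdim=True)
    return mask | fully_masked
```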

@ydshieh ydshieh requested a review from LysandreJik September 28, 2023 13:44
@HuggingFaceDocBuilderDev commented on Sep 28, 2023

The documentation is not available anymore as the PR was closed or merged.

@LysandreJik (Member) left a comment

Thanks @ydshieh!

@ydshieh ydshieh merged commit 3911774 into main Sep 29, 2023
@ydshieh ydshieh deleted the debug_flacon branch September 29, 2023 09:06