
Conversation

alanwaketan (Collaborator)

Summary:
This pull request fixes a bug in #17, which forgot to guard the 2D sharding of activations and inputs.

Test Plan:
N/A.
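
For context, "guarding" here means that the 2D-sharding annotations on activations and inputs should only be applied when 2D sharding is actually enabled. A minimal sketch of what such a guard might look like, assuming the PyTorch/XLA SPMD API (`torch_xla.experimental.xla_sharding`) and a hypothetical `use_2d_sharding` flag and `maybe_shard_activation` helper, neither of which is taken from the actual patch:

```python
# A minimal sketch, not the actual patch: guard the 2D-sharding
# annotations behind a flag so 1D runs skip them entirely.
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.experimental.xla_sharding as xs
from torch_xla.experimental.xla_sharding import Mesh

use_2d_sharding = True  # hypothetical config flag

# Build a (data, model) device mesh; the 1 x N split is illustrative.
num_devices = xm.xrt_world_size()
mesh = Mesh(np.arange(num_devices), (1, num_devices), ('data', 'model'))

def maybe_shard_activation(t: torch.Tensor) -> torch.Tensor:
    """Annotate a (batch, seq, hidden) activation only when 2D sharding is on."""
    if use_2d_sharding:
        # Shard batch over the 'data' axis and hidden over the 'model' axis;
        # otherwise the tensor is left unannotated (replicated).
        xs.mark_sharding(t, mesh, (0, None, 1))
    return t
```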

@alanwaketan alanwaketan requested review from JackCaoG and jonb377 August 2, 2023 02:56
@alanwaketan alanwaketan self-assigned this Aug 2, 2023
@alanwaketan (Collaborator, Author)

Thanks, Jon.

@alanwaketan alanwaketan merged commit 0a9c9e0 into llama2-google-next-training Aug 2, 2023
alanwaketan added a commit that referenced this pull request Oct 27, 2023
yeounoh pushed a commit that referenced this pull request Mar 19, 2024
vanbasten23 pushed a commit that referenced this pull request May 21, 2024