Commit dfda2df

zhc7 authored and pytorchmergebot committed
very small typo in fsdp2 comment (pytorch#163155)
Pull Request resolved: pytorch#163155
Approved by: https://github.com/awgu, https://github.com/Skylion007
1 parent 876824f commit dfda2df

File tree

1 file changed: +1 −1


torch/distributed/fsdp/_fully_shard/_fsdp_state.py

Lines changed: 1 addition & 1 deletion
@@ -230,7 +230,7 @@ def _pre_forward(
         self, module: nn.Module, args: tuple[Any, ...], kwargs: dict[str, Any]
     ) -> tuple[tuple[Any, ...], dict[str, Any]]:
         # When composing with module-hook-based activation checkpointing, the
-        # the pre-backward hook is responsible for the unshard
+        # pre-backward hook is responsible for the unshard
         if self._training_state == TrainingState.PRE_BACKWARD:
             return args, kwargs
         self._training_state = TrainingState.FORWARD
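
For context, the guard above fires when an FSDP2-managed module's forward is re-run during backward by activation checkpointing. The sketch below is not part of this commit; it shows one way to set up that composition. It assumes a recent PyTorch where `fully_shard` is importable from `torch.distributed.fsdp`, a single-GPU NCCL process group launched via `torchrun`, and that the private composable `torch.distributed._composable.checkpoint` API is what the comment means by "module-hook-based activation checkpointing".

```python
# Sketch only: launch with `torchrun --nproc_per_node=1 this_script.py` on a CUDA box.
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed._composable import checkpoint  # hook-based AC (private API)
from torch.distributed.fsdp import fully_shard

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

# A toy block standing in for e.g. a transformer layer (sizes are arbitrary).
block = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 256)).cuda()
checkpoint(block)   # AC via module hooks: block's forward is re-run during backward
fully_shard(block)  # FSDP2 on the same module: shard/unshard its parameters

# During backward, recomputation re-invokes block's forward while FSDP2 is in
# TrainingState.PRE_BACKWARD; per the fixed comment, the pre-backward hook (not
# _pre_forward) performs the unshard, so _pre_forward returns early.
loss = block(torch.randn(8, 256, device="cuda")).sum()
loss.backward()
dist.destroy_process_group()
```

Applying `checkpoint` before `fully_shard` follows the common layer-wise pattern (AC per block, then shard per block); treat the ordering here as illustrative rather than prescriptive.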
