
Commit d497f8d (parent: 7f5154a)

Add a check for is_sagemaker_mp when setting _n_gpu again. Should be the last broken thing.

1 file changed: src/transformers/training_args.py (+3, -2)
@@ -1649,8 +1649,9 @@ def _setup_devices(self) -> "torch.device":
         if is_torch_tpu_available():
             device = self.distributed_state.device
             self._n_gpu = 0
-        elif is_sagemaker_dp_enabled():
-            self._n_gpu = 1
+        elif is_sagemaker_dp_enabled() or is_sagemaker_mp_enabled():
+            # Already set _n_gpu
+            pass
         elif self.distributed_state.distributed_type == DistributedType.NO:
             if self.use_mps_device:
                 if not torch.backends.mps.is_available():
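
For context, is_sagemaker_mp_enabled is the existing transformers helper that the widened elif relies on. Below is a minimal sketch of how such a detection check can work, assuming SageMaker's convention of passing the model-parallel configuration through the SM_HP_MP_PARAMETERS environment variable; the body is paraphrased for illustration and is not the exact library source.

    import importlib.util
    import json
    import os

    def is_sagemaker_mp_enabled() -> bool:
        # Sketch only: SageMaker serializes the model-parallel config as JSON
        # in SM_HP_MP_PARAMETERS; a "partitions" key signals that model
        # parallelism was requested for the training job.
        try:
            smp_options = json.loads(os.getenv("SM_HP_MP_PARAMETERS", "{}"))
        except json.JSONDecodeError:
            return False
        if "partitions" not in smp_options:
            return False
        # The smdistributed package (SageMaker's distributed-training
        # toolkit) must also be importable for MP to actually run.
        return importlib.util.find_spec("smdistributed") is not None

The fix itself is narrower: when either SageMaker branch is active, _n_gpu has already been set earlier in _setup_devices, so the branch now deliberately does nothing instead of forcing self._n_gpu = 1, which clobbered the GPU count under model parallelism.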
