@@ -472,19 +472,34 @@ Distributed and 16-bit precision
 Below are the possible configurations we support.
 
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
-| 1 GPU | 1+ GPUs | DP  | DDP | 16-bit | command                                                               |
+| 1 GPU | 1+ GPUs | DDP | DP  | 16-bit | command                                                               |
 +=======+=========+=====+=====+========+=======================================================================+
 | Y     |         |     |     |        | `Trainer(accelerator="gpu", devices=1)`                               |
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
 | Y     |         |     |     | Y      | `Trainer(accelerator="gpu", devices=1, precision=16)`                 |
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
-|       | Y       | Y   |     |        | `Trainer(accelerator="gpu", devices=k, strategy='dp')`                |
+|       | Y       | Y   |     |        | `Trainer(accelerator="gpu", devices=k, strategy='ddp')`               |
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
-|       | Y       |     | Y   |        | `Trainer(accelerator="gpu", devices=k, strategy='ddp')`               |
+|       | Y       | Y   |     | Y      | `Trainer(accelerator="gpu", devices=k, strategy='ddp', precision=16)` |
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
-|       | Y       |     | Y   | Y      | `Trainer(accelerator="gpu", devices=k, strategy='ddp', precision=16)` |
+|       | Y       |     | Y   |        | `Trainer(accelerator="gpu", devices=k, strategy='dp')`                |
 +-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
-# FIXME(carlos): check native amp and DP
+|       | Y       |     | Y   | Y      | `Trainer(accelerator="gpu", devices=k, strategy='dp', precision=16)`  |
++-------+---------+-----+-----+--------+-----------------------------------------------------------------------+
+
+DDP and DP can also be used with a single GPU, but there is no reason to do so other than to debug distributed-related issues.
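+
+For example, here's a minimal sketch combining two rows of the table above (assuming `pytorch_lightning` is installed and at least 2 GPUs are visible):
+
+.. code-block:: python
+
+    from pytorch_lightning import Trainer
+
+    # 1 GPU with 16-bit precision
+    trainer = Trainer(accelerator="gpu", devices=1, precision=16)
+
+    # k GPUs (here k=2) with DDP and 16-bit precision
+    trainer = Trainer(accelerator="gpu", devices=2, strategy="ddp", precision=16)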
 
 Implement Your Own Distributed (DDP) training
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^