This repository was archived by the owner on Jun 4, 2025. It is now read-only.

Commit b90eb20

Add log_frequency arg (#33)

* Add: modifier_log_frequency user arg

1 parent 081465a commit b90eb20

File tree

1 file changed: +8 −0 lines changed

src/transformers/training_args.py

Lines changed: 8 additions & 0 deletions
```diff
@@ -747,6 +747,14 @@ class TrainingArguments:
         default="",
         metadata={"help": "Used by the SageMaker launcher to send mp-specific args. Ignored in Trainer"},
     )
+    modifier_log_frequency: float = field(
+        default=0.1,
+        metadata={
+            "help": (
+                "How often to log SparseML modifier data, in number of epochs or fraction of epochs"
+            )
+        },
+    )
 
     def __post_init__(self):
         # Handle --use_env option in torch.distributed.launch (local_rank not passed as an arg then).
```
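The new argument follows the standard `dataclasses.field` pattern that `TrainingArguments` uses throughout: a typed attribute with a `default` and a `metadata={"help": ...}` entry that argument parsers such as `HfArgumentParser` turn into CLI help text. A minimal, self-contained sketch of that pattern (using a hypothetical `ExampleArguments` stand-in rather than the full `TrainingArguments` class):

```python
from dataclasses import dataclass, field, fields


@dataclass
class ExampleArguments:
    # Stand-in for TrainingArguments; only the new field from the diff is shown.
    modifier_log_frequency: float = field(
        default=0.1,
        metadata={
            "help": (
                "How often to log SparseML modifier data, in number of "
                "epochs or fraction of epochs"
            )
        },
    )


args = ExampleArguments()
print(args.modifier_log_frequency)  # 0.1

# The metadata survives on the dataclass and can be read back,
# which is how help strings are surfaced to a CLI parser.
help_text = {f.name: f.metadata.get("help") for f in fields(ExampleArguments)}
print(help_text["modifier_log_frequency"])
```

With the default of 0.1, modifier data would be logged ten times per epoch; a value of 2.0 would log once every two epochs.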

0 commit comments
