chore [BREAKING CHANGE]: Flatten PyTorchConfig knobs into TorchLlmArgs #4603
Conversation
Force-pushed 678236c to cc4863c
Force-pushed c7573da to 1ec59ec
Force-pushed 19f20a3 to c3d0f10
/bot run --add-multi-gpu-test --disable-fail-fast
PR_Github #6285 [ run ] triggered by Bot
PR_Github #6285 [ run ] completed with state
Force-pushed c3d0f10 to 1115607
/bot run --add-multi-gpu-test --disable-fail-fast
PR_Github #6379 [ run ] triggered by Bot
PR_Github #6379 [ run ] completed with state
/bot run --add-multi-gpu-test --disable-fail-fast
PR_Github #6418 [ run ] triggered by Bot
Force-pushed 7755876 to a9f24cb
/bot run --add-multi-gpu-test --disable-fail-fast
PR_Github #6425 [ run ] triggered by Bot
PR_Github #6418 [ run ] completed with state
Force-pushed a9f24cb to 6393176
PR_Github #6582 [ run ] triggered by Bot
PR_Github #6582 [ run ] completed with state
Force-pushed 2624ad4 to 869e6c8
/bot run --disable-fail-fast
PR_Github #6630 [ run ] triggered by Bot
PR_Github #6630 [ run ] completed with state
Signed-off-by: Superjomn <[email protected]>
update usages Signed-off-by: Superjomn <[email protected]>
Force-pushed 869e6c8 to bee40fa
/bot reuse-pipeline
PR_Github #6763 [ reuse-pipeline ] triggered by Bot
PR_Github #6763 [ reuse-pipeline ] completed with state
YOUR_DATA_PATH=<your dataset file following the format>

cat > ./extra-llm-api-config.yml <<EOF
pytorch_backend_config:
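For comparison, a sketch of what the same snippet could look like after this PR, with the nested section flattened to top-level keys. This continues the heredoc style of the snippet above; the knob names (`use_cuda_graph`, `enable_overlap_scheduler`) are illustrative examples, not an exhaustive or authoritative list.

```shell
# Hypothetical post-flattening form: the knobs that previously lived
# under pytorch_backend_config move to the top level of the YAML.
cat > ./extra-llm-api-config.yml <<EOF
use_cuda_graph: true
enable_overlap_scheduler: true
EOF
cat ./extra-llm-api-config.yml
```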
I think we should emit a "deprecated" warning when the old config type is detected. Currently old configs are still accepted, but all fields under pytorch_backend_config are silently ignored, so none of those settings take effect. This confuses users (like myself).
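A deprecation shim along the lines of this request might look like the following. This is a minimal sketch, assuming the parsed YAML arrives as a plain dict; the helper name, warning text, and conflict policy are hypothetical, not the actual TensorRT-LLM implementation.

```python
import warnings


def flatten_legacy_config(config: dict) -> dict:
    """Hypothetical helper: lift knobs out of the deprecated
    pytorch_backend_config section and warn the user, instead of
    silently dropping them."""
    config = dict(config)  # do not mutate the caller's dict
    legacy = config.pop("pytorch_backend_config", None)
    if legacy:
        warnings.warn(
            "pytorch_backend_config is deprecated; its knobs are now "
            "top-level TorchLlmArgs fields. Migrating them automatically.",
            DeprecationWarning,
            stacklevel=2,
        )
        for key, value in legacy.items():
            # If a knob is set both places, the top-level value wins.
            config.setdefault(key, value)
    return config
```

The key point is that the legacy section is migrated and loudly flagged, rather than accepted and ignored.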
I think this is a very fair request. @Superjomn
chore [BREAKING CHANGE]: Flatten PyTorchConfig knobs into TorchLlmArgs (NVIDIA#4603)
Signed-off-by: Superjomn <[email protected]>
Signed-off-by: darraghdog <[email protected]>
Description
Remove the nested pytorch_backend_config section and flatten all PyTorchConfig knobs into TorchLlmArgs.
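The shape of the change can be illustrated schematically with plain dataclasses. These are not the real PyTorchConfig/TorchLlmArgs definitions, and the field names are illustrative; the point is that knobs previously nested under a sub-config become top-level fields, so YAML/CLI overrides map one-to-one onto the args object.

```python
from dataclasses import dataclass, fields


@dataclass
class TorchLlmArgs:
    # Schematic stand-in: after the flattening, backend knobs live
    # directly on the args object alongside the other LLM arguments.
    max_batch_size: int = 8
    use_cuda_graph: bool = False
    enable_overlap_scheduler: bool = False


def migrate(old_args: dict) -> TorchLlmArgs:
    """Flatten a legacy {'pytorch_backend_config': {...}} dict into
    the schematic flat args object (top-level keys win on conflict)."""
    old_args = dict(old_args)
    flat = {**old_args.pop("pytorch_backend_config", {}), **old_args}
    valid = {f.name for f in fields(TorchLlmArgs)}
    return TorchLlmArgs(**{k: v for k, v in flat.items() if k in valid})
```

With a flat structure, every knob has exactly one spelling, which avoids the silent-ignore problem raised in the review thread.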