
Conversation

@wenxindongwork (Contributor) commented Oct 22, 2025

This pull request introduces a small but important update to the data parallel initialization logic in vllm/entrypoints/llm.py. The change ensures that the warning and error for unsupported single-process data parallel usage are not triggered when running on TPU platforms.

TPU DP support was added in vllm-project/tpu-inference#865.
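
For reference, the guard being relaxed is roughly of the following shape. This is a minimal sketch, not the actual diff: the function name, the `is_tpu` flag, and the exact warning/error text are illustrative assumptions, and the real code in `llm.py` consults the current platform directly rather than taking a flag.

```python
import logging

logger = logging.getLogger(__name__)


def check_single_process_dp(data_parallel_size: int, is_tpu: bool) -> None:
    """Sketch of the single-process data-parallel guard during LLM init.

    TPU platforms are exempted because data parallelism is handled by the
    TPU backend (see vllm-project/tpu-inference#865).
    """
    if data_parallel_size <= 1:
        return
    if is_tpu:
        # Skip the warning/error on TPU: the backend manages DP itself.
        return
    logger.warning(
        "Single-process data parallelism is not supported on this platform.")
    raise ValueError(
        "data_parallel_size > 1 requires the external data-parallel "
        "launcher on this platform.")
```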


@gemini-code-assist bot left a comment


Code Review

This pull request introduces a get_dynamic_token_budget method to the Scheduler class, providing a hook for subclasses to implement dynamic scheduling constraints. This is a good extension point. The new method is then used in the schedule method to constrain the number of tokens for both running and waiting requests.

My main feedback is to simplify the newly added logic. The three lines added in two places to apply the dynamic budget can be condensed into a single line. This will make the code more concise and avoid duplication, improving maintainability.
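
For illustration, the condensed form could look like the sketch below. The class skeleton and the `_cap_new_tokens` helper are hypothetical and only show the single-expression application of the budget; they are not the actual Scheduler code.

```python
from typing import Optional


class Scheduler:
    def get_dynamic_token_budget(self) -> Optional[int]:
        """Extension hook; subclasses may return a per-step token cap."""
        return None  # the base scheduler imposes no extra constraint

    def _cap_new_tokens(self, num_new_tokens: int) -> int:
        # Suggested condensation: apply the dynamic budget in one
        # expression instead of repeating a three-line block in two places.
        budget = self.get_dynamic_token_budget()
        return num_new_tokens if budget is None else min(num_new_tokens, budget)
```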

@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

@mergify bot added the frontend label Oct 27, 2025
@wenxindongwork changed the title from "Tpu dp attention" to "Support TPU Data Parallelism" Oct 31, 2025
Signed-off-by: wenxindongwork <[email protected]>
@yaochengji changed the title from "Support TPU Data Parallelism" to "[TPU] Support TPU Data Parallelism" Oct 31, 2025
@yaochengji changed the title from "[TPU] Support TPU Data Parallelism" to "[Core][TPU] Support TPU Data Parallelism" Oct 31, 2025
@yaochengji (Collaborator) left a comment


LGTM!

@yaochengji (Collaborator) commented:
@njhill could you please take a look at the small change?

@njhill added the ready label (ONLY add when PR is ready to merge/full CI is needed) Nov 1, 2025
@njhill enabled auto-merge (squash) November 1, 2025 15:18
@njhill merged commit af6e19f into vllm-project:main Nov 1, 2025
51 checks passed
zhaozuy pushed a commit to zhaozuy/vllm that referenced this pull request Nov 4, 2025
juliendenize pushed a commit to juliendenize/vllm that referenced this pull request Nov 6, 2025
ZhengHongming888 pushed a commit to ZhengHongming888/vllm that referenced this pull request Nov 8, 2025
rtourgeman pushed a commit to rtourgeman/vllm that referenced this pull request Nov 10, 2025

Labels

frontend, ready, v1

3 participants