Conversation

@tuukkjs (Collaborator) commented Aug 28, 2025

Remove `prompt_token_ids` and use `prompts` instead.

`prompt_token_ids` was removed from the `LLM.generate` args in vllm-project/vllm#18800.

Tested by running fmwork infer with Llama-3.1-8B-Instruct on rocm/vllm-dev:nightly_0610_rc2_0610_rc2_20250605, rocm/vllm-dev:nightly_main_20250818, and rocm/vllm-dev:nightly_main_20250826. After this change, all three work.
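
For context, here is a minimal sketch of the call-site change, assuming vLLM's `TokensPrompt` input type from `vllm.inputs`; the model name and token IDs below are placeholders for illustration, not the actual fmwork code:

```python
from vllm import LLM, SamplingParams
from vllm.inputs import TokensPrompt

llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")
sampling_params = SamplingParams(max_tokens=64)

# Placeholder pre-tokenized inputs; fmwork builds these elsewhere.
token_id_lists = [[1, 15043, 3186], [1, 22172, 29991]]

# Before (keyword removed in vllm-project/vllm#18800):
# outputs = llm.generate(prompt_token_ids=token_id_lists,
#                        sampling_params=sampling_params)

# After: wrap each token-ID list in a TokensPrompt and pass it via `prompts`.
prompts = [TokensPrompt(prompt_token_ids=ids) for ids in token_id_lists]
outputs = llm.generate(prompts, sampling_params=sampling_params)

for out in outputs:
    print(out.outputs[0].text)
```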

@tuukkjs requested a review from lcskrishna on August 28, 2025 at 12:12.
@lcskrishna (Owner) left a comment


LGTM.

@lcskrishna merged commit 8d0019e into main on Aug 28, 2025.
@tuukkjs deleted the fix/prompt_token_ids branch on August 28, 2025 at 12:15.