Description
Discussed in #4745
Originally posted by tanliboy May 10, 2024
Hi vLLM team,
We have been using vLLM to serve models, and it has gone really well. We have been using the OpenAI-compatible API along with our own customized "role" values for different entities. However, when we recently upgraded to v0.4.2, we found that customized roles are no longer supported: the role is now limited to "system", "user", and "assistant".
I understand that this tightly aligns with OpenAI's chat completion role definitions; however, it limits role customization alongside fine-tuning. Moreover, we also see a trend (including the recent Llama 3 chat template) toward supporting different roles for multi-agent interactions.
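To illustrate the use case, here is a minimal sketch of the kind of request that worked before the upgrade. The model name and the custom "critic" role are assumptions for illustration; the point is that the messages list carries a role outside the system/user/assistant set, which v0.4.2's request validation now rejects.

```python
import json

# Hypothetical chat-completions payload with a custom "critic" role,
# as accepted by vLLM's OpenAI-compatible server before v0.4.2.
# Model name and role names beyond system/user/assistant are
# illustrative assumptions, not part of the original report.
payload = {
    "model": "meta-llama/Meta-Llama-3-8B-Instruct",
    "messages": [
        {"role": "system", "content": "You are a planning agent."},
        {"role": "user", "content": "Draft a deployment plan."},
        # Custom role for a second entity in a multi-agent exchange;
        # on v0.4.2 the server rejects this message during validation.
        {"role": "critic", "content": "The plan is missing a rollback step."},
    ],
}

print(json.dumps(payload, indent=2))
```

Earlier versions passed such roles through to the chat template unchanged, which is what made fine-tuned multi-entity setups possible.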
Could you bring back the previous support for customized roles in the OpenAI chat completion API?
Thanks,
Li