
Conversation

g-eoj commented Jun 13, 2024

Provides an OpenAI-compatible entrypoint for prompt adapters, using the same pattern as LoRA modules.

To run:

python -m vllm.entrypoints.openai.api_server --model <model name> --enable-prompt-adapter --prompt-adapters <adapter name>=<adapter path>
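Once the server is running, a request can select a prompt adapter by putting its registered name in the `model` field, the same way LoRA modules are addressed. A minimal sketch of building such a request body; the adapter name `my-adapter` and the prompt text are hypothetical placeholders, and the endpoint is the standard OpenAI-style `/v1/completions` route served by `api_server`:

```python
import json

# Hypothetical adapter name, i.e. what was passed as
# --prompt-adapters my-adapter=<adapter path> on the command line.
ADAPTER_NAME = "my-adapter"

# The OpenAI-compatible server routes the request to the prompt adapter
# whose registered name matches the "model" field.
payload = {
    "model": ADAPTER_NAME,
    "prompt": "Hello",
    "max_tokens": 16,
}

# Serialized body for POST /v1/completions against the running api_server.
body = json.dumps(payload)
print(body)
```

The request would then be sent with any HTTP client, e.g. `curl -d "$body" http://localhost:8000/v1/completions`.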

Missing:

  • tests
  • documentation

Joe G added 3 commits June 13, 2024 13:04
Assumes the interfaces for prompt adapters and LoRA modules remain completely separate.
g-eoj marked this pull request as ready for review June 13, 2024 21:58
g-eoj (Author) commented Jun 13, 2024

This will need some work if we want to support prompt adapter + LoRA module combinations: vllm-project#4645 (comment)

SwapnilDreams100 merged commit 9634b9d into SwapnilDreams100:main Jul 9, 2024