Conversation

@pcmoritz (Collaborator)

Since #1804, the DeciLM model has been broken: the constructor of LlamaForCausalLM was changed, but the change was not reflected in the subclass DeciLMForCausalLM.

This was actually caught by test_models.py, but that test has not been running properly in CI (apparently due to low GPU memory), so the breakage was not detected earlier.
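
For context, the failure mode is the common one of a subclass constructor drifting out of sync with its parent. The sketch below is illustrative only, with hypothetical signatures (`config`, `lora_config`) rather than the actual vLLM code, and shows the shape of the fix: mirror the parent's updated signature and forward the new arguments explicitly.

```python
# A minimal sketch (hypothetical signatures, not the actual vLLM code) of the
# failure mode: a parent constructor gains a new parameter, and a subclass
# written against the old signature breaks until it is updated.

class LlamaForCausalLM:
    def __init__(self, config, lora_config=None):
        # After the (hypothetical) refactor, the parent accepts an extra
        # argument that older subclasses never forwarded.
        self.config = config
        self.lora_config = lora_config


class DeciLMForCausalLM(LlamaForCausalLM):
    def __init__(self, config, lora_config=None):
        # The fix: mirror the parent's new signature and forward every
        # argument explicitly instead of relying on the old one.
        super().__init__(config, lora_config=lora_config)


# Usage: constructing the subclass works again after the signatures match.
model = DeciLMForCausalLM(config={"hidden_size": 4096})
```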

@WoosukKwon WoosukKwon merged commit 4f2ad11 into vllm-project:main Feb 15, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Feb 20, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Feb 22, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Mar 4, 2024