
ROCm Docker build fails after GPTQ support #2137


Description

@WoosukKwon

Hey, I just noticed that the most recent push fails on ROCm:

vllm/setup.py, line 222 in b81a6a6: `"csrc/quantization/gptq/q_gemm.cu",`

Can you add this line in setup.py alongside the AWQ quantization source, like so:

vllm/setup.py, line 228 in b81a6a6: `vllm_extension_sources.append("csrc/quantization/awq/gemm_kernels.cu")`
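
For illustration, a minimal sketch of the kind of change I mean, assuming a `_is_hip()` helper based on `torch.version.hip` (the names here are placeholders for whatever setup.py already uses):

```python
import torch

def _is_hip() -> bool:
    # Assumed helper: True when building against a ROCm/HIP PyTorch.
    return getattr(torch.version, "hip", None) is not None

vllm_extension_sources = [
    # ... existing kernel sources ...
]

# Append the quantization kernels only for CUDA builds, since hipify
# currently trips over their headers on ROCm.
if not _is_hip():
    vllm_extension_sources.append("csrc/quantization/awq/gemm_kernels.cu")
    vllm_extension_sources.append("csrc/quantization/gptq/q_gemm.cu")
```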

If this feature is supposed to support ROCm, can you also check it, since hipify is not able to process `#include <hipblas.h>`?
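
As a rough illustration only, the kind of guard in the kernel source that would sidestep the hipify problem (`USE_ROCM` is an assumed macro, not necessarily what the build defines):

```cpp
// Sketch: choose the BLAS header per platform so hipify does not have to
// translate the include itself. USE_ROCM is an assumed build-time define.
#ifdef USE_ROCM
  #include <hipblas.h>
#else
  #include <cublas_v2.h>
#endif
```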

Thank you

Originally posted by @hex-plex in #916 (comment)
