
Commit e0e58f9

[Bug] Enforce contiguous input for dynamic_scaled_fp8_quant and static_scaled_fp8_quant (#21773)

Signed-off-by: yewentao256 <[email protected]>
Parent: b361f14

File tree: 1 file changed (+3, -2)


vllm/_custom_ops.py

Lines changed: 3 additions & 2 deletions
@@ -1282,10 +1282,11 @@ def scaled_fp8_quant(
                 output, input.contiguous(), scale, scale_ub)
         else:
             scale = torch.zeros(1, device=input.device, dtype=torch.float32)
-            torch.ops._C.dynamic_scaled_fp8_quant(output, input, scale)
+            torch.ops._C.dynamic_scaled_fp8_quant(output, input.contiguous(),
+                                                  scale)
     else:
         assert scale.numel() == 1, f"{scale.shape}"
-        torch.ops._C.static_scaled_fp8_quant(output, input, scale)
+        torch.ops._C.static_scaled_fp8_quant(output, input.contiguous(), scale)
 
     return output, scale

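The fix matters because the `_C` quantization kernels walk the input as row-major memory; passing a non-contiguous view (e.g. a transpose) would make them read the wrong elements. A minimal sketch of the contiguity semantics the patch relies on, assuming standard PyTorch behavior (the shapes below are illustrative, not taken from vLLM):

```python
import torch

# A transposed view shares storage with its base tensor but walks it with
# swapped strides, so its memory layout is no longer row-major.
x = torch.randn(4, 8)
t = x.t()                   # shape (8, 4), strides (1, 8): not contiguous
assert not t.is_contiguous()

# .contiguous() materializes a row-major copy. When the tensor is already
# contiguous it returns the tensor itself, so the common path pays no
# extra copy for this safety fix.
c = t.contiguous()
assert c.is_contiguous()
assert torch.equal(c, t)      # identical values, different memory layout
assert x.contiguous() is x    # already-contiguous input: no copy made
```

Calling `.contiguous()` at the call sites, as the patch does, is therefore cheap insurance: a no-op for the typical contiguous input, and a correctness fix for strided views.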