Description
Hi,
I've stumbled upon an issue with the Relax ONNX frontend while working on a model containing Softplus layers that are fed tensors with quite large values. A quick investigation showed that the current ONNX frontend uses a "naive" Softplus implementation, which is numerically unstable.
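For context, the naive formulation computes log(1 + exp(x)) directly, so exp(x) overflows the float32 range for moderately large x; a common numerically stable rewrite is max(x, 0) + log1p(exp(-|x|)). A minimal NumPy sketch of both formulations (illustrative code, not the frontend's actual implementation):

import numpy as np

def softplus_naive(x):
    # Direct log(1 + exp(x)): exp(x) overflows float32 for x above roughly 88.7
    return np.log(1.0 + np.exp(x))

def softplus_stable(x):
    # max(x, 0) + log1p(exp(-|x|)): the exponent is never positive, so it cannot overflow
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))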
Expected behavior
The Softplus operator should produce numerically stable output for large inputs (e.g. > 200.0 for float32).
Actual behavior
Currently, Softplus produces an inf value for large inputs.
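The same overflow can be reproduced outside TVM with NumPy, applying the naive formulation to a float32 input (just the math, not TVM code):

import numpy as np

x = np.float32(200.0)
print(np.exp(x))            # inf: float32 exp overflows above ~88.7 (RuntimeWarning)
print(np.log1p(np.exp(x)))  # inf, even though softplus(200) is approximately 200.0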
Environment
Ubuntu 24.04, TVM main branch, conda environment set up as per the "install from source" guide.
Steps to reproduce
A simple ONNX model with a single Softplus layer, which can be exported from this PyTorch model:
import torch
import torch.nn as nn

class SoftplusModel(nn.Module):
    def __init__(self):
        super(SoftplusModel, self).__init__()
        self.softplus = nn.Softplus()

    def forward(self, x):
        return self.softplus(x)

Importing the exported ONNX model produces a TVM model that is numerically unstable for large inputs.
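For completeness, a sketch of how the ONNX file can be produced and imported; the file name and input shape here are illustrative, and the import assumes the from_onnx entry point of the Relax ONNX frontend (tvm.relax.frontend.onnx) on TVM main:

import onnx
import torch
from tvm.relax.frontend.onnx import from_onnx

model = SoftplusModel().eval()
dummy = torch.randn(1, 16)  # illustrative input shape

# Export the single-Softplus model to ONNX (the file name is arbitrary)
torch.onnx.export(model, dummy, "softplus.onnx", input_names=["x"])

# Import with the Relax ONNX frontend; running the resulting module on
# inputs with values > 200 reproduces the inf outputs described above.
mod = from_onnx(onnx.load("softplus.onnx"))
print(mod)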
Triage
- needs-triage
- frontend:onnx