[Bug] Relax softplus op numerically unstable when loading from ONNX model #18241

@MarcelDudek

Description

Hi,
I've stumbled upon an issue with the Relax ONNX frontend while working on a model containing softplus layers that are fed tensors with quite large values. After a quick investigation, it turns out that the current ONNX frontend uses a "naive" softplus implementation, which is numerically unstable.

Expected behavior

The softplus operator should produce numerically stable results for large inputs (e.g. > 200.0 for float32).

Actual behavior

Currently, softplus produces inf for large inputs.

Environment

Ubuntu 24.04, TVM main branch, conda environment set up per the "install from source" guide

Steps to reproduce

A minimal ONNX model with a single softplus layer, which can be exported from this PyTorch module:

import torch
import torch.nn as nn

class SoftplusModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.softplus = nn.Softplus()

    def forward(self, x):
        return self.softplus(x)

# Export to ONNX for loading with the TVM Relax frontend
torch.onnx.export(SoftplusModel(), torch.randn(1, 8), "softplus.onnx")

This produces a TVM model that is numerically unstable for large inputs.
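For reference, the instability comes from evaluating softplus directly as log(1 + exp(x)), where exp(x) overflows for large x. A common stable reformulation is max(x, 0) + log1p(exp(-|x|)), which never exponentiates a large positive value. A minimal NumPy sketch contrasting the two (function names are illustrative, not TVM's implementation):

```python
import numpy as np

def naive_softplus(x):
    # log(1 + exp(x)): exp overflows to inf for large x in float32
    return np.log1p(np.exp(x))

def stable_softplus(x):
    # max(x, 0) + log1p(exp(-|x|)): the exponent is always <= 0,
    # so exp never overflows; for large x this reduces to x itself
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

x = np.array([300.0], dtype=np.float32)
print(naive_softplus(x))   # [inf]
print(stable_softplus(x))  # [300.]
```

Both expressions are mathematically identical; only the stable form keeps the intermediate exp argument non-positive.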

Triage

  • needs-triage
  • frontend:onnx

Metadata

Assignees

No one assigned

    Labels

    needs-triage — PRs or issues that need to be investigated by maintainers to find the right assignees to address it
    type: bug
