[Bug] Typo in PyTorch frontend nonzero_numpy #16389

@taomiao

Description

Thanks for participating in the TVM community! We use https://discuss.tvm.ai for general usage questions and discussions. The issue tracker is used for actionable items such as feature proposal discussions, roadmaps, and bug tracking. You are always welcome to post on the forum first 😸

Issues that are inactive for a period of time may get closed. We adopt this policy so that we won't lose track of actionable issues that may fall to the bottom of the pile. Feel free to open a new one if you feel there is an additional problem that needs attention after an old one gets closed.

Expected behavior

The traced model converts from PyTorch to Relay without errors.

Actual behavior

test_nonzero_numpy.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../../../python/tvm/relay/frontend/pytorch.py:5418: in from_pytorch
    outputs = converter.convert_operators(operator_nodes, outputs, ret_name)
../../../../python/tvm/relay/frontend/pytorch.py:4528: in convert_operators
    unpacked = _unpack_tuple(inputs[0])
../../../../python/tvm/relay/frontend/pytorch.py:5137: in _unpack_tuple
    elif isinstance(tup.type_annotation, TupleType):
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = CallNode(Op(argwhere), [Var(input1, ty=TensorType([2, 10], bool))], (nullptr), [])
name = 'type_annotation'

    def __getattr__(self, name):
        # specially check handle since
        # this is required for PackedFunc calls
        if name == "handle":
            raise AttributeError("handle is not set")
    
        try:
            return _ffi_node_api.NodeGetAttr(self, name)
        except AttributeError:
>           raise AttributeError(f"{type(self)} has no attribute {name}") from None
E           AttributeError: <class 'tvm.relay.expr.Call'> has no attribute type_annotation
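
For context, the PyTorch-side semantics the frontend has to mirror: torch.nonzero with as_tuple=True returns one 1-D index tensor per input dimension (a Python tuple), while as_tuple=False returns a single (N, ndim) coordinate matrix, which is what Relay's argwhere produces. The traceback shows _unpack_tuple receiving the bare argwhere CallNode instead of a tuple-like value. The snippet below only illustrates that difference on the PyTorch side; it does not involve TVM.

# Illustration of torch.nonzero semantics (PyTorch only, no TVM involved).
import torch

mask = torch.tensor([[True, False], [False, True]])

# as_tuple=True: one 1-D index tensor per dimension -> the converter is
# expected to yield a tuple-like Relay value for this variant.
rows, cols = torch.nonzero(mask, as_tuple=True)
print(rows, cols)        # tensor([0, 1]) tensor([0, 1])

# as_tuple=False: a single (N, ndim) coordinate matrix, matching argwhere.
coords = torch.nonzero(mask, as_tuple=False)
print(coords)            # tensor([[0, 0], [1, 1]])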

Environment

OS: Ubuntu
Python: 3.9
PyTorch: 2.0
TVM: main branch

Steps to reproduce

import torch
from torch import nn

import tvm
from tvm import relay  # imported explicitly; `import tvm` alone may not expose tvm.relay


class NonZeroModule(nn.Module):
    """Module that performs nonzero"""

    def __init__(self):
        super().__init__()

    def forward(self, x, mask):
        mask_index = torch.nonzero(mask, as_tuple=True)
        x[mask_index] = torch.ones_like(x[mask_index])
        return x

def test_pytorch_nonzero():
    model = NonZeroModule()
    x = torch.zeros((2, 10), dtype=torch.float32)
    mask = torch.randint(0, 2, (2, 10)).bool()
    with torch.no_grad():
        traced_torch_model = torch.jit.trace(model, (x, mask))
    import_input = [("input0", (2, 10)), ("input1", (2, 10))]
    relay_model_ir, relay_model_params = tvm.relay.frontend.from_pytorch(
        traced_torch_model, import_input
    )
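
Outside of pytest, the reproducer can be run directly with a minimal driver (the file name test_nonzero_numpy.py from the traceback is only illustrative; any name works):

if __name__ == "__main__":
    # Raises the AttributeError shown above on current main; should
    # complete silently once the frontend is fixed.
    test_pytorch_nonzero()
    print("conversion succeeded")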

Triage

Please refer to the list of label tags here to find the relevant tags and add them below in a bullet format (example below).

  • needs-triage
  • frontend:pytorch
