Closed
Description
System Info
- transformers version: main
- Platform: macOS 13.1 (x86_64)
- Python version: 3.7.15
- PyTorch version (GPU?): 1.13.0 (False)
- Tensorflow version (GPU?): N/A
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
Who can help?
Documentation: @sgugger, @stevhliu and @MKhalusova
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
import transformers
bert_model = transformers.BertForSequenceClassification.from_pretrained("...")
reveal_type(bert_model)
# expected: BertForSequenceClassification
# actual: tuple[Unknown | BertForSequenceClassification, dict[str, Unbound | Unknown] | dict[str, Unknown | list[Unknown]] | Unknown] | Unknown | BertForSequenceClassification
Expected behavior
The hinted type of the return from from_pretrained should be the same type as the class.
I know we could just annotate from_pretrained to return typing.Self, but IMO this points to a lack of typing in a core part of transformers. I think it would make this core code more robust if the static type checkers agreed that this is the correct return type (since there may be type errors lurking).
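For illustration, here is a minimal sketch of the classmethod-TypeVar pattern that would give from_pretrained the expected narrow return type on older Pythons (typing.Self only landed in 3.11). The class names here are hypothetical stand-ins, not the real transformers classes:

```python
from typing import Type, TypeVar

T = TypeVar("T", bound="PretrainedSketch")


class PretrainedSketch:
    """Hypothetical stand-in for PreTrainedModel, to show the annotation."""

    @classmethod
    def from_pretrained(cls: Type[T], name_or_path: str) -> T:
        # Annotating cls as Type[T] and returning T means a checker infers
        # SketchClassifier.from_pretrained(...) -> SketchClassifier,
        # not the base class or a union.
        return cls()


class SketchClassifier(PretrainedSketch):
    pass


model = SketchClassifier.from_pretrained("...")
# reveal_type(model) under pyright/mypy: SketchClassifier
```

This only fixes the signature as seen by checkers; as noted above, the untyped internals could still hide real type errors that an annotation alone would paper over.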