🐛 Bug
The model I am using: OpenAIGPT
The language I am using the model on: English
The problem arises when using:
- the official example scripts: when I try to run `examples/single_model_scripts/run_openai_gpt.py` I get this error:
```
Traceback (most recent call last):
  File "/home/rohola/Codes/Python/pytorch-transformers/examples/single_model_scripts/run_openai_gpt.py", line 288, in <module>
    main()
  File "/home/rohola/Codes/Python/pytorch-transformers/examples/single_model_scripts/run_openai_gpt.py", line 158, in main
    model = OpenAIGPTDoubleHeadsModel.from_pretrained(args.model_name, num_special_tokens=len(special_tokens))
  File "/home/rohola/Codes/Python/pytorch-transformers/pytorch_transformers/modeling_utils.py", line 330, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'num_special_tokens'
```
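For context, `num_special_tokens` belonged to the older pytorch-pretrained-bert API; in pytorch-transformers, special tokens are instead added through the tokenizer (`tokenizer.add_tokens(...)`) followed by `model.resize_token_embeddings(len(tokenizer))`, which keeps the pretrained embedding rows and appends newly initialized rows for the added tokens. A minimal numpy sketch of that resize idea (the function name here is illustrative, not the library's internals):

```python
import numpy as np

def resize_embedding_rows(weight, new_num_rows, rng=None):
    """Return a larger embedding matrix: old rows copied, extra rows freshly initialized."""
    rng = rng or np.random.default_rng(0)
    old_rows, dim = weight.shape
    resized = rng.normal(scale=0.02, size=(new_num_rows, dim))
    resized[:old_rows] = weight  # pretrained rows are preserved verbatim
    return resized

vocab = np.arange(12, dtype=float).reshape(4, 3)  # 4 tokens, embedding dim 3
grown = resize_embedding_rows(vocab, 6)           # room for 2 special tokens
print(grown.shape)                       # (6, 3)
print(np.array_equal(grown[:4], vocab))  # True
```

This mirrors what `resize_token_embeddings` does conceptually, but the real method also rewires the model's tied weights.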
The task I am working on is:
- ROCStories
To Reproduce
Steps to reproduce the behavior:
- Just run `run_openai_gpt.py`
Environment
- OS: Ubuntu 16.04
- Python version: 3.6
- PyTorch version: 1.1.0
- PyTorch Transformers version (or branch): the latest commit
- Using GPU: True
- Distributed or parallel setup: No
- Any other relevant information:
Additional context
Even when I remove that argument, I get another error:
```
Traceback (most recent call last):
  File "/home/rohola/Codes/Python/pytorch-transformers/examples/single_model_scripts/run_openai_gpt.py", line 288, in <module>
    main()
  File "/home/rohola/Codes/Python/pytorch-transformers/examples/single_model_scripts/run_openai_gpt.py", line 224, in main
    losses = model(input_ids, mc_token_ids, lm_labels, mc_labels)
  File "/home/rohola/Codes/Python/pytorch-transformers/env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/rohola/Codes/Python/pytorch-transformers/pytorch_transformers/modeling_openai.py", line 601, in forward
    head_mask=head_mask)
  File "/home/rohola/Codes/Python/pytorch-transformers/env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/rohola/Codes/Python/pytorch-transformers/pytorch_transformers/modeling_openai.py", line 425, in forward
    hidden_states = inputs_embeds + position_embeds + token_type_embeds
RuntimeError: The size of tensor a (78) must match the size of tensor b (16) at non-singleton dimension 1
```
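The mismatch at `inputs_embeds + position_embeds + token_type_embeds` suggests a tensor of the wrong shape is being embedded as positions or token types, most likely because the old example passes `mc_token_ids`, `lm_labels`, and `mc_labels` positionally into a `forward` whose argument order changed in pytorch-transformers; calling the model with keyword arguments (`model(input_ids, mc_token_ids=..., lm_labels=..., mc_labels=...)`) would avoid the misbinding. The failure itself is just the elementwise-add shape check; a small numpy illustration using the shapes from the traceback (sequence length 78 vs. a length-16 dimension):

```python
import numpy as np

batch, seq_len, dim = 16, 78, 768
inputs_embeds = np.zeros((batch, seq_len, dim))
# If a (batch, 16)-shaped tensor is embedded as positions, the position
# embeddings come out with length 16 instead of seq_len:
position_embeds = np.zeros((batch, 16, dim))

try:
    _ = inputs_embeds + position_embeds
except ValueError as e:
    # numpy's analogue of the RuntimeError above: 78 vs 16 at dimension 1
    print("shape mismatch:", e)
```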