
hidden_dropout_prob and attention_probs_dropout_prob values in the documentation don't match those in the code #25639

@SwapnanilMukherjee

Description


System Info

  • transformers version: 4.30.2
  • Platform: Linux-5.15.0-79-generic-x86_64-with-glibc2.29
  • Python version: 3.8.10
  • Huggingface_hub version: 0.16.2
  • Safetensors version: 0.3.1
  • PyTorch version (GPU?): 2.0.1+cu117 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: RTX A5000
  • Using distributed or parallel set-up in script?: No

Who can help?

@sgugger

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

I am training ViLT on e-SNLI-VE. This is essentially what I am doing:

from transformers import ViltConfig, ViltProcessor, ViltForQuestionAnswering

# Three-way label mapping for e-SNLI-VE
label2id = {'contradiction': 0, 'entailment': 1, 'neutral': 2}
id2label = {0: 'contradiction', 1: 'entailment', 2: 'neutral'}

vilt_config = ViltConfig(label2id=label2id, id2label=id2label, max_position_embeddings=100)
print(vilt_config)

This gives the following output:

[Screenshot: printed ViltConfig output]

The values for both variables should be 0.1 as indicated by the documentation:

[Screenshot: ViLT documentation listing 0.1 as the default for both parameters]
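As a workaround, the documented 0.1 dropout can be requested explicitly, since ViltConfig accepts both probabilities as keyword arguments (a minimal sketch, continuing from the snippet above):

from transformers import ViltConfig

# Pass the dropout probabilities explicitly rather than relying on the
# (inconsistently documented) defaults.
vilt_config = ViltConfig(
    label2id=label2id,
    id2label=id2label,
    max_position_embeddings=100,
    hidden_dropout_prob=0.1,
    attention_probs_dropout_prob=0.1,
)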

Expected behavior

The values of both variables should have been zero in the documentation as well, matching the defaults actually set in the code.
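For reference, here is a minimal check of the mismatch (the 0.0 values are what the code produces on my setup; 0.1 is what the documentation claims):

from transformers import ViltConfig

config = ViltConfig()  # all defaults, no overrides
print(config.hidden_dropout_prob)           # prints 0.0 (docs say 0.1)
print(config.attention_probs_dropout_prob)  # prints 0.0 (docs say 0.1)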
