Conversation

@callanwu
Contributor

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@stevhliu and @MKhalusova

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@MKhalusova
Contributor

Thank you for your contribution! Please make sure that the quality checks pass - here's the contributing guide. It looks like you need to run make style.

@callanwu
Contributor Author


@MKhalusova Hi, thanks for the reminder. All checks pass now!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@callanwu
Contributor Author

For reference, see the model's config:
https://huggingface.co/google/mt5-small/blob/main/config.json
Here is the module output of a loaded mt5-small:

T5Block(
  (layer): ModuleList(
    (0): T5LayerSelfAttention(
      (SelfAttention): T5Attention(
        (q): Linear(in_features=512, out_features=384, bias=False)
        (k): Linear(in_features=512, out_features=384, bias=False)
        (v): Linear(in_features=512, out_features=384, bias=False)
        (o): Linear(in_features=384, out_features=512, bias=False)
      )
      (layer_norm): T5LayerNorm()
      (dropout): Dropout(p=0.1, inplace=False)
    )
    (1): T5LayerFF(
      (DenseReluDense): T5DenseGatedGeluDense(
        (wi_0): Linear(in_features=512, out_features=1024, bias=False)
        (wi_1): Linear(in_features=512, out_features=1024, bias=False)
        (wo): Linear(in_features=1024, out_features=512, bias=False)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (layer_norm): T5LayerNorm()
      (dropout): Dropout(p=0.1, inplace=False)
    )
  )
)
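
The printout can be reproduced with a short snippet (a minimal sketch, assuming the transformers library is installed and the Hub is reachable). Note that in mt5-small's config, num_heads=6 and d_kv=64, so the attention projections map 512 to 6 * 64 = 384 rather than back to d_model:

from transformers import MT5ForConditionalGeneration

# Load mt5-small and print its first encoder block; the q/k/v
# projections go from d_model=512 to num_heads * d_kv = 6 * 64 = 384.
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")
print(model.encoder.block[0])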

@callanwu
Contributor Author

@MKhalusova @patrickvonplaten Can this be merged now, or are there any further improvements needed? :)

@ArthurZucker ArthurZucker merged commit 1ddc4fa into huggingface:main Nov 23, 2023
@ArthurZucker
Collaborator

Thanks @callanwu 🤗
