[DOCS] Add example for HammingDiversityLogitsProcessor #25481
Conversation
…tyLogitsProcessor
…tyLogitsProcessor' into dev-documentation-HammingDiversityLogitsProcessor
@gante -- hey! Currently the checks are failing because of some changes in the docs, but I'm not sure if I can/should fix those -- using black, the files I've changed seem to be formatted correctly. Is there something I need to change in the docs to get this merged?
Thank you for opening the PR! The descriptions are very thorough, I wish I had them when I first read about group beam search 🔥
I have two requests, before we merge the PR:
- You should be able to fix the CI errors if you run `make fixup` in your terminal, commit the changes, and then push them :)
- Although the example is good, it would be even better if it could be done entirely with `generate` calls, as opposed to with some `group_beam_search` calls! Most users stay at a `generate` level, so an example using `generate` is much more powerful 💪
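For reference, here is a minimal sketch of the kind of `generate`-only comparison being requested; the checkpoint name, input text, and generation arguments are illustrative assumptions, not the exact code merged in this PR.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed summarization checkpoint, for illustration only.
checkpoint = "facebook/bart-large-cnn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = (
    "The Solar System formed 4.6 billion years ago from the gravitational "
    "collapse of a giant interstellar molecular cloud."
)
inputs = tokenizer(text, return_tensors="pt")

# Plain beam search: the returned sequences tend to be near-identical.
plain = model.generate(**inputs, num_beams=4, num_return_sequences=2, max_new_tokens=40)
print(tokenizer.batch_decode(plain, skip_special_tokens=True))

# Diverse (group) beam search: a non-zero `diversity_penalty` activates the
# HammingDiversityLogitsProcessor internally and pushes the beam groups apart.
diverse = model.generate(
    **inputs,
    num_beams=4,
    num_beam_groups=2,
    diversity_penalty=1.0,
    num_return_sequences=2,
    max_new_tokens=40,
)
print(tokenizer.batch_decode(diverse, skip_special_tokens=True))
```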
@gante Fixes made! :)
(the doc preview should work after correcting the syntax in the warning block)
Co-authored-by: Joao Gante <[email protected]>
Co-authored-by: Joao Gante <[email protected]>
This reverts commit 4764308
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
Awesome 🔥
There are two rendering nits that should be fixed, but other than that I'm happy with the PR. Thank you for documenting this complex logits processor! 💛
Example: the below example shows a comparison before and after applying Hamming Diversity.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
```
Final nit 1: let's remove the indentation in the example, it shows up awkwardly in the rendered docs (see the preview here)
You also need to add `>>>` before the code, as is done for every other logits processor.
Examples:
```python
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
```
oh, yeah, and the missing `>>>` / `...`, good catch!
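For readers unfamiliar with the convention being discussed here: docstring examples in the library are written as doctests, with `>>>` prefixing each statement and `...` prefixing continuation lines, roughly like this (the checkpoint name is just an illustrative assumption):

```python
>>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

>>> tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
>>> model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")
>>> inputs = tokenizer(
...     "The Solar System formed 4.6 billion years ago.", return_tensors="pt"
... )
```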
```python
# summary 1: the solar system formed 4.6 billion years ago from the collapse of a giant interstellar molecular cloud.
# summary 2: the solar system formed 4.6 billion years ago from the collapse of a giant interstellar molecular cloud.
```
For more details, see [Diverse Beam Search: Decoding Diverse Solutions from Neural Sequence |
Final nit 2: let's move this above the example and decrease the indentation -- it is not rendering correctly (see the preview here)
Argh, sorry. I just auto-accepted the formatting suggestions from the CLI. I'll revert the weird/unnecessary changes and fix the formatting for the logits comment.
Thanks for working on this!
This reverts commit bfb1536.
This reverts commit 4764308
This reverts commit 4764308
This reverts commit ad6ceb6
This reverts commit ad6ceb6.
Co-authored-by: Arthur <[email protected]>
…tyLogitsProcessor' into dev-documentation-HammingDiversityLogitsProcessor
  # Conflicts:
  #   src/transformers/generation/logits_process.py
This reverts commit 4764308
@jesspeck thank you for iterating and thank you for the contribution 🤗
…5481)

* updated logits processor text
* Update logits_process.py
* fixed formatting with black
* fixed formatting with black
* fixed formatting with Make Fixup
* more formatting fixes
* Update src/transformers/generation/logits_process.py
  Co-authored-by: Joao Gante <[email protected]>
* Update src/transformers/generation/logits_process.py
  Co-authored-by: Joao Gante <[email protected]>
* Revert "fixed formatting with Make Fixup"
  This reverts commit 4764308
* Revert "fixed formatting with black"
  This reverts commit bfb1536.
* Revert "fixed formatting with Make Fixup"
  This reverts commit 4764308
* Revert "fixed formatting with Make Fixup"
  This reverts commit 4764308
* Revert "fixed formatting with black"
  This reverts commit ad6ceb6
* Revert "fixed formatting with black"
  This reverts commit ad6ceb6.
* Update src/transformers/generation/logits_process.py
  Co-authored-by: Arthur <[email protected]>
* Revert "fixed formatting with Make Fixup"
  This reverts commit 4764308
* formatted logits_process with make fixup

Co-authored-by: jesspeck <[email protected]>
Co-authored-by: Joao Gante <[email protected]>
Co-authored-by: Arthur <[email protected]>
What does this PR do?
Adds an example to the docstring of the `HammingDiversityLogitsProcessor` class in `src/transformers/generation/logits_process.py`.
Part of the docs work on Generate: have an example on each logits processor class docstring
Changes -
Happy to edit it all down a bit if it's too much stuff!
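To make the mechanics concrete, here is a toy sketch of what the documented processor does during group beam search. It assumes the processor's group-beam-search call signature `(input_ids, scores, current_tokens, beam_group_idx)` and uses made-up tensor values; it is not the docstring example added by this PR.

```python
import torch
from transformers import HammingDiversityLogitsProcessor

# Toy setup: batch size 1, 4 beams split into 2 groups, vocabulary of 10 tokens.
num_beams, num_beam_groups, vocab_size = 4, 2, 10
processor = HammingDiversityLogitsProcessor(
    diversity_penalty=1.0, num_beams=num_beams, num_beam_groups=num_beam_groups
)

group_size = num_beams // num_beam_groups
input_ids = torch.zeros((group_size, 5), dtype=torch.long)  # prefixes of the beams in the current group
scores = torch.zeros((group_size, vocab_size))              # next-token scores for the current group
current_tokens = torch.tensor([3, 7, 0, 0])                 # tokens picked so far at this step, across all beams
beam_group_idx = 1                                          # we are scoring the second group

# Tokens 3 and 7 were chosen by the first group at this step, so the processor
# subtracts `diversity_penalty` from their scores for the second group.
penalized = processor(input_ids, scores, current_tokens, beam_group_idx)
print(penalized[0, 3].item(), penalized[0, 7].item())  # both reduced relative to 0.0
```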
Who can review?
@gante