Skip warning if tracing with dynamo #25581
Conversation
Thanks for fixing this (I assume this might have broken a workflow somewhere). The compilation is failing; I think we need to check whether torch dynamo is available first using is_torchdynamo_available(). If it is, we can also `import torch._dynamo as dynamo` and then use dynamo.is_compiling() afterwards.
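A minimal sketch of the guarded check being suggested here, not the literal diff from this PR:

```python
# Sketch only: guard the dynamo import behind is_torchdynamo_available()
# so environments without torch._dynamo do not fail at import time.
from transformers.utils import is_torchdynamo_available

is_tracing = False
if is_torchdynamo_available():
    import torch._dynamo as dynamo

    # is_compiling() is True while dynamo is tracing/compiling this code.
    is_tracing = dynamo.is_compiling()
```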
cc @fxmarty |
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. |
Thanks for the quick review! I updated the diff to check whether torch dynamo is available first.
Force-pushed from 6064c4e to 28f8c1d
Force-pushed from 28f8c1d to ce64fb3
@fxmarty could I get a review on this PR? |
Thanks. I figured I'd give my 2 cents even though I'm not a core reviewer. It'd probably be better to put the checking logic inside a helper method in import_utils, next to is_torchdynamo_available(). There are other locations with logic to detect tracing, and the helper method would help us incorporate dynamo tracing into those locations as well.
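Something along these lines, as a sketch; the helper name `is_torchdynamo_compiling` and the call site shown are assumptions, not the PR's exact diff:

```python
# Hypothetical helper in src/transformers/utils/import_utils.py, placed
# next to is_torchdynamo_available(); the name is a suggestion only.
def is_torchdynamo_compiling() -> bool:
    if not is_torchdynamo_available():
        return False
    import torch._dynamo as dynamo

    return dynamo.is_compiling()


# Illustrative call site in the existing warning logic:
def warn_if_padding_and_no_attention_mask(self, input_ids, attention_mask):
    if is_torchdynamo_compiling():
        return  # dynamo is tracing; skip the missing-attention-mask warning
    ...
```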
Force-pushed from ce64fb3 to 958f59f
@ArthurZucker @fxmarty do you know who else would be free to review this? |
LGTM, thank you! If I understand correctly, this may allow us to avoid some graph breaks and avoid capturing irrelevant parts of the code, which is nice.
I am not sure whether dynamo tracing is tested anywhere, though.
@fxmarty thanks for the review! I noticed there's a "1 workflow awaiting approval" -- do you mind approving this too? |
For now we have `_create_and_check_torch_fx_tracing`, which checks `symbolic_trace`, but it's not always tested. It would be nice to add a small test in `test_warn_if_padding_and_no_attention_mask` 😉
Force-pushed from 958f59f to fcd1da5
@ArthurZucker Updated with a test that checks that torch.compile with dynamic shapes runs successfully and does not result in a graph break.
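A rough sketch of what such a test could look like, assuming a tiny test checkpoint and using `fullgraph=True` so any graph break raises; the exact test in the PR may differ:

```python
import torch
from transformers import AutoModel


def test_no_graph_break_when_tracing_with_dynamo():
    # hf-internal-testing/tiny-random-bert is an assumed tiny checkpoint.
    model = AutoModel.from_pretrained("hf-internal-testing/tiny-random-bert")

    # Padding tokens present but no attention mask passed: the code path
    # that used to emit the warning while dynamo was tracing.
    input_ids = torch.tensor([[1, 2, 3, model.config.pad_token_id]])

    # fullgraph=True makes dynamo raise on any graph break, so a plain
    # successful call is the assertion here.
    compiled = torch.compile(model, dynamic=True, fullgraph=True)
    compiled(input_ids)
```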
Force-pushed from fcd1da5 to c133793
Thanks for adding this and a test!
Let's make sure @ArthurZucker is happy with the test before we merge in 🤗
Left a small nit but good for me! Thanks for contributing and bearing with me! 🤗
Force-pushed from c133793 to 607a93d
Thanks for the review! I updated the test with a better name. Dumb question... how do I merge this 😅
I'll merge for you 😉 |
* Ignore warning if tracing with dynamo
* fix import error
* separate to function
* add test
What does this PR do?
Adds an additional check to skip the missing-attention-mask warning when tracing. Currently the code checks whether we're tracing with torch.jit.trace or torch.fx.symbolic_trace; this adds a check for tracing with TorchDynamo.
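An illustrative summary of the resulting logic; the names here are assumptions, and `is_torchdynamo_compiling()` refers to the hypothetical helper sketched earlier, not necessarily the PR's exact code:

```python
import torch


def _is_tracing(input_ids) -> bool:
    return (
        torch.jit.is_tracing()                    # torch.jit.trace
        or isinstance(input_ids, torch.fx.Proxy)  # torch.fx.symbolic_trace
        or is_torchdynamo_compiling()             # TorchDynamo (this PR)
    )
```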
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Tagging @hackyon, who added this check previously.