Handle axis=None symbolically instead of within CumOp
#1574
Conversation
@copilot address review comments

All review comments have been addressed in commit d3bfb15: axis=None is now handled symbolically instead of within CumOp.

@copilot some tests are failing. Other tests may be failing too; check the logs to be sure it's not only this one.

Fixed the failing tests by removing negative axis cases from CumOp tests. Updated both
force-pushed aad9f7d to 1cf93c3
force-pushed 1cf93c3 to a09b463
@jessegrabowski requested your review because copilot PRs are not really someone else's?
jessegrabowski
left a comment
small nitpicks
pytensor/tensor/extra_ops.py (outdated)

    if not isinstance(axis, int):
        raise TypeError("axis must be an integer.")
    if axis < 0:
        raise ValueError("axis must be non-negative.")
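Since the Op now rejects negative axes, the user-facing helper is expected to normalize a negative axis before constructing the Op. A minimal sketch of that normalization, using plain NumPy to show it is semantics-preserving (`normalize_axis` is a hypothetical helper name, not the PR's actual code):

```python
import numpy as np

def normalize_axis(axis: int, ndim: int) -> int:
    # Map a possibly-negative axis to the equivalent non-negative one,
    # following NumPy's convention (axis=-1 is the last dimension).
    if not -ndim <= axis < ndim:
        raise ValueError(f"axis(={axis}) out of bounds for {ndim} dimensions")
    return axis % ndim

x = np.arange(6).reshape(2, 3)
# Cumulating over axis=-1 is identical to cumulating over the normalized axis 1.
assert np.array_equal(
    np.cumsum(x, axis=-1),
    np.cumsum(x, axis=normalize_axis(-1, x.ndim)),
)
```

With this done in the helper, the Op itself only ever sees a concrete non-negative integer axis.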
The similar change for repeat told the user to go use the pt.repeat helper if your axis is negative or None; that would be nice here too.
CumOp is more annoying because the user could be using cumsum or cumprod. Anyway, that was a bit overkill in that PR; nobody is instantiating Ops themselves when using PyTensor.
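The core idea of the PR can be illustrated without any PyTensor internals: "handling axis=None symbolically" means the helper flattens the input first and cumulates along the single remaining axis, so the Op never needs a special axis=None branch. A NumPy sketch of that equivalence (`cumsum_flat` is an illustrative name, not the PR's code):

```python
import numpy as np

def cumsum_flat(x):
    # What handling axis=None at the helper level amounts to: flatten,
    # then cumulate along the only remaining axis. This matches NumPy's
    # documented axis=None behavior for cumsum.
    return np.cumsum(x.ravel(), axis=0)

a = np.arange(6).reshape(2, 3)
assert np.array_equal(cumsum_flat(a), np.cumsum(a, axis=None))
```

The same flattening trick works for cumprod, which is why doing it once in the helpers covers both.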
pytensor/tensor/extra_ops.py (outdated)

        out_type = vector(dtype=x.dtype)  # Flatten
    -   elif self.axis >= x.ndim or self.axis < -x.ndim:
    +   if self.axis >= x.ndim:
            raise ValueError(f"axis(={self.axis}) out of bounds")
Suggested change:

    -   raise ValueError(f"axis(={self.axis}) out of bounds")
    +   raise ValueError(f"axis(={self.axis}) out of bounds for tensor with {x.ndim} dimensions")
You didn't change this but I saw it, and the robot is doing the work anyway.
Updated the error message as suggested (ce1dfac).
    def cumsum(x, axis=None):
        """Return the cumulative sum of the elements along a given `axis`.

        This wraps ``numpy.cumsum``.
Not really?
What not really :)?
It doesn't wrap numpy.cumsum, it implements similar functionality though.
It does, but it's a detail. Shouldn't be part of the docs
Anything blocking this one?

@copilot can you rebase, resolve the conflicts and address the review comments

I've completed the rebase onto origin/main, resolved all conflicts, and addressed the review comment about improving the error message formatting (ce1dfac). The changes are ready locally, but I'm unable to force push after the rebase due to tooling limitations. The rebased branch needs a force push to update the remote, which the available tools don't support. You may need to push manually or provide access for force pushing.
force-pushed a09b463 to 9b5de87
force-pushed 9b5de87 to b0dbc31
Codecov Report

❌ Patch coverage is 70.27%, below the target coverage (100.00%). You can increase the patch coverage or adjust the target coverage.

    @@            Coverage Diff             @@
    ##             main    #1574      +/-   ##
    ==========================================
    - Coverage   81.64%   81.61%   -0.03%
    ==========================================
      Files         242      242
      Lines       53573    53523      -50
      Branches     9453     9433      -20
    ==========================================
    - Hits        43737    43681      -56
    - Misses       7358     7364       +6
      Partials     2478     2478
closes #1549