Deprecate redundant utilities for extracting constants #1046
Conversation
Force-pushed 24716ac to 8dcbc2b
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@            Coverage Diff             @@
##             main    #1046      +/-   ##
==========================================
- Coverage   82.14%   82.11%    -0.03%
==========================================
  Files         186      186
  Lines       48207    48201        -6
  Branches     8677     8679        +2
==========================================
- Hits        39598    39579       -19
- Misses       6441     6447        +6
- Partials     2168     2175        +7
Force-pushed 8dcbc2b to 6c20b96
@Armavica I removed the negative constant rewrite in the last commit
Force-pushed 7961d57 to 8134f62
Force-pushed 7c38bed to 15c06ff
Force-pushed 15c06ff to c7712f6
Left some ticky-tacky comments, feel free to ignore.
One thing I noticed was that `get_underlying_scalar_constant_value` is available in `pt`? We should maybe curate that namespace a bit more, because this isn't a function an average user is going to need, and use it from `ptb` directly.
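For concreteness, the two access paths being discussed (a sketch; `ptb` is the conventional alias for `pytensor.tensor.basic`, not something this PR introduces):

```python
import pytensor.tensor as pt
import pytensor.tensor.basic as ptb

# Currently re-exported in the public namespace:
pt.get_underlying_scalar_constant_value

# Suggestion: internal code imports it from the defining module instead,
# so the public namespace can eventually be curated.
ptb.get_underlying_scalar_constant_value
```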
if v.owner is not None and isinstance(v.owner.op, sparse.CSM):
    data = v.owner.inputs[0]
    return tensor.get_underlying_scalar_constant_value(data)
warnings.warn(
Use the logger instead?
wdym? I've never seen deprecation warnings issued via logging.
@@ -2157,6 +2157,9 @@ def _is_zero(x):
    'maybe' means that x is an expression that is complicated enough
    that we can't tell that it simplifies to 0.
    """
    from pytensor.tensor import get_underlying_scalar_constant_value
Why does this need a local import here but not in the above function?
Because the other one is using it through `tensor`. That was not done on purpose, but I prefer explicit imports, and there is in fact a circular dependency here.
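As a minimal, runnable illustration of the deferred-import pattern being described (using `json` as a stand-in; the actual cycle here is between PyTensor modules):

```python
# A function-local import is only resolved when the function is called,
# by which time every module involved is fully initialised. That is what
# makes it a common workaround for circular dependencies at import time.

def dumps_config(cfg):
    import json  # deferred: nothing is imported at module load time
    return json.dumps(cfg, indent=2)

print(dumps_config({"raise_not_constant": False}))
```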
Parameters
----------
v: Variable
elemwise : bool
`check_through_elemwise_ops` or something? Just `elemwise` makes it sound like it's expecting an elemwise input.
These all already existed. If I change the kwarg names it will be a breaking change, which I don't think this PR should be.
I guess I could keep the wrapper with the old kwargs and give the inner function better ones, but I'm not sure whether that helps or just makes things more confusing.
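For what it's worth, a hypothetical sketch of that wrapper idea (the names `check_through_elemwise_ops` and `shallow_search` are taken from this thread, not from the PR, and the real function has more parameters):

```python
def _scalar_constant_value(v, *, check_through_elemwise_ops=True, shallow_search=False):
    """Inner helper with the clearer keyword names (hypothetical)."""
    ...


def get_underlying_scalar_constant_value(v, elemwise=True, only_process_constants=False):
    """Public wrapper keeping the historical keyword names for backwards compatibility."""
    return _scalar_constant_value(
        v,
        check_through_elemwise_ops=elemwise,
        shallow_search=only_process_constants,
    )
```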
    If False, we won't try to go into elemwise. So this call is faster.
    But we still investigate in Second Elemwise (as this is a substitute
    for Alloc)
only_process_constants : bool
`shallow_search`? Why is "constants" plural?
Same here
    underlying constant scalar value. If False, return `v` as is.

Raises |
The numpy docs suggest only having a Raises section if it's not obvious; I don't think it's needed here.
This is needed. It's actually how you use the function most of the time: you need to know what to try/except.
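For example, most call sites look something like this (a sketch; it assumes `NotScalarConstantError` is importable from `pytensor.tensor.exceptions`, which is where it lives in current PyTensor):

```python
from pytensor.tensor.basic import get_underlying_scalar_constant_value
from pytensor.tensor.exceptions import NotScalarConstantError


def try_fold_to_scalar(v):
    # The Raises section is what tells callers which exception to catch
    # when `v` does not wrap a constant scalar value.
    try:
        return get_underlying_scalar_constant_value(v)
    except NotScalarConstantError:
        return None
```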
    return x
warnings.warn(
    "extract_constant is deprecated. Use `get_underlying_scalar_constant_value(..., raise_not_constant=False)`",
    FutureWarning,
`DeprecationWarning`?
`DeprecationWarning` is less visible to users, so I always use `FutureWarning`.
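For context, with Python's default warning filters a `DeprecationWarning` is only displayed when it is attributed to code running in `__main__`, while a `FutureWarning` is always displayed. A quick illustration (the function names here are made up):

```python
import warnings


def deprecated_quietly():
    # Ignored by default unless the caller is __main__ (a script or REPL),
    # so users calling this through another library never see it.
    warnings.warn("use the new API instead", DeprecationWarning, stacklevel=2)


def deprecated_loudly():
    # Shown under the default filters regardless of where the call comes from.
    warnings.warn("use the new API instead", FutureWarning, stacklevel=2)
```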
I see I used a `DeprecationWarning` above; I'll make it a `FutureWarning`.
    if (flat_data == flat_data[0]).all():
        return flat_data[0]

    warnings.warn("get_unique_constant_value is deprecated.", FutureWarning)
`DeprecationWarning`?
Fixes bug in `local_add_neg_to_sub` reported in pymc-devs#584
Force-pushed c7712f6 to 65d4b7d
1. Use the actual Solve Op to infer the output dtype, as CholSolve outputs a different dtype than basic Solve in Scipy==1.15
2. Tweaked test related to pymc-devs#1152
3. Tweak tolerance
Force-pushed 65d4b7d to 628c321
Description

In PyTensor we have the following similar utilities:

pytensor.get_underlying_scalar_constant
pytensor.tensor.basic.get_underlying_scalar_constant_value
pytensor.tensor.basic.get_scalar_constant_value
pytensor.tensor.basic.extract_constant
pytensor.tensor.rewriting.math.get_constant

This PR removes and deprecates all except:

pytensor.tensor.basic.get_underlying_scalar_constant_value
pytensor.tensor.basic.get_scalar_constant_value

The reason for this distinction is that the core utility, `get_underlying_scalar_constant_value`, actually works for non-scalar inputs, if it can find a single scalar value that underlies a potentially n-dimensional tensor (say, as in `pt.zeros(5, 3, 2)`). This is powerful, but can lead to subtle bugs when the caller forgets about it. This was the source of the bug behind #584 and was also likely present in other graphs such as `gt(x, [0, 0, 0, 0])` and the like, where the repeated condition broadcasts the output of the operation.

The utility `get_scalar_constant_value` raises if the input is not a scalar (ndim=0) type.

I don't love `underlying` distinguishing the two. Perhaps `unique` would be better.

Both utilities now accept a `raise_not_constant` flag, which when False (not the default) returns the variable as is. I think I would prefer for it to return `None`, but this requires fewer code changes.
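As a rough illustration of the distinction and of the new flag (a sketch, not code from the PR; it assumes the post-PR signatures described above and that the scalar check raises `NotScalarConstantError`):

```python
import pytensor.tensor as pt
from pytensor.tensor.basic import (
    get_scalar_constant_value,
    get_underlying_scalar_constant_value,
)
from pytensor.tensor.exceptions import NotScalarConstantError

x = pt.zeros((5, 3, 2))

# Works: a single scalar (0.0) underlies the whole 3-dimensional tensor.
print(get_underlying_scalar_constant_value(x))

# Raises: the input is not a scalar (ndim=0) variable.
try:
    get_scalar_constant_value(x)
except NotScalarConstantError:
    print("not a scalar constant")

# With raise_not_constant=False, a non-constant input is returned as is
# instead of raising.
y = pt.scalar("y")
print(get_underlying_scalar_constant_value(y, raise_not_constant=False))  # prints y
```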
Related Issue

`local_add_neg_to_sub` rewrite gives wrong results with negative constants #584