Add general integer exponents #128
base: main
Conversation
As a test, I would just do a regression test for the function.
```diff
@@ -17,6 +17,27 @@ def gamma(x: torch.Tensor) -> torch.Tensor:
     return torch.exp(gammaln(x))


+# Auxiliary function for stable Fourier transform implementation
+def gammainc_upper_over_powerlaw(exponent, zz):
```
Give the equation in the docstring here.
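Judging from the six closed forms in this diff, the function appears to evaluate Γ((3 − p)/2, z) / z^((3−p)/2) for exponent p. A possible docstring (a sketch only; the exact wording is up to the author) could be:

```python
def gammainc_upper_over_powerlaw(exponent, zz):
    r"""Compute :math:`\Gamma\left(\frac{3 - p}{2}, z\right) / z^{(3 - p) / 2}`,
    where ``p = exponent`` and ``z = zz``.

    This ratio of the upper incomplete gamma function and a power law appears
    in the Fourier-space representation of the :math:`1/r^p` potential.
    """
```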
```python
    return (
        (2 - 4 * zz) * torch.exp(-zz)
        + 4 * torch.sqrt(torch.pi) * zz**1.5 * torch.erfc(torch.sqrt(zz))
    ) / 3
```
We need an error message if the exponent is not supported. I think there is already a check in the init of `InversePowerLawPotential`. I would maybe call this function in the init of `InversePowerLawPotential` for the given exponent (and maybe `zz=0`) so as not to have a duplicated test. But up to you.
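A minimal sketch of that suggestion (class and attribute names are assumptions based on this discussion, not the actual torch-pme code):

```python
SUPPORTED_EXPONENTS = (1, 2, 3, 4, 5, 6)


class InversePowerLawPotential:
    """Hypothetical sketch: validate the exponent once, at construction."""

    def __init__(self, exponent: int):
        # Single point of validation, so gammainc_upper_over_powerlaw
        # does not need to re-check on every call.
        if exponent not in SUPPORTED_EXPONENTS:
            raise ValueError(f"Unsupported exponent: {exponent}")
        self.exponent = exponent
```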
Looks good, but there are some things that need to be double-checked. I will continue updating the tests once we fix them.
```python
    if exponent == 2:
        return torch.sqrt(torch.pi / zz) * torch.erfc(torch.sqrt(zz))
    if exponent == 3:
        return -torch.expi(-zz)
```
As far as I know, `torch.expi` doesn't exist in the `torch` module ...
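Right: `scipy.special.expi` exists, but core torch has no equivalent. As a pure-Python sketch (not torch, and only accurate for small-to-moderate `z`), E1(z) = −Ei(−z) can be evaluated with the convergent series E1(z) = −γ − ln z + Σ_{k≥1} (−1)^{k+1} z^k / (k · k!):

```python
import math


def exp1(z, terms=60):
    # Upper exponential integral E1(z) = -Ei(-z) for z > 0, via the
    # convergent series around z = 0. A sketch only: scalar, not
    # vectorized, and subject to cancellation for large z.
    euler_gamma = 0.57721566490153286
    total = -euler_gamma - math.log(z)
    term = 1.0  # holds z**k / k! as k advances
    for k in range(1, terms + 1):
        term *= z / k
        total += (-1) ** (k + 1) * term / k
    return total


# Known reference value: E1(1) ≈ 0.2193839
assert abs(exp1(1.0) - 0.2193839) < 1e-6
```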
```python
def gammainc_upper_over_powerlaw(exponent, zz):
    if exponent not in [1, 2, 3, 4, 5, 6]:
        raise ValueError(f"Unsupported exponent: {exponent}")

    if exponent == 1:
        return torch.exp(-zz) / zz
    if exponent == 2:
        return torch.sqrt(torch.pi / zz) * torch.erfc(torch.sqrt(zz))
    if exponent == 3:
        return -torch.expi(-zz)
    if exponent == 4:
        return 2 * (
            torch.exp(-zz) - torch.sqrt(torch.pi * zz) * torch.erfc(torch.sqrt(zz))
        )
    if exponent == 5:
        return torch.exp(-zz) + zz * torch.expi(-zz)
    if exponent == 6:
        return (
            (2 - 4 * zz) * torch.exp(-zz)
            + 4 * torch.sqrt(torch.pi) * zz**1.5 * torch.erfc(torch.sqrt(zz))
        ) / 3
```
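One way to double-check these branches is a standard-library-only sketch: assuming the function equals Γ((3 − p)/2, z) / z^((3−p)/2) with s = (3 − p)/2, each closed form can be compared against a brute-force numerical integral (the p = 3 and p = 5 branches need an exponential-integral reference and are skipped here):

```python
import math


def upper_gamma_over_powerlaw_ref(s, z, n=200_000, t_max=60.0):
    # Brute-force reference: Gamma(s, z) = integral_z^inf t**(s-1) * exp(-t) dt,
    # approximated with the trapezoidal rule, then divided by z**s.
    h = (t_max - z) / n
    total = 0.5 * (z ** (s - 1) * math.exp(-z) + t_max ** (s - 1) * math.exp(-t_max))
    for i in range(1, n):
        t = z + i * h
        total += t ** (s - 1) * math.exp(-t)
    return total * h / z**s


z = 0.7
closed_forms = {  # exponent p -> closed form from the diff
    1: math.exp(-z) / z,
    2: math.sqrt(math.pi / z) * math.erfc(math.sqrt(z)),
    4: 2 * (math.exp(-z) - math.sqrt(math.pi * z) * math.erfc(math.sqrt(z))),
    6: ((2 - 4 * z) * math.exp(-z)
        + 4 * math.sqrt(math.pi) * z**1.5 * math.erfc(math.sqrt(z))) / 3,
}
for p, value in closed_forms.items():
    s = (3 - p) / 2
    assert abs(value - upper_gamma_over_powerlaw_ref(s, z)) < 1e-4
```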
Can we double-check that everything here is correct? With the new implementation, `test_values_ewald.py` fails for inverse potentials with an exponent of 1, as it does not coincide with the Madelung constant.
Add exponents >3 to the main branch.

The core changes in the potential have already been implemented. The most important change that still has to be implemented and tested in the rest of the code is that `exponent` is now of data type `int` rather than `float`. Also, we need some good unit tests. I was thinking of a combination of regression tests for some of the more complicated structures, plus generalized Madelung constants that I have computed in the past with a different code.
Contributor (creator of pull-request) checklist
Reviewer checklist
📚 Documentation preview 📚: https://torch-pme--128.org.readthedocs.build/en/128/