Better samplers for Bernoulli(p) and Geometric(1/2) random variables #1883
Comments
Not a maintainer, so don't take my position as canonical. There's a lot of optimization that can be done with the `rand` and `rand!` functions. Personally, I would also recommend defining explicit `rand!` methods alongside scalar `rand` methods. Very few distributions currently do this, but it can yield significant speedups when implemented. The core reason behind the performance increases is that Random.jl has SIMD-friendly implementations for filling whole arrays with random values. Don't worry too much if you replace an existing method with something faster that yields different results (but is still within expectations for the distribution); Distributions.jl doesn't guarantee that the exact sampled values stay the same across versions. I'm curious about the other method for sampling now; I'm tempted to try an implementation of my own to see if I can get it faster than what we currently have.
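For illustration, here is a minimal sketch of what an explicit `rand!` method for `Bernoulli` could look like, leaning on Random.jl's bulk array filling. The helper name `bernoulli_rand!` is made up for this example and is not the package's actual vectorised sampler:

```julia
using Random, Distributions

# Hypothetical sketch: fill a Bool vector with Bernoulli(p) draws by
# generating all the uniforms in one bulk call, then comparing.
function bernoulli_rand!(rng::AbstractRNG, d::Bernoulli, x::AbstractVector{Bool})
    u = Vector{Float64}(undef, length(x))
    Random.rand!(rng, u)              # bulk uniform generation (SIMD-friendly)
    p = succprob(d)
    @inbounds for i in eachindex(x, u)
        x[i] = u[i] <= p              # same comparison as the scalar method
    end
    return x
end

# usage
x = Vector{Bool}(undef, 1024)
bernoulli_rand!(Random.default_rng(), Bernoulli(0.3), x)
```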
Really like the suggestion to implement `rand!`. As for my own implementation, the details are as follows; benchmarks for my version are from a Mac M1 laptop:
If you'd like to see a documented version, I can point you to this well-written implementation which I used as a reference: https://docs.rs/opendp/latest/src/opendp/traits/samplers/bernoulli/mod.rs.html#25-137. I also found this post which promises to be pretty efficient, but I don't understand the method right now: https://discourse.julialang.org/t/faster-bernoulli-sampling/35209
The implementation for generating a `Bernoulli(p)` random variable right now is `rand() <= p`. This is totally fine, but there are better ways to handle rational probabilities, and there is a method detailed here which uses less entropy (2 bits on expectation).

For rationals the obvious solution is `rand(1:denominator) <= numerator`, which is faster on my machine, possibly due to the expense of converting the rational probability to a float for comparison. I also have code for the other sampler, but on my machine it's about 2x as slow as the current implementation; still pretty fast, 6.4 ns vs 2.7 ns for the old method.
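To make the two ideas concrete, here is a minimal sketch. The helper names `bernoulli_rational` and `bernoulli_lowentropy` are made up for this example, and the second is one reading of the low-entropy method rather than the exact algorithm from the linked reference; a serious implementation would buffer random bits instead of calling `rand(Bool)` once per bit:

```julia
using Random

# Sample Bernoulli(num // den) with an integer comparison instead of
# converting the probability to a float first.
bernoulli_rational(rng::AbstractRNG, p::Rational{<:Integer}) =
    rand(rng, 1:denominator(p)) <= numerator(p)

# Low-entropy Bernoulli(p): compare the bits of a lazily generated
# uniform U against the binary expansion of p and stop at the first
# bit where they differ; on expectation only 2 bits are consumed.
function bernoulli_lowentropy(rng::AbstractRNG, p::Float64)
    while true
        p *= 2
        pbit = p >= 1.0              # next bit of p's binary expansion
        p -= pbit
        ubit = rand(rng, Bool)       # next bit of U
        ubit != pbit && return pbit  # first disagreement decides U < p
    end
end

bernoulli_rational(Random.default_rng(), 3//10)   # true with probability 3/10
bernoulli_lowentropy(Random.default_rng(), 0.3)   # same distribution
```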
In writing that code I needed to sample a `Geometric(1/2)` distribution, and found that the current implementation is needlessly expensive for this default case. It is faster to generate random bits (`sizeof(Int)` bytes' worth at a time) until you see a 1 (see the sketch below); this could go down to generating random `Int8`s as well if minimising entropy usage were a concern (but the default sampler generates 64 bits at a time and then truncates anyway).

I am interested in implementing these, but want to know whether it is of interest. I also wouldn't be sure how to implement a sampler for a "default" geometric rv that doesn't meddle with the current implementation either.
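For reference, a minimal sketch of the bit-scanning `Geometric(1/2)` idea described above. The name `geometric_half` is made up for this example; it counts the zero bits before the first one bit, matching Distributions.jl's number-of-failures convention:

```julia
using Random

# Geometric(1/2) variate: the number of 0 bits before the first 1 bit
# in a stream of fair random bits.
function geometric_half(rng::AbstractRNG)
    n = 0
    while true
        bits = rand(rng, UInt64)                       # 64 random bits at a time
        bits != 0 && return n + trailing_zeros(bits)   # position of the first 1
        n += 64                                        # all zeros: scan the next word
    end
end

geometric_half(Random.default_rng())
```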
Posting here to get thoughts. I think the method for rational Bernoullis should absolutely be implemented as a strict improvement, but otherwise I want to know what people think is important.