Gradient descent based fitting for non-gaussian distributions #10

Open
pathfinder49 opened this issue Aug 6, 2019 · 6 comments
Labels: enhancement (New feature or request)

Comments

@pathfinder49
Member

It would be useful to have a parameter inference function that does not assume Gaussian errors.

Common examples are the binomial and Poisson distributions.
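
A minimal sketch of what that could look like for binomially distributed data, minimising the negative log-likelihood with scipy; the model, the `fit_binomial` helper and the parameter names here are illustrative assumptions, not a proposed interface:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom


def fit_binomial(x, n_successes, n_shots, model, p0):
    """Maximum-likelihood fit of `model` to binomially distributed counts.

    `model(x, *params)` must return the success probability at each x.
    """
    def neg_log_likelihood(params):
        p = np.clip(model(x, *params), 1e-9, 1 - 1e-9)  # keep log finite
        return -np.sum(binom.logpmf(n_successes, n_shots, p))

    return minimize(neg_log_likelihood, p0, method="Nelder-Mead").x


# Toy example: recover contrast and frequency from simulated flop data.
def model(t, contrast, freq):
    return 0.5 * (1 - contrast * np.cos(2 * np.pi * freq * t))


t = np.linspace(0.0, 1.0, 20)
n_shots = 100
counts = np.random.binomial(n_shots, model(t, 0.9, 1.3))
print(fit_binomial(t, counts, n_shots, model, p0=[0.8, 1.0]))
```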

pathfinder49 added the enhancement (New feature or request) label Aug 7, 2019
@pathfinder49
Member Author

I'd like to start implementing this soon. However, I think it's worthwhile discussing how we want to do this.

At first thought the following issues come to mind:

  1. Should this be integrated with FitBase? If yes, it might be good to write a second class with a compatible interface. We could wrap these to make the user-facing end a simple argument.
    a. The FitBase interface will probably need to be adapted to allow this.
  2. Do we want something that is specific to binomial distributions or more general?
  3. We will want some kind of parameter confidence interval. However, the covariance matrix is presumably no longer valid? (One possible alternative is sketched below.)
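
On point 3, one possible replacement for covariance-based errors (not necessarily what we would want FitBase to expose) is a likelihood-ratio interval: keep the region where the negative log-likelihood rises by at most chi2.ppf(q, 1)/2 above its minimum. A rough sketch for the simplest case, a single binomial probability, using a brute-force grid scan:

```python
import numpy as np
from scipy.stats import binom, chi2


def likelihood_ratio_interval(k, n, confidence=0.68):
    """Confidence interval for a binomial probability from a likelihood scan.

    Keeps the region where the negative log-likelihood is within
    chi2.ppf(confidence, 1) / 2 of its minimum. A real implementation
    would profile each fit parameter instead of scanning a bare p.
    """
    p_grid = np.linspace(1e-6, 1 - 1e-6, 10001)
    nll = -binom.logpmf(k, n, p_grid)
    threshold = nll.min() + chi2.ppf(confidence, df=1) / 2
    inside = p_grid[nll <= threshold]
    return inside[0], inside[-1]


print(likelihood_ratio_interval(k=7, n=20))
```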

@pathfinder49
Member Author

Implementing a Bayesian calculation of P(Model|Data) would allow for a general maximum-likelihood calculation with a known confidence interval. I'd suggest defaulting to a flat prior, which may be overridden by the user.
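
A sketch of how that could look for a single model parameter, evaluating the posterior on a grid with a flat prior and reading off a central credible interval; the model, grid bounds and function names are assumptions for illustration:

```python
import numpy as np
from scipy.stats import binom


def posterior_credible_interval(theta_grid, k, n, model, x, mass=0.68):
    """Return (MAP estimate, lower, upper) for one parameter with a flat prior."""
    # Flat prior: the posterior is proportional to the likelihood.
    log_post = np.array([
        np.sum(binom.logpmf(k, n, np.clip(model(x, theta), 1e-9, 1 - 1e-9)))
        for theta in theta_grid
    ])
    dtheta = theta_grid[1] - theta_grid[0]
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * dtheta           # normalise on the grid
    cdf = np.cumsum(post) * dtheta
    lower = np.interp((1 - mass) / 2, cdf, theta_grid)
    upper = np.interp((1 + mass) / 2, cdf, theta_grid)
    return theta_grid[np.argmax(post)], lower, upper


# Toy example: infer the frequency of a sinusoidal excitation probability.
def model(t, freq):
    return 0.5 * (1 - np.cos(2 * np.pi * freq * t))


t = np.linspace(0.0, 1.0, 20)
n = 100
k = np.random.binomial(n, model(t, 1.3))
print(posterior_credible_interval(np.linspace(0.5, 2.0, 2001), k, n, model, t))
```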

@pathfinder49
Member Author

pathfinder49 commented Aug 27, 2019

After discussing with @hartytp, I think there are really 3 issues:

  1. Local fit likelihood optimisation of binomial (or other) distributed data.
  2. Global fit likelihood optimisation of binomial (or other) distributed data.
  3. Calculating the confidence interval for a single binomial data point (for plotting purposes).

@pathfinder49
Member Author

3. Calculating the confidence interval for a single binomial data point (for plotting purposes).

This post seems to suggest the right thing.
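
One standard option for this is the Clopper-Pearson ("exact") interval, which only needs beta-distribution quantiles; whether that matches what the linked post suggests is an assumption on my part. A sketch:

```python
from scipy.stats import beta


def clopper_pearson(k, n, confidence=0.68):
    """Clopper-Pearson interval for k successes out of n binomial trials."""
    alpha = 1 - confidence
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper


print(clopper_pearson(k=7, n=20))
```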

pathfinder49 changed the title from "Support maximum likelihood for non-gaussian distributions" to "Gradient descent based fitting for non-gaussian distributions" Aug 27, 2019
@dnadlinger
Member

dnadlinger commented Aug 27, 2019

Local minimisation of binomial (or other) distributed data.
Global minimisation of binomial (or other) distributed data.

Regarding these, as mentioned before, the only case where the distribution really ends up mattering is for acquisition-time-limited problems like state tomography. For that, established codes to do MLE/Bayesian estimation already exist. (oitg.circuits has some MLE code; I've recently done Bayesian estimates for the remote ion-ion data using Tomographer.)

For other online calibration problems, it's typically easier to just acquire a bit more data. If one is data-rate-bound, then adaptive/online Bayesian methods (choosing e.g. Ramsey delay times based on prior data), where 1/N scaling of the error can often be achieved, are the way to go.

@pathfinder49
Member Author

Local minimisation of binomial (or other) distributed data.
Global minimisation of binomial (or other) distributed data.

Edited to be intelligible
