Convert P to latex
Matt Bendel authored and Matt Bendel committed Dec 1, 2023
1 parent 043157f commit a787fd5
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions index.html
@@ -238,9 +238,9 @@ <h5 style="padding-left: 5%;">Ours</h5>
  <h3 class="m-0 mt-3 bold">Adapting the Weight on the STD Reward</h3>
  <p class="m-0 mt-3 mb-3" style="text-align: left; font-size: 120%;">
  For the independent-Gaussian case, we derive the correct weight \(\beta\) on the STD reward in closed form. For practical datasets, we propose a method to learn \(\beta\).
- In particular, we <span class="bold">adapt</span> \(\beta\) during training so that the SNR gain from averaging P posterior samples matches the expected theoretical behavior, the latter of which we derive in our paper.
+ In particular, we <span class="bold">adapt</span> \(\beta\) during training so that the SNR gain from averaging \(P\) posterior samples matches the expected theoretical behavior, the latter of which we derive in our paper.
  <br><br>
- The orange curves below show the <span class="bold">empirical SNR versus P</span> for various values of \(\beta\) (for the case of multicoil MR image recovery) while the blue dashed curves show the <span class="bold">expected theoretical behavior</span>.
+ The orange curves below show the <span class="bold">empirical SNR versus \(P\)</span> for various values of \(\beta\) (for the case of multicoil MR image recovery) while the blue dashed curves show the <span class="bold">expected theoretical behavior</span>.
  The figures show that, for this application, the optimal \(\beta=0.53\).
  Please see our <a href="resources/poster.pdf" target="_blank">poster</a> or our <a href="https://openreview.net/forum?id=z4vKRmq7UO" target="_blank">paper</a>!
  </p>
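The edited paragraph tunes \(\beta\) by comparing the empirical SNR of a \(P\)-sample posterior average against a theoretical curve derived in the paper. As a rough illustration only (the exact expression comes from the paper's independent-Gaussian analysis; the closed form below is an assumption, not quoted from the diff), the SNR gain over a single sample might be sketched as:

```python
import math

def expected_snr_gain_db(P: int) -> float:
    """Hypothetical theoretical SNR gain (dB) of averaging P posterior
    samples over using a single sample. Assumes the gain takes the
    form 10*log10(2P/(P+1)), which saturates at ~3 dB as P grows;
    the actual curve used for tuning beta is the one derived in the paper."""
    return 10.0 * math.log10(2.0 * P / (P + 1.0))

for P in (1, 2, 4, 8, 32):
    print(P, round(expected_snr_gain_db(P), 3))
# P = 1 gives 0.0 dB by construction; the gain grows toward 10*log10(2).
```

Under this assumed curve, β would be adjusted during training until the measured SNR-versus-\(P\) points (the orange curves on the page) track the dashed theoretical ones.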
