Release Fides 0.7.5 (#51)
* add notebook with minimal example

* fixup log formatting

* Update Minimal.ipynb

* refactor tr update, fix ftol convergence

* version bump
FFroehlich authored Feb 10, 2022
1 parent 376a802 commit 23ccc3e
Showing 5 changed files with 206 additions and 36 deletions.
1 change: 1 addition & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -9,3 +9,4 @@ docs/_build/*
docs/generated/*
coverage.xml
.coverage
examples/.ipynb_checkpoints/*
167 changes: 167 additions & 0 deletions examples/Minimal.ipynb
@@ -0,0 +1,167 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Minimal Example\n",
"\n",
"The aim of this notebook is to provide a minimal example of how fides can be used to optimize user-defined functions. In this example, we will minimize the [Rosenbrock](https://en.wikipedia.org/wiki/Rosenbrock_function) function. First, we import the Rosenbrock function and its derivatives from `scipy.optimize`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from scipy.optimize import rosen, rosen_der, rosen_hess"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Next, we define an objective function that returns a triple of function value, gradient, and Hessian."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"def obj(x):\n",
" return rosen(x), rosen_der(x), rosen_hess(x)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To optimize this function, we first create a `fides.Optimizer` instance based on the objective function defined above. The optimizer also requires upper and lower boundaries for the optimization variables $x$. In this example, each optimization variable is only bounded in one direction, with $-1.5 \\leq x_0 \\lt \\infty$ and $-\\infty \\lt x_1 \\leq 1.5$. These bounds must be passed as numpy arrays."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import fides\n",
"import numpy as np\n",
"\n",
"opt = fides.Optimizer(obj, \n",
" ub=np.asarray([np.inf, 1.5]), \n",
" lb=np.asarray([-1.5, -np.inf]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To perform the optimization, we call the `minimize` method and pass the origin `(0, 0)` as the starting point."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-01-11 16:13:17 fides(INFO) iter| fval | fdiff | tr ratio |tr radius| ||g|| | ||step||| step|acc\n",
"2022-01-11 16:13:17 fides(INFO) 0| +1.00E+00 | NaN | NaN | 1.0E+00 | 2.0E+00 | NaN | NaN |1\n",
"2022-01-11 16:13:17 fides(INFO) 1| +1.00E+00 | +9.9E+01 | -9.9E+01 | 1.0E+00 | 2.0E+00 | 1.0E+00 | 2d |0\n",
"2022-01-11 16:13:17 fides(INFO) 2| +9.53E-01 | -4.7E-02 | +1.1E-01 | 2.5E-01 | 2.0E+00 | 2.5E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 3| +5.24E-01 | -4.3E-01 | +1.1E+00 | 6.2E-02 | 1.3E+01 | 7.7E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 4| +3.92E-01 | -1.3E-01 | +9.4E-01 | 1.2E-01 | 1.4E+00 | 1.3E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 5| +2.63E-01 | -1.3E-01 | +1.1E+00 | 2.5E-01 | 3.1E+00 | 1.7E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 6| +1.74E-01 | -8.9E-02 | +1.4E+00 | 2.5E-01 | 4.2E+00 | 1.2E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 7| +1.10E-01 | -6.4E-02 | +2.2E-01 | 2.5E-01 | 1.5E+00 | 2.0E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 8| +7.20E-02 | -3.8E-02 | +1.1E+00 | 4.2E-02 | 5.5E+00 | 4.4E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 9| +4.99E-02 | -2.2E-02 | +9.5E-01 | 8.4E-02 | 3.2E-01 | 8.3E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) iter| fval | fdiff | tr ratio |tr radius| ||g|| | ||step||| step|acc\n",
"2022-01-11 16:13:17 fides(INFO) 10| +2.45E-02 | -2.5E-02 | +3.6E-01 | 1.7E-01 | 8.4E-01 | 1.6E-01 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 11| +1.36E-02 | -1.1E-02 | +1.3E+00 | 1.7E-01 | 2.8E+00 | 4.9E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 12| +1.36E-02 | -8.6E-03 | -8.2E-01 | 1.7E-01 | 1.9E-01 | 1.5E-01 | 2d |0\n",
"2022-01-11 16:13:17 fides(INFO) 13| +9.63E-03 | -3.9E-03 | +1.0E+00 | 4.0E-02 | 1.9E-01 | 3.8E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 14| +4.14E-03 | -5.5E-03 | +8.9E-01 | 8.1E-02 | 1.9E-01 | 7.3E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 15| +1.32E-03 | -2.8E-03 | +1.2E+00 | 1.6E-01 | 5.5E-01 | 5.9E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 16| +2.68E-04 | -1.0E-03 | +1.2E+00 | 1.6E-01 | 3.2E-01 | 4.3E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 17| +2.52E-05 | -2.4E-04 | +1.2E+00 | 1.6E-01 | 1.7E-01 | 2.5E-02 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 18| +4.15E-07 | -2.5E-05 | +1.1E+00 | 1.6E-01 | 5.3E-02 | 9.5E-03 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) 19| +1.85E-10 | -4.2E-07 | +1.0E+00 | 1.6E-01 | 7.9E-03 | 1.4E-03 | 2d |1\n",
"2022-01-11 16:13:17 fides(INFO) iter| fval | fdiff | tr ratio |tr radius| ||g|| | ||step||| step|acc\n",
"2022-01-11 16:13:17 fides(INFO) 20| +3.62E-17 | -1.8E-10 | +1.0E+00 | 1.6E-01 | 1.6E-04 | 2.9E-05 | 2d |1\n",
"2022-01-11 16:13:17 fides(WARNING) Stopping as function difference 1.85E-10 was smaller than specified tolerances (atol=1.00E-08, rtol=1.00E-08)\n"
]
}
],
"source": [
"opt_f, opt_x, opt_grad, opt_hess = opt.minimize(np.asarray([0, 0]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"During optimization, fides prints a series of diagnostic variables that may help the user debug the optimization. For example, here we can see that fides took 20 iterations to converge (`iter` column), that the trust-region radius took values between 1.0 and 0.04 (`tr radius` column), and that only two step proposals were rejected (`acc` column).\n",
"\n",
"To verify that fides found the correct optimum, we can compare the returned values against reference values (we know that the Rosenbrock function has its minimum at $(1.0, 1.0)$ with function value $0.0$)."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"assert np.allclose(opt_x, [1.0, 1.0])\n",
"assert np.isclose(opt_f, 0.0)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To numerically verify that we found a local minimum, we can check whether the gradient is small and whether the Hessian has strictly positive eigenvalues."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"assert np.allclose(opt_grad, [0.0, 0.0], atol=1e-7)\n",
"assert np.min(np.linalg.eig(opt_hess)[0]) > 0"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.7"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
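The optimality checks performed at the end of the notebook can also be reproduced without running the optimizer, since the Rosenbrock minimum is known analytically. A minimal sketch, assuming only NumPy and SciPy are available:

```python
import numpy as np
from scipy.optimize import rosen, rosen_der, rosen_hess

# The Rosenbrock function attains its minimum f = 0 at (1, 1), with zero
# gradient and a positive definite Hessian -- exactly the conditions the
# notebook asserts on the optimizer's return values.
x_opt = np.array([1.0, 1.0])

assert np.isclose(rosen(x_opt), 0.0)
assert np.allclose(rosen_der(x_opt), [0.0, 0.0])
# The Hessian is symmetric, so eigvalsh is applicable; all eigenvalues
# must be strictly positive at a local minimum.
assert np.min(np.linalg.eigvalsh(rosen_hess(x_opt))) > 0
```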
3 changes: 2 additions & 1 deletion fides/logging.py
@@ -31,7 +31,8 @@ def create_logger(level: int) -> logging.Logger:
logger = logging.getLogger(f'fides_{logger_count}')
ch = logging.StreamHandler()
formatter = logging.Formatter(
'%(asctime)s - fides - %(levelname)s - %(message)s'
fmt='%(asctime)s fides(%(levelname)s) %(message)s',
datefmt='%Y-%m-%d %H:%M:%S'
)
ch.setFormatter(formatter)
logger.addHandler(ch)
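The effect of the reworked formatter can be sketched with the standard library alone; `fides_demo` is a hypothetical logger name used only for illustration:

```python
import logging

# Mirror the formatter introduced in this commit: timestamp (with an explicit
# date format), then "fides(LEVEL)", then the message.
formatter = logging.Formatter(
    fmt='%(asctime)s fides(%(levelname)s) %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S',
)

logger = logging.getLogger('fides_demo')  # hypothetical name for this sketch
handler = logging.StreamHandler()
handler.setFormatter(formatter)
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Emits a line shaped like the optimizer output in the notebook above,
# e.g. "2022-01-11 16:13:17 fides(INFO) optimization started"
logger.info('optimization started')
```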
69 changes: 35 additions & 34 deletions fides/minimize.py
@@ -475,29 +475,32 @@ def update_tr_radius(self,
self.iterations_since_tr_update = 0
return False
else:
qpval = 0.5 * stepsx.dot(dv * np.abs(grad) * stepsx)
self.tr_ratio = (fval + qpval - self.fval) / step.qpval
aug = 0.5 * stepsx.dot(dv * np.abs(grad) * stepsx)
actual_decrease = self.fval - fval - aug
predicted_decrease = -step.qpval
if predicted_decrease <= 0.0:
self.tr_ratio = 0.0
else:
self.tr_ratio = actual_decrease / predicted_decrease

interior_solution = nsx < self.delta_iter * 0.9

# values as proposed in algorithm 4.1 in Nocedal & Wright
if self.tr_ratio >= self.get_option(Options.ETA) \
and not interior_solution and step.qpval <= 0:
and not interior_solution:
# increase radius
self.delta = self.get_option(Options.GAMMA2) * self.delta
self.iterations_since_tr_update = 0
elif self.tr_ratio <= self.get_option(Options.MU) or \
step.qpval > 0:
elif self.tr_ratio <= self.get_option(Options.MU):
# decrease radius
self.delta = np.nanmin([
self.delta * self.get_option(Options.GAMMA1),
nsx / 4
])
self.iterations_since_tr_update = 0
elif self.tr_ratio > self.get_option(Options.MU) and \
self.tr_ratio < self.get_option(Options.ETA):
elif self.get_option(Options.MU) < self.tr_ratio < \
self.get_option(Options.ETA):
self.n_intermediate_tr_radius += 1
return self.tr_ratio > 0.0 and step.qpval <= 0
return self.tr_ratio > 0.0

def check_convergence(self, step: Step, funout: Funout) -> None:
"""
@@ -521,7 +524,7 @@ def check_convergence(self, step: Step, funout: Funout) -> None:
stepsx = step.ss + step.ss0
nsx = norm(stepsx)

if self.delta <= self.delta_iter and \
if self.tr_ratio > self.get_option(Options.MU) and \
np.abs(fval - self.fval) < fatol + frtol*np.abs(self.fval):
self.exitflag = ExitFlag.FTOL
self.logger.warning(
@@ -657,23 +660,22 @@ def log_step(self, accepted: bool, step: Step, funout: Funout):
normg = norm(self.grad)

iterspaces = max(len(str(self.get_option(Options.MAXITER))), 5) - \
len(str(self.iteration))
len(str(self.iteration)) - 1
steptypespaces = 4 - len(step.type)

fval = funout.fval
if not np.isfinite(fval):
fval = self.fval
self.logger.info(
f'{" " * iterspaces}{self.iteration}'
f' | {fval if accepted else self.fval:+.3E}'
f' | {(fval - self.fval):+.2E}'
f' | {step.qpval:+.2E}'
f' | {self.tr_ratio:+.2E}'
f' | {self.delta_iter:.2E}'
f' | {normg:.2E}'
f' | {normdx:.2E}'
f' | {step.type}{" " * steptypespaces}'
f' | {int(accepted)}'
f'| {fval if accepted else self.fval:+.2E} '
f'| {(fval - self.fval):+.1E} '
f'| {self.tr_ratio:+.1E} '
f'| {self.delta_iter:.1E} '
f'| {normg:.1E} '
f'| {normdx:.1E} '
f'|{" " * steptypespaces}{step.type} '
f'|{int(accepted)}'
)

def track_history(self, accepted: bool, step: Step, funout: Funout):
@@ -733,31 +735,30 @@ def log_step_initial(self):
"""

iterspaces = max(len(str(self.get_option(Options.MAXITER))), 5) - \
len(str(self.iteration))
len(str(self.iteration)) - 1
self.logger.info(
f'{" " * iterspaces}{self.iteration}'
f' | {self.fval:+.3E}'
f' | NaN '
f' | NaN '
f' | NaN '
f' | {self.delta:.2E}'
f' | {norm(self.grad):.2E}'
f' | NaN '
f' | NaN '
f' | {int(np.isfinite(self.fval))}'
f'| {self.fval:+.2E} '
f'| NaN '
f'| NaN '
f'| {self.delta:.1E} '
f'| {norm(self.grad):.1E} '
f'| NaN '
f'| NaN '
f'|{int(np.isfinite(self.fval))}'
)

def log_header(self):
"""
Prints the header for diagnostic information, should complement
:py:func:`Optimizer.log_step`.
"""
iterspaces = len(str(self.get_option(Options.MAXITER))) - 5
iterspaces = max(len(str(self.get_option(Options.MAXITER))) - 5, 0)

self.logger.info(
f'{" " * iterspaces} iter '
f'| fval | fval diff | pred diff | tr ratio '
f'| delta | ||g|| | ||step|| | step | accept'
f'{" " * iterspaces}iter'
f'| fval | fdiff | tr ratio '
f'|tr radius| ||g|| | ||step||| step|acc'
)

def check_finite(self, funout: Optional[Funout] = None):
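The refactored trust-region ratio in `update_tr_radius` can be summarized as a standalone function; the names below are illustrative, not the fides API. The key change is guarding against a non-improving quadratic model instead of carrying `step.qpval > 0` checks through the branching logic:

```python
def trust_region_ratio(fval_old, fval_new, aug, qpval):
    """Ratio of actual to predicted objective decrease for a proposed step.

    aug   -- boundary augmentation term (the 0.5 * s^T diag(dv * |g|) s term)
    qpval -- value of the quadratic model for the step (negative = predicted
             improvement)
    """
    actual_decrease = fval_old - fval_new - aug
    predicted_decrease = -qpval
    if predicted_decrease <= 0.0:
        # The model predicts no improvement: report a zero ratio so the
        # caller shrinks the radius, rather than dividing by a non-positive
        # predicted decrease.
        return 0.0
    return actual_decrease / predicted_decrease
```

With this formulation, a step that realizes exactly the predicted decrease yields a ratio of 1, and a model that predicts no decrease at all is rejected outright.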
2 changes: 1 addition & 1 deletion fides/version.py
@@ -1 +1 @@
__version__ = "0.7.4"
__version__ = "0.7.5"
