
Optimization-for-Machine-Learning-Project-Code

Optimization for Machine Learning Course Project

Homework 1

We consider the ridge regression problem with randomly generated data. The goal is to implement gradient descent and to experiment with different strong-convexity settings, different learning rates, and different regularization weights.

Implement the following functions (a NumPy sketch follows the list):

  1. compute the smoothness parameter
  2. compute the closed-form solution of the ridge regression problem
  3. compute the objective function value
  4. compute the gradient of the objective function
  5. run the gradient descent algorithm for t iterations
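
A minimal sketch of these pieces, assuming the objective f(w) = (1/2n) ||Xw - y||^2 + (lam/2) ||w||^2 and randomly generated data; the exact scaling and function signatures used in the homework may differ.

```python
import numpy as np

def smoothness_param(X, lam):
    """Smoothness constant L: largest eigenvalue of the Hessian (1/n) X^T X + lam I."""
    n = X.shape[0]
    return np.linalg.eigvalsh(X.T @ X / n).max() + lam

def ridge_closed_form(X, y, lam):
    """Closed-form solution: solve ((1/n) X^T X + lam I) w = (1/n) X^T y."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)

def objective(w, X, y, lam):
    n = X.shape[0]
    r = X @ w - y
    return 0.5 * r @ r / n + 0.5 * lam * w @ w

def gradient(w, X, y, lam):
    n = X.shape[0]
    return X.T @ (X @ w - y) / n + lam * w

def gradient_descent(X, y, lam, eta, T):
    """Run T iterations of gradient descent with step size eta, tracking the objective."""
    w = np.zeros(X.shape[1])
    history = [objective(w, X, y, lam)]
    for _ in range(T):
        w -= eta * gradient(w, X, y, lam)
        history.append(objective(w, X, y, lam))
    return w, history

# Example on random data with step size 1/L; the iterate should approach the closed form.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
y = X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(200)
lam = 0.1
w_gd, hist = gradient_descent(X, y, lam, eta=1.0 / smoothness_param(X, lam), T=500)
print(np.linalg.norm(w_gd - ridge_closed_form(X, y, lam)))
```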

Homework 2

We consider optimization with the smoothed hinge loss and randomly generated data. The goal is to implement accelerated gradient methods (heavy ball and Nesterov's acceleration) and to experiment with different strong-convexity settings and learning rates.

Implement the following functions (a sketch of items 1, 2, and 4 follows the list):

  1. implement the smoothed hinge loss (SVM) objective function
  2. compute the gradient of the smoothed hinge loss objective function
  3. heavy ball method with adaptive beta (practically simplified)
  4. Nesterov's acceleration method
  5. Nesterov's acceleration with adaptive beta (practically simplified)
  6. Nesterov's general acceleration method (applicable to the smooth, non-strongly convex case)
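
A possible sketch of the smoothed hinge loss (smoothing parameter gamma, labels in {-1, +1}, plus an L2 term for strong convexity), its gradient, and Nesterov's acceleration for the strongly convex case. The constants L and mu are assumed to be supplied by the caller, and the names below are illustrative rather than the homework's required signatures.

```python
import numpy as np

def smoothed_hinge_objective(w, X, y, gamma, lam):
    """Average gamma-smoothed hinge loss plus (lam/2)*||w||^2."""
    z = y * (X @ w)                                   # margins
    loss = np.where(z >= 1, 0.0,
           np.where(z <= 1 - gamma, 1 - z - gamma / 2,
                    (1 - z) ** 2 / (2 * gamma)))
    return loss.mean() + 0.5 * lam * w @ w

def smoothed_hinge_gradient(w, X, y, gamma, lam):
    """Gradient of the objective above; dz is the derivative of the loss w.r.t. the margin."""
    z = y * (X @ w)
    dz = np.where(z >= 1, 0.0,
         np.where(z <= 1 - gamma, -1.0, -(1 - z) / gamma))
    return X.T @ (dz * y) / X.shape[0] + lam * w

def nesterov(X, y, gamma, lam, L, mu, T):
    """Nesterov's acceleration for an L-smooth, mu-strongly convex objective,
    with constant momentum beta = (sqrt(L/mu) - 1) / (sqrt(L/mu) + 1)."""
    beta = (np.sqrt(L / mu) - 1) / (np.sqrt(L / mu) + 1)
    w = v = np.zeros(X.shape[1])
    for _ in range(T):
        w_next = v - smoothed_hinge_gradient(v, X, y, gamma, lam) / L
        v = w_next + beta * (w_next - w)
        w = w_next
    return w
```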

Homework 3

Download the data into the mnist directory (it contains class 1 (positive) versus class 7 (negative) from the MNIST data). We consider composite convex optimization with the L1-L2 regularized logistic regression objective function.

Implement the following functions (a sketch of the proximal mapping and proximal gradient descent follows the list):

  1. compute the proximal gradient of the composite objective f(w) + g(w)
  2. compute the proximal mapping
  3. compute the gradient of the dual (conjugate) regularizer g^*(alpha)
  4. regularized dual averaging algorithm
  5. primal-dual ascent method
  6. proximal gradient descent algorithm
  7. proximal gradient descent with the AG learning rate
  8. Nesterov's accelerated proximal gradient algorithm
  9. Nesterov's accelerated proximal gradient algorithm with the AG learning rate
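
A rough sketch of two building blocks the methods above share, assuming g(w) = lam1*||w||_1 + (lam2/2)*||w||^2 and the logistic loss with labels in {-1, +1}: the closed-form prox (soft-thresholding followed by shrinkage) and a plain proximal gradient loop. Step sizes and stopping rules are left to the homework.

```python
import numpy as np

def prox_l1_l2(v, eta, lam1, lam2):
    """Proximal mapping of g(w) = lam1*||w||_1 + (lam2/2)*||w||^2 with step eta:
    soft-threshold each coordinate, then shrink by 1 / (1 + eta*lam2)."""
    return np.sign(v) * np.maximum(np.abs(v) - eta * lam1, 0.0) / (1.0 + eta * lam2)

def logistic_gradient(w, X, y):
    """Gradient of f(w) = (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)), labels in {-1, +1}."""
    z = y * (X @ w)
    return -X.T @ (y / (1.0 + np.exp(z))) / X.shape[0]

def proximal_gradient_descent(X, y, lam1, lam2, eta, T):
    """Minimize f(w) + g(w): take a gradient step on the smooth logistic loss f,
    then apply the proximal mapping of the L1-L2 regularizer g."""
    w = np.zeros(X.shape[1])
    for _ in range(T):
        w = prox_l1_l2(w - eta * logistic_gradient(w, X, y), eta, lam1, lam2)
    return w
```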

Homework 4

Download the data into the mnist directory (it contains class 1 (positive) versus class 7 (negative) from the MNIST data). We consider the smoothed hinge-loss (SVM) objective function with an L1-L2 regularizer.

Implement the following functions (a sketch of the dual-objective building blocks follows the list):

  1. compute the dual objective of the L1-L2 regularizer, -g^*(alpha)
  2. compute the dual coordinate ascent update at a data point (x, y) for the smoothed hinge-loss (SVM) objective function
  3. compute the dual objective of the smoothed hinge-loss (SVM) objective function
  4. implement primal proximal coordinate descent
  5. implement the stochastic primal-dual coordinate method
  6. implement the accelerated linearized alternating direction method of multipliers (ADMM)
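
A sketch of two dual quantities used above, assuming the same regularizer g(w) = lam1*||w||_1 + (lam2/2)*||w||^2 as in Homework 3 and the gamma-smoothed hinge loss; the coordinate-ascent, primal-dual, and ADMM loops themselves are omitted, and the scaling of the full dual objective may differ in the homework.

```python
import numpy as np

def neg_dual_regularizer(alpha, lam1, lam2):
    """-g^*(alpha) for g(w) = lam1*||w||_1 + (lam2/2)*||w||^2; the conjugate is
    g^*(alpha) = ||soft_threshold(alpha, lam1)||^2 / (2*lam2)."""
    s = np.sign(alpha) * np.maximum(np.abs(alpha) - lam1, 0.0)
    return -(s @ s) / (2.0 * lam2)

def smoothed_hinge_conjugate(beta, gamma):
    """Convex conjugate of the gamma-smoothed hinge loss, finite only on [-1, 0]:
    ell^*(beta) = beta + (gamma/2)*beta^2; used inside the SVM dual objective."""
    beta = np.asarray(beta, dtype=float)
    if np.any(beta < -1.0) or np.any(beta > 0.0):
        return np.inf
    return np.sum(beta + 0.5 * gamma * beta ** 2)
```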
