This repository contains my implementations of neural networks, following Andrew Ng's Deep Learning Specialization on Coursera.
- Logistic Regression as a Neural Network: a neural network with a single layer containing only one neuron.
- Shallow Neural Network: a neural network with a single layer of 'n' neurons.
- Deep Neural Network: a neural network with 'L' layers of multiple neurons each.
- L2 Regularization: implementation of L2 regularization (ridge regression).
- Dropout: implementation of dropout.
- Gradient Checking: a numerical check that the gradient computation used by gradient descent is correct.
- Exponentially Weighted MA (EWMA): implementation of the exponentially weighted moving average.
- EWMA in Gradient Descent: applying the exponentially weighted moving average within gradient descent.
- Mini Batch Gradient Descent: implementation of gradient descent with mini-batches.
- Gradient Descent With Momentum: implementation of gradient descent with momentum.
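To illustrate the L2 regularization item above, here is a minimal NumPy sketch of the penalty term added to the cost. The function name `l2_cost_term` and the two-layer parameter shapes are illustrative assumptions, not the notebook's actual code; `lambd` follows the course's convention of avoiding Python's `lambda` keyword.

```python
import numpy as np

def l2_cost_term(params, lambd, m):
    """L2 penalty: (lambda / (2m)) * sum of squared weights.

    params: dict of weight matrices, e.g. {"W1": ..., "W2": ...}
    lambd:  regularization strength
    m:      number of training examples
    """
    squared = sum(np.sum(np.square(W)) for W in params.values())
    return (lambd / (2 * m)) * squared

# Usage: add the penalty to the unregularized cross-entropy cost.
params = {"W1": np.ones((3, 2)), "W2": np.ones((1, 3))}
penalty = l2_cost_term(params, lambd=0.1, m=10)  # approximately 0.045 here
```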
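The dropout item can be sketched as inverted dropout, the variant taught in the course: units are zeroed with probability `1 - keep_prob` and the survivors are scaled up so activations keep the same expected value. The helper name and shapes are illustrative, not the notebook's code.

```python
import numpy as np

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout on an activation matrix A.

    Zeroes each unit with probability (1 - keep_prob), then divides
    by keep_prob so the expected activation is unchanged.
    """
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask

# Usage: apply to a layer's activations during training only.
rng = np.random.default_rng(0)
A = np.ones((4, 5))
A_drop, mask = dropout_forward(A, keep_prob=0.8, rng=rng)
# Surviving entries become 1 / 0.8 = 1.25; dropped entries become 0.
```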
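The gradient checking item compares analytic gradients against a centered finite-difference estimate. A minimal sketch, assuming the parameters are flattened into one vector (the function names here are illustrative):

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Relative difference between the analytic gradient grad_f(theta)
    and a centered numerical estimate of the gradient of f at theta."""
    num = np.zeros_like(theta)
    for i in range(theta.size):
        plus = theta.copy()
        minus = theta.copy()
        plus[i] += eps
        minus[i] -= eps
        num[i] = (f(plus) - f(minus)) / (2 * eps)
    ana = grad_f(theta)
    return np.linalg.norm(ana - num) / (np.linalg.norm(ana) + np.linalg.norm(num))

# Usage: f(theta) = theta . theta has gradient 2 * theta, so the
# relative difference should be far below 1e-6 for this correct pair.
theta = np.array([1.0, -2.0, 3.0])
diff = grad_check(lambda t: t @ t, lambda t: 2 * t, theta)
```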
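The EWMA item follows the recurrence v_t = beta * v_{t-1} + (1 - beta) * x_t used throughout the course. A minimal sketch (variable names are mine, not the notebook's):

```python
def ewma(values, beta):
    """Exponentially weighted moving average with v_0 = 0.

    Note: without bias correction (dividing v_t by 1 - beta**t),
    early values are biased toward zero.
    """
    v, out = 0.0, []
    for x in values:
        v = beta * v + (1 - beta) * x
        out.append(v)
    return out

# Usage: smooth a short series with beta = 0.9.
smoothed = ewma([10.0, 12.0, 11.0], beta=0.9)
# First value: 0.9 * 0 + 0.1 * 10 = 1.0, showing the startup bias.
```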
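For the mini-batch item, one common scheme (shuffle the examples, then slice into fixed-size batches) can be sketched as follows; examples are stored as columns, matching the course's convention, and the helper name is an assumption:

```python
import numpy as np

def random_mini_batches(X, Y, batch_size, rng):
    """Shuffle the example columns of X and Y together, then split
    them into mini-batches; the last batch may be smaller."""
    m = X.shape[1]
    perm = rng.permutation(m)
    Xs, Ys = X[:, perm], Y[:, perm]
    return [(Xs[:, k:k + batch_size], Ys[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

# Usage: 10 examples with batch size 3 gives 4 batches, the last of size 1.
rng = np.random.default_rng(1)
X = np.arange(20.0).reshape(2, 10)  # 2 features, 10 examples
Y = np.arange(10.0).reshape(1, 10)
batches = random_mini_batches(X, Y, batch_size=3, rng=rng)
```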
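Finally, the momentum item combines the two previous ideas: the update direction is an EWMA of past gradients. A minimal sketch on a scalar objective, with illustrative hyperparameters (`beta = 0.9`, `lr = 0.1`) rather than the notebook's settings:

```python
def momentum_step(w, grad, v, beta=0.9, lr=0.1):
    """One gradient-descent-with-momentum update:
    v <- beta * v + (1 - beta) * grad;  w <- w - lr * v."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

# Usage: minimize f(w) = w**2 (gradient 2 * w) starting from w = 5.
w, v = 5.0, 0.0
for _ in range(300):
    w, v = momentum_step(w, 2 * w, v)
# w spirals in toward the minimum at 0.
```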