100 Days of Machine Learning Coding as proposed by Siraj Raval
Get the datasets from here
Check out the code from here.
Check out the code from here.
Check out the code from here.
Moving forward with #100DaysOfMLCode, today I dived deeper into what Logistic Regression actually is and the math involved behind it. Learned how the cost function is calculated and then how to apply the gradient descent algorithm to the cost function to minimize the error in prediction.
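As a rough note to self, here is a minimal NumPy sketch of that idea (the variable names, learning rate, and epoch count are illustrative assumptions, not taken from the day's actual code): the cross-entropy cost is driven down by repeatedly stepping the weights against its gradient.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def cost(X, y, w):
    # cross-entropy cost of logistic regression predictions
    h = sigmoid(X @ w)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient_descent(X, y, lr=0.1, epochs=1000):
    # illustrative settings: lr and epochs would normally be tuned
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        h = sigmoid(X @ w)
        grad = X.T @ (h - y) / len(y)   # gradient of the cost w.r.t. the weights
        w -= lr * grad                  # step downhill to reduce the prediction error
    return w
```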
Due to time constraints, I will now be posting an infographic on alternate days.
Also, if someone wants to help me out with the documentation of the code, already has some experience in the field, and knows Markdown for GitHub, please contact me on LinkedIn :).
Check out the Code here
#100DaysOfMLCode To clarify my understanding of logistic regression, I searched the internet for a good resource or article and came across this article (https://towardsdatascience.com/logistic-regression-detailed-overview-46c4da4303bc) by Saishruthi Swaminathan.
It gives a detailed description of Logistic Regression. Do check it out.
Got an intuition of what SVM is and how it is used to solve classification problems.
Learned more about how SVM works and started implementing the K-NN algorithm.
Implemented the K-NN algorithm for classification. #100DaysOfMLCode The Support Vector Machine infographic is halfway complete. Will update it tomorrow.
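A hedged scikit-learn sketch of the same idea (the Iris dataset, the train/test split, and k=5 are illustrative choices, not necessarily what the linked code uses):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)   # classify by majority vote of the 5 nearest points
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))            # accuracy on the held-out data
```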
Continuing with #100DaysOfMLCode today I went through the Naive Bayes classifier. I am also implementing the SVM in python using scikit-learn. Will update the code soon.
Today I implemented SVM on linearly separable data using the Scikit-Learn library. Scikit-Learn provides the SVC classifier, which we use to achieve this task. Will be using the kernel trick in the next implementation. Check the code here.
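A minimal sketch of that setup, assuming a toy linearly separable dataset from make_blobs (not the dataset used in the linked code):

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=6)  # toy linearly separable data

clf = SVC(kernel="linear", C=1.0)   # linear kernel: a straight-line decision boundary
clf.fit(X, y)
print(clf.score(X, y))
```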
Learned about different types of Naive Bayes classifiers. Also started the lectures by Bloomberg. The first one in the playlist was Black Box Machine Learning. It gives a complete overview of prediction functions, feature extraction, learning algorithms, performance evaluation, cross-validation, sample bias, nonstationarity, overfitting, and hyperparameter tuning.
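For my own reference, a minimal sketch of the common scikit-learn variants (the toy arrays below are made up purely for illustration): GaussianNB for continuous features, MultinomialNB for count data, and BernoulliNB for binary features.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB, MultinomialNB, BernoulliNB

X_cont = np.array([[1.2, 0.7], [0.3, 2.1], [2.2, 1.1], [0.1, 0.4]])   # continuous features
X_counts = np.array([[3, 0, 1], [0, 2, 4], [1, 1, 0], [0, 3, 2]])      # e.g. word counts
X_binary = (X_counts > 0).astype(int)                                  # presence/absence features
y = np.array([0, 1, 0, 1])

GaussianNB().fit(X_cont, y)       # assumes features are normally distributed within each class
MultinomialNB().fit(X_counts, y)  # suited to count data such as bag-of-words vectors
BernoulliNB().fit(X_binary, y)    # suited to binary/boolean features
```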
Using the Scikit-Learn library, implemented the SVM algorithm along with a kernel function, which maps our data points into a higher-dimensional space to find the optimal hyperplane.
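A short sketch of the kernel trick, assuming an RBF kernel and scikit-learn's built-in make_moons data (both are my illustrative choices, not necessarily what the repo's code uses):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)  # not linearly separable in 2D

clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # the RBF kernel implicitly maps points to a higher dimension
clf.fit(X, y)
print(clf.score(X, y))                         # the boundary found in that space separates the moons
```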
Completed the whole of Week 1 and Week 2 in a single day. Learned about logistic regression as a neural network.
Completed Course 1 of the Deep Learning Specialization. Implemented a neural net in Python.
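As a rough outline of what such an implementation looks like, a tiny one-hidden-layer network trained on XOR with plain NumPy (the architecture, learning rate, and iteration count are my own illustrative choices, not the course assignment):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])              # XOR targets

W1, W2 = rng.normal(size=(2, 4)), rng.normal(size=(4, 1))
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1)                         # forward pass: hidden layer
    out = sigmoid(h @ W2)                       # forward pass: output layer
    d_out = (out - y) * out * (1 - out)         # backprop: output-layer error signal
    d_h = (d_out @ W2.T) * h * (1 - h)          # backprop: hidden-layer error signal
    W2 -= 0.5 * h.T @ d_out                     # gradient descent updates
    W1 -= 0.5 * X.T @ d_h

print(out.round(3))                             # ideally approaches [0, 1, 1, 0]
```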
Started Lecture 1 of 18 of Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa. It was basically an introduction to the upcoming lectures. He also explained the Perceptron Algorithm.
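To remember the perceptron learning rule, a compact sketch under my own assumptions (labels in {-1, +1}, a bias folded into the weights, a fixed number of passes):

```python
import numpy as np

def perceptron(X, y, epochs=100):
    # y must contain +1/-1 labels; a column of ones adds a bias term
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (xi @ w) <= 0:   # misclassified (or on the boundary)
                w += yi * xi         # nudge the boundary toward the correct side
    return w
```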
Completed the Week 1 of Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.
Watched some tutorials on how to do web scraping using Beautiful Soup in order to collect data for building a model.
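A minimal scraping sketch, assuming the requests and beautifulsoup4 packages are installed; the URL is only a placeholder, not a site I actually scraped:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com"                      # placeholder URL
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# collect the text and target of every link on the page
links = [(a.get_text(strip=True), a.get("href")) for a in soup.find_all("a")]
print(links[:10])
```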
Lecture 2 of 18 of Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa. Learned about the Hoeffding Inequality.
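For reference (my own summary of the standard statement, not a transcript of the lecture), the inequality bounds how far the in-sample frequency ν can drift from the true frequency μ over N independent samples:

```latex
P\left[\,\lvert \nu - \mu \rvert > \epsilon\,\right] \;\le\; 2e^{-2\epsilon^{2} N}
```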
Lecture 3 of the Bloomberg ML course introduced some of the core concepts like input space, action space, outcome space, prediction functions, loss functions, and hypothesis spaces.
Check the code here.
Found an amazing channel on YouTube, 3Blue1Brown. It has a playlist called Essence of Linear Algebra. Started off by completing 4 videos, which gave a complete overview of Vectors, Linear Combinations, Spans, Basis Vectors, Linear Transformations, and Matrix Multiplication.
Link to the playlist here.
Continuing with the playlist, completed the next 4 videos discussing 3D Transformations, Determinants, the Inverse Matrix, Column Space, Null Space, and Non-Square Matrices.
Link to the playlist here.
In the 3Blue1Brown playlist, completed another 3 videos from the Essence of Linear Algebra. Topics covered were the Dot Product and the Cross Product.
Link to the playlist here.
Completed the whole playlist today, videos 12-14. Really an amazing playlist to refresh the concepts of Linear Algebra. Topics covered were the change of basis, Eigenvectors and Eigenvalues, and Abstract Vector Spaces.
Link to the playlist here.
After completing the playlist Essence of Linear Algebra by 3Blue1Brown, YouTube suggested a series of videos by the same channel. Already impressed by the previous series on linear algebra, I dived straight into it. Completed about 5 videos on topics such as Derivatives, the Chain Rule, the Product Rule, and the derivative of the exponential.
Link to the playlist here.
Watched 2 videos on Implicit Differentiation and Limits from the playlist Essence of Calculus.
Link to the playlist here.
Watched the remaining 4 videos covering topics like Integration and Higher-Order Derivatives.
Link to the playlist here.
Check the code here.
An amazing video on neural networks by the 3Blue1Brown YouTube channel. This video gives a good understanding of neural networks and uses the handwritten digit dataset to explain the concept. Link to the video.
Part two of neural networks by the 3Blue1Brown YouTube channel. This video explains the concepts of Gradient Descent in an interesting way. A must watch and highly recommended. Link to the video.
Part three of neural networks by the 3Blue1Brown YouTube channel. This video mostly discusses partial derivatives and backpropagation. Link to the video.
Part four of neural networks by the 3Blue1Brown YouTube channel. The goal here is to represent, in somewhat more formal terms, the intuition for how backpropagation works; the video mostly discusses the partial derivatives and backpropagation. Link to the video.
Link To the video.
Link To the video.
Link To the video.
Link To the video.
Moved on to Unsupervised Learning and studied Clustering. Working on my website, check it out: avikjain.me. Also found a wonderful animation that can help to easily understand K-Means Clustering. Link
Implemented K-Means Clustering. Check the code here.
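A brief scikit-learn sketch of the algorithm in use (the blob data and k=3 are illustrative, not the dataset from the linked code):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=42)

km = KMeans(n_clusters=3, n_init=10, random_state=42)  # alternates assigning points and moving centroids
labels = km.fit_predict(X)
print(km.cluster_centers_)                             # the three centroids found
```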
Got a new book, "Python Data Science Handbook" by Jake VanderPlas. Check the Jupyter notebooks here.
Started with Chapter 2: Introduction to NumPy. Covered topics like data types, NumPy arrays, and computations on NumPy arrays. A short example follows the notebook links below.
Check the code -
Introduction to NumPy
Understanding Data Types in Python
The Basics of NumPy Arrays
Computation on NumPy Arrays: Universal Functions
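A few lines capturing the gist of those notebooks (the values are purely illustrative):

```python
import numpy as np

a = np.array([1, 2, 3], dtype=np.float64)   # fixed-dtype array, unlike a plain Python list
b = np.arange(6).reshape(2, 3)              # array creation and reshaping
print(a.dtype, b.shape)
print(np.add(a, b))                         # universal functions apply element-wise
print(np.exp(a))                            # and operate on whole arrays at once
```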
Chapter 2: Aggregations, Comparisons, and Broadcasting (a short example follows the notebook links below)
Link to Notebook:
Aggregations: Min, Max, and Everything In Between
Computation on Arrays: Broadcasting
Comparisons, Masks, and Boolean Logic
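A quick sketch of those three ideas together (the random matrix is illustrative):

```python
import numpy as np

M = np.random.default_rng(0).random((3, 4))

print(M.min(), M.max(axis=0))        # aggregations: overall minimum, per-column maxima
print((M > 0.5).sum())               # comparison produces a boolean mask; summing counts the Trues
print(M - M.mean(axis=0))            # broadcasting: subtract the column means from every row
```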
Chapter 2: Fancy Indexing, Sorting Arrays, Structured Data (a small sketch follows the notebook links below)
Link to Notebook:
Fancy Indexing
Sorting Arrays
Structured Data: NumPy's Structured Arrays
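A small sketch touching each of those topics (all values are made up for illustration):

```python
import numpy as np

x = np.array([51, 92, 14, 71, 60])
print(x[[0, 2, 4]])                            # fancy indexing with an array of indices
print(np.sort(x), np.argsort(x))               # sorted values and the indices that would sort x

# structured array: a compound dtype mixing strings and integers
people = np.array([("Alice", 25), ("Bob", 30)],
                  dtype=[("name", "U10"), ("age", "i4")])
print(people["name"], people[people["age"] > 26])
```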
Chapter 3: Data Manipulation with Pandas
Covered various topics like Pandas Objects, Data Indexing and Selection, Operating on Data, Handling Missing Data, Hierarchical Indexing, and Concat and Append. A short example follows the notebook links below.
Link To the Notebooks:
Data Manipulation with Pandas
Introducing Pandas Objects
Data Indexing and Selection
Operating on Data in Pandas
Handling Missing Data
Hierarchical Indexing
Combining Datasets: Concat and Append
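A compact sketch of a few of those operations (the toy DataFrame is my own example):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"city": ["Delhi", "Mumbai", "Pune"],
                   "pop":  [19.0, 20.4, np.nan]})

print(df.loc[df["pop"] > 19.5])                    # data indexing and selection
print(df["pop"].fillna(df["pop"].mean()))          # handling missing data
extra = pd.DataFrame({"city": ["Chennai"], "pop": [10.0]})
print(pd.concat([df, extra], ignore_index=True))   # combining datasets with concat
```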
Chapter 3: Completed the following topics: Merge and Join, Aggregation and Grouping, and Pivot Tables (a small sketch follows the notebook links below).
Combining Datasets: Merge and Join
Aggregation and Grouping
Pivot Tables
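A small example tying the three together (the sales/prices tables are made up for illustration):

```python
import pandas as pd

sales = pd.DataFrame({"store": ["A", "A", "B", "B"],
                      "item":  ["pen", "book", "pen", "book"],
                      "qty":   [3, 2, 5, 7]})
prices = pd.DataFrame({"item": ["pen", "book"], "price": [1.5, 8.0]})

merged = sales.merge(prices, on="item")                   # merge/join on a shared key
print(merged.groupby("store")["qty"].sum())               # aggregation and grouping
print(merged.pivot_table(values="qty", index="store",
                         columns="item", aggfunc="sum"))  # pivot table view
```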
Chapter 3: Vectorized String Operations, Working with Time Series (a small sketch follows the notebook links below)
Links to Notebooks:
Vectorized String Operations
Working with Time Series
High-Performance Pandas: eval() and query()
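A short sketch of each topic (all values are illustrative):

```python
import pandas as pd

s = pd.Series(["alice SMITH", "bob jones"])
print(s.str.title())                                  # vectorized string operations

ts = pd.Series(range(4), index=pd.date_range("2018-08-01", periods=4, freq="D"))
print(ts.resample("2D").sum())                        # working with time series: resampling

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
print(df.query("a + b > 6"))                          # query(): expression-based row selection (pairs with eval())
```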
Chapter 4: Visualization with Matplotlib
Learned about Simple Line Plots, Simple Scatter Plots, and Density and Contour Plots. A short example follows the notebook links below.
Links to Notebooks:
Visualization with Matplotlib
Simple Line Plots
Simple Scatter Plots
Visualizing Errors
Density and Contour Plots
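A minimal sketch combining a line plot and a scatter plot (the data is synthetic):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 100)
plt.plot(x, np.sin(x), label="sin(x)")        # simple line plot
plt.scatter(x[::10], np.cos(x[::10]),
            label="cos(x) samples")           # simple scatter plot
plt.legend()
plt.show()
```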
Chapter 4: Visualization with Matplotlib
Learned about Histograms, how to customize plot legends and colorbars, and building Multiple Subplots. A short example follows the notebook links below.
Links to Notebooks:
Histograms, Binnings, and Density
Customizing Plot Legends
Customizing Colorbars
Multiple Subplots
Text and Annotation
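A quick sketch of a histogram, a colorbar, and subplots in one figure (the random data is illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

fig, axes = plt.subplots(1, 2, figsize=(8, 3))      # multiple subplots
axes[0].hist(rng.normal(size=1000), bins=30)        # histogram with custom binning
im = axes[1].imshow(rng.random((10, 10)))
fig.colorbar(im, ax=axes[1])                        # attaching a colorbar to one subplot
axes[1].set_title("random grid")                    # text and annotation
plt.show()
```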
Chapter 4: Covered Three-Dimensional Plotting in Matplotlib. A short example follows the notebook link below.
Links to Notebooks:
Three-Dimensional Plotting in Matplotlib
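A minimal 3D example, assuming a reasonably recent Matplotlib where the "3d" projection is registered by default:

```python
import numpy as np
import matplotlib.pyplot as plt

ax = plt.figure().add_subplot(projection="3d")      # request 3D axes
theta = np.linspace(0, 4 * np.pi, 200)
ax.plot(np.cos(theta), np.sin(theta), theta)        # a simple 3D line (a helix)
plt.show()
```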
Studied Hierarchical Clustering. Check out this amazing visualization.
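A brief sketch of agglomerative clustering with a dendrogram, using SciPy on made-up points (purely illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage

X = np.random.default_rng(0).random((12, 2))        # illustrative 2D points
Z = linkage(X, method="ward")                       # record of successive cluster merges
dendrogram(Z)                                       # tree showing how points merge bottom-up
plt.show()
```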