Algorithms written for the Foundations of Machine Learning class at AMMI
This repo contains Jupyter notebooks implementing various machine learning algorithms you will come across as you start your machine learning journey. Most are implemented as Python classes that encapsulate the various parts of each algorithm.
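For illustration, a class in these notebooks might look roughly like the following scikit-learn-style sketch (the class and method names here are illustrative, not taken verbatim from the notebooks):

```python
import numpy as np

class LinearRegression:
    """Ordinary least squares via the normal equations (illustrative sketch)."""

    def fit(self, X, y):
        # Prepend a bias column of ones, then solve (X^T X) w = X^T y.
        Xb = np.hstack([np.ones((X.shape[0], 1)), X])
        self.w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)
        return self

    def predict(self, X):
        Xb = np.hstack([np.ones((X.shape[0], 1)), X])
        return Xb @ self.w
```

Encapsulating `fit` and `predict` in one class keeps the learned parameters (here `self.w`) together with the code that uses them.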
To learn more about the algorithms, you may want to start with a machine learning book. I wrote a Foundations of Machine Learning Note during the course which might be helpful.
To get a copy of the notebooks, run the following command in your terminal:

```bash
git clone https://github.com/ogunlao/foundations_of_ml
```

This fetches the `foundations_of_ml` repository and saves it into a `foundations_of_ml` folder in your current directory.
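Once cloned, you can open the notebooks with Jupyter, for example by running `jupyter notebook` from inside the `foundations_of_ml` folder.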
- Advanced NumPy - just to get comfortable with some NumPy concepts
- Model Selection Algorithm - Backward Selection
- Coordinate Descent Algorithm
- Kullback–Leibler divergence
- Linear Algebra - Vectors and their properties
- Linear Regression
- Perceptron
- Binary Logistic Regression
- Logistic Regression with L1 Regularization
- Logistic Regression with L2 Regularization
- Multiclass Logistic Regression - Softmax Regression
- Logistic Regression with Newton's Method
- Gaussian Discriminant Analysis (GDA)
- Linear Discriminant Analysis (LDA)
- Quadratic Discriminant Analysis (QDA)
- Gaussian Mixture Models (GMM)
- Bernoulli Naive Bayes
- Multiclass Naive Bayes
- K-Fold Cross Validation
- K-Means Clustering
- Newton's Method
- 3D Plots tutorial
- Principal Component Analysis
Most algorithms use NumPy and SciPy internally for vectorized implementations. Note that these algorithms are not optimized for deployment; they serve to explain the concepts in a clear way. For deployment, check out the scikit-learn Python library for a start.
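As an example of what vectorization means here, the cluster-assignment step of K-Means can be written with NumPy broadcasting instead of nested Python loops (a hypothetical sketch, not code lifted from the notebooks):

```python
import numpy as np

def assign_clusters(X, centroids):
    """Return the index of the nearest centroid for each row of X."""
    # (n, 1, d) - (1, k, d) broadcasts to (n, k, d); summing over the last
    # axis gives the (n, k) matrix of squared Euclidean distances.
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)
```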
This project is licensed under the MIT license.
Enjoy.