This project contains Python implementations of algorithms for the Machine Learning 1 course:
- Linear Regression, Bayesian Linear Regression, L1 and L2 regularization (Assignment 1)
- Logistic Regression, Multi-Layer Neural Networks (Assignment 2)
- Gaussian Process Regression (Assignment 3)
In addition, there are written solutions for five homeworks to accompany them:
- Probability Theory, Basic Linear Algebra and Derivatives, MAP solution for Linear Regression (Homework Week 1)
- Posterior predictive distributions (Homework Week 2)
- Naive Bayes Spam Classification, Multi-class Logistic Regression (Homework Week 3)
- Kernel Outlier Detection (Homework Week 5)
- Mixture Models (Homework Week 6)
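To give a flavor of the Assignment 1 material, here is a minimal sketch of L2-regularized (ridge) linear regression in NumPy. The data, variable names, and regularization strength are all illustrative assumptions, not taken from the assignment code:

```python
import numpy as np

# Hypothetical toy data (not part of the assignment).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

lam = 0.1  # L2 regularization strength (assumed value)
# Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```

With `lam = 0`, this reduces to ordinary least squares; the regularizer shrinks the weights toward zero and keeps the normal equations well-conditioned.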
The Machine Learning 2 course materials include:
Assignments:
- Independent Component Analysis (Assignment 1)
- Graphical Models (Assignment 2)
- Bayesian PCA (Assignment 3) ^
Homeworks:
- Probability Distribution (Homework Week 1)
- Graphical Models (Homework Week 2)
- Factor Graphs (Homework Week 3)
- Expectation Maximization (Homework Week 4)
- Variational EM, Sampling Methods (Homework Week 5)
- Causality (Homework Week 6)
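To illustrate the Week 4 homework topic, here is a minimal EM sketch for a two-component one-dimensional Gaussian mixture. This is an assumed toy setup for illustration, not the homework solution:

```python
import numpy as np

# Synthetic data from two well-separated Gaussians (assumed parameters).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

mu = np.array([-1.0, 1.0])   # initial means
var = np.array([1.0, 1.0])   # initial variances
pi = np.array([0.5, 0.5])    # initial mixing weights

for _ in range(50):
    # E-step: responsibilities r[n, k] proportional to pi_k * N(x_n | mu_k, var_k)
    dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    Nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)
```

Each iteration is guaranteed not to decrease the data log-likelihood; here the estimated means converge near the generating values of -2 and 3.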
Dependencies:
- IPython
- numpy
- scipy
- matplotlib
^ Bayesian PCA may not work properly; most likely there is an error in the derivation.
Copyright (c) 2015, 2016 Minh Ngo
Copyright (c) 2015, 2016 Arthur Bražinskas (Python code only)
This project is distributed under the MIT license. It is part of the Machine Learning 1 and Machine Learning 2 courses taught by Ted Meeds, Joris Mooij, and Max Welling at the University of Amsterdam. If you are a student, please follow the UvA regulations governing Fraud and Plagiarism.