Johns Hopkins University - Spring 2015
This class is aimed at senior undergraduate and graduate students who want to do research in machine learning or data-intensive applications (e.g., computer vision, robotics, computational biology, astrophysics).
The class is structured as a series of questions that data scientists must answer in tackling real-world problems. We will cover techniques in great depth. Students are expected to have a fair degree of comfort with mathematics, statistics, and programming.
To gauge whether you have the required background, you should have at least taken:
Introduction to Probability & Statistics and Introduction to Linear Algebra, OR
Introduction to Machine Learning
It may also help to skim some of the assigned reading to get a sense of the material.
Topics covered in class by date:
1/27: Logistics; Overview of machine learning; Introduction to Generative Models.
1/29: In class exercise on spam versus non-spam classification
2/3: Review spam versus non-spam classification; Model Selection; Marginal Likelihood; Bayes Factor; BIC
2/5: Hierarchical Priors; Linear Regression; Ridge Regression; Robust Estimation; Penalized objectives
2/10: Logistic Regression exercise; Gradient Descent; Line Search; compared spam/non-spam classifiers trained using Naive Bayes versus Logistic Regression
2/12: Directed Acyclic Graphs, Conditional Parameterization, I-map, G-map, Bayesian Networks.
2/17: SNOW DAY
2/19: I-equivalence class, D-sep, Active Trails, Soundness and Completeness of D-sep, P-map, minimal I-map
2/24: Learning in the fully observable case; in-class exercise to show that the BN MLE factorizes; Bayesian estimate of BN; D-sep of parameters.
2/26: Learning with missing data; EM; missingness
3/3: Wrap up EM.
3/5: SNOW DAY
3/10: Exact inference. Showed that exact inference is NP-complete. Started the discussion of cluster graphs and variable elimination.
3/12: Cluster graph, clique trees, family preservation, running intersection, message passing, incremental inference.
3/17: SPRING BREAK
3/19: SPRING BREAK
3/24: Approximate Inference
3/26: Away for CI meeting
3/31: Iain Murray's MCMC tutorial
4/2: Gibbs sampling in class
4/7: MCMC, detailed balance (derived in class), regular chains
4/9: ICU time series example
4/14: Intro to Variational Inference
4/16: Structured Variational (Derive Clique Tree Inference via a Variational Formulation)
4/21: Structured output prediction
4/23: Solved an example problem in class on learning functional networks from time series of neuron activation data.
4/28: SCHOOL CLOSED
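The Gibbs sampling and detailed-balance material above (4/2 and 4/7) can be illustrated with a minimal sketch. This example is not part of the course materials; the function name and parameters are illustrative. It alternates exact draws from the two Gaussian full conditionals of a standard bivariate normal with correlation rho, which is the textbook setting where each conditional is available in closed form:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself Gaussian:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    so each sweep draws exactly from the two conditionals in turn.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for t in range(burn_in + n_samples):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        if t >= burn_in:             # discard burn-in sweeps
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
mean_x = sum(s[0] for s in samples) / len(samples)
emp_corr = sum(a * b for a, b in samples) / len(samples)
```

After burn-in, the empirical mean of each coordinate should be near 0 and the empirical E[xy] near rho, since the chain's stationary distribution is the target bivariate normal; because each step samples from an exact conditional, the Gibbs kernel satisfies detailed balance with respect to that target.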