Description

Machine learning is concerned with the question of how to make computers learn from experience. The ability to learn is not only central to most aspects of intelligent behavior; machine learning techniques have also become key components of many software systems. For example, machine learning techniques are used to create spam filters, analyze customer purchase data, understand natural language, and detect fraudulent credit card transactions.

This course will introduce the fundamental set of techniques and algorithms that constitute machine learning today, ranging from classification methods like decision trees, support vector machines, and neural networks, through structured models like hidden Markov models, to clustering and matrix factorization methods for recommendation. The course will not only discuss individual algorithms and methods but also tie principles and approaches together from a theoretical perspective, and cover the fundamentals of applying machine learning techniques to real-world problems. In particular, the course will cover the following topics:

Generative Models, Bayesian Learning, Linear Regression, Logistic Regression, Perceptrons, Neural Networks, Convolutional Neural Networks, Recurrent Neural Networks, Reinforcement Learning, Graphical Models, Clustering, Latent Linear Models, Support Vector Machines, Decision Trees, Boosting, Random Forests, Hidden Markov Models.

General Information

Prerequisites
CSE 250 and any of EAS 305/308, STA 401/421, MTH 309; or permission of instructor.
Course Website
https://piazza.com/buffalo/fall2024/cse475574/home

All announcements, course material, and related information will be communicated through the course website. Enrollment information will be emailed to your UBIT email address before the start of the class.
Textbook
- Kevin Murphy, Probabilistic Machine Learning: An Introduction, MIT Press, 2022. https://probml.github.io/pml-book/book1.html
The following books are recommended as optional reading:
- Chris Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
- Tom Mitchell, Machine Learning, McGraw-Hill, 1997.
- David MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, Springer, 2009.
- Richard Duda, Peter Hart, and David Stork, Pattern Classification, 2nd ed., John Wiley & Sons, 2001.
Grading Information
Course grades will be computed based on the following factors (subject to change):
1. In-class questions -- 10%
2. Short weekly quizzes (12) -- 20%
3. Programming Assignments (3) -- 30%
4. Mid-term one (in-class, open book/notes) on 10/01/2024 (Tue) -- 10%
5. Mid-term two (in-class, open book/notes) on 11/05/2024 (Tue) -- 10%
6. Final exam (open book/notes) on 12/16/2024 -- 20%

Letter grades will be given in the range of F to A (with minuses and pluses).
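To make the weighting concrete, here is a minimal sketch of how the percentages above combine into a course grade. The component scores used are made-up illustrative numbers, not real grades; all components are assumed to be on a 0-100 scale.

```python
# Sketch: combining the grading weights above into a final course grade.
# Component scores below are hypothetical examples on a 0-100 scale.
weights = {
    "in_class_questions": 0.10,
    "quizzes": 0.20,
    "programming": 0.30,
    "midterm1": 0.10,
    "midterm2": 0.10,
    "final": 0.20,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # the weights cover 100%

scores = {  # hypothetical scores, for illustration only
    "in_class_questions": 90,
    "quizzes": 85,
    "programming": 88,
    "midterm1": 75,
    "midterm2": 80,
    "final": 82,
}

course_grade = sum(weights[k] * scores[k] for k in weights)
print(f"weighted course grade: {course_grade:.1f}")  # prints: weighted course grade: 84.3
```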

Late submission and missed exam policy: No late days are allowed for quizzes. A total of five late days will be allowed across all programming assignments. After those five late days are used, each additional day late incurs a 25% penalty. No make-up final exam will be administered other than for university-approved reasons.

Submission: Students will need to use the UBLearns system to submit all assignments.
Python
-- Installing Python and IPython - http://ipython.org/install.html
-- Python IDE - https://store.enthought.com/downloads/#default
-- More about notebooks - http://ipython.org/notebook.html
-- Python for Developers, a complete book on Python programming by Ricardo Duarte - http://ricardoduarte.github.io/python-for-developers/
-- "Introduction to Machine Learning with Python" by Andreas Mueller and Sarah Guido - https://github.com/amueller/introduction_to_ml_with_python
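As a warm-up connecting Python to one of the course topics, the perceptron update rule can be written in plain Python with no external libraries. This is a minimal illustrative sketch, not course-provided code; the AND-gate data and learning rate are chosen only for demonstration.

```python
# Minimal perceptron sketch in plain Python (no external libraries).
# Learns the logical AND function; data and learning rate are illustrative.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # weights
b = 0.0         # bias
lr = 0.1        # learning rate

def predict(x):
    # Threshold the linear score w . x + b at zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for epoch in range(20):
    for x, y in data:
        err = y - predict(x)  # perceptron update: w <- w + lr * err * x
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

print([predict(x) for x, _ in data])  # prints [0, 0, 0, 1] once converged
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating line in a finite number of updates.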

Staff Office Hours
Name | Office Hours
Mingchen Gao | When? Where?
Shaoshu Su | When? Where?
Chao Wu | When? Where?
Kangxian Xie | When? Where?
Pouya Karimian | When? Where?