CSE 446: Machine Learning Methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling: decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering. Prerequisite: CSE 332; either STAT 390, STAT 391, or CSE 312.
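As a small taste of one of the supervised methods listed above, nearest-neighbor classification can be sketched in a few lines (an illustrative toy example, not course material; the data is invented):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)        # Euclidean distances
    nearest = np.argsort(dists)[:k]                    # indices of k closest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                   # majority label

# Toy 2-D data: class 0 near the origin, class 1 near (5, 5)
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [4.9, 5.0]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.1, 0.2])))  # → 0
```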
E E 511: Introduction to Statistical Learning Covers classification and estimation of vector observations, including both parametric and nonparametric approaches. Includes classification with likelihood functions and general discriminant functions, density estimation, supervised and unsupervised learning, feature reduction, model selection, and performance estimation. Prerequisite: either E E 505 or CSE 515.
E E 512: Graphical Models in Pattern Recognition Bayesian networks, Markov random fields, factor graphs, Markov properties, standard models as graphical models, graph theory (e.g., moralization and triangulation), probabilistic inference (including Pearl's belief propagation, Hugin, and Shafer-Shenoy), junction trees, dynamic Bayesian networks (including hidden Markov models), learning new models, models in practice. Prerequisite: E E 508; E E 511.
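One of the simplest instances of probabilistic inference in a dynamic Bayesian network mentioned above is the forward algorithm for a hidden Markov model, which computes the probability of an observation sequence by sum-product recursion (a minimal sketch with invented toy parameters):

```python
import numpy as np

def hmm_forward(pi, A, B, obs):
    """Forward algorithm: P(observation sequence) for a discrete HMM.

    pi[i]   = P(state_0 = i)
    A[i, j] = P(state_{t+1} = j | state_t = i)
    B[i, k] = P(obs = k | state = i)
    """
    alpha = pi * B[:, obs[0]]              # base case: joint of state and first obs
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]      # sum over previous state, absorb new obs
    return alpha.sum()                     # marginalize out the final state

# Two hidden states, two observation symbols (toy numbers)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
p = hmm_forward(pi, A, B, [0, 1, 0])
```

The same recursion is a special case of belief propagation on a chain-structured graph, which is why HMMs appear in this course as graphical models.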
CSE 515: Statistical Methods In Computer Science Introduction to the probabilistic and statistical techniques used in modern computer systems. Graphical models, probabilistic inference, statistical learning, sequential models, decision theory. Prerequisite: either STAT 341 or STAT 391, and graduate standing in computer science, or permission of instructor.
STAT 527: Nonparametric Regression and Classification Covers techniques for smoothing and classification including spline models, kernel methods, generalized additive models, and classification and regression trees. Describes measures of predictive performance, along with methods for balancing bias and variance. Bayesian nonparametric methods for regression and density estimation (e.g., Gaussian processes and Dirichlet processes) are also covered.
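A one-function example of the kernel methods covered here is the Nadaraya-Watson smoother, which estimates a regression function as a locally weighted average (an illustrative sketch; the bandwidth and toy data are invented):

```python
import numpy as np

def nw_smooth(x_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression: Gaussian-weighted average of
    y_train, with weights decaying with distance from x_query."""
    w = np.exp(-0.5 * ((x_train - x_query) / bandwidth) ** 2)
    return np.sum(w * y_train) / np.sum(w)

x = np.linspace(0, 10, 50)
y = np.sin(x)                              # noiseless toy target
yhat = nw_smooth(x, y, 5.0, bandwidth=0.5)
```

The bandwidth controls the bias-variance trade-off the course describes: a small bandwidth tracks the data closely (low bias, high variance), a large one oversmooths.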
STAT 535: Statistical Learning: Modeling, Prediction, and Computing I Covers statistical learning over discrete multivariate domains, exemplified by graphical probability models. Emphasizes the algorithmic and computational aspects of these models. Includes additional topics in probability and statistics of discrete structures, general purpose discrete optimization algorithms like dynamic programming and minimum spanning tree, and applications to data analysis. Prerequisite: experience with programming in a high level language.
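Of the general-purpose discrete optimization algorithms named above, minimum spanning tree is compact enough to sketch; this is Prim's algorithm with a heap (a toy graph, invented for illustration):

```python
import heapq

def prim_mst(adj):
    """Prim's algorithm: total weight of a minimum spanning tree.

    adj: {node: [(weight, neighbor), ...]} for a connected undirected graph.
    """
    start = next(iter(adj))
    visited = {start}
    heap = list(adj[start])
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(adj):
        w, u = heapq.heappop(heap)         # cheapest edge leaving the tree
        if u in visited:
            continue
        visited.add(u)
        total += w
        for edge in adj[u]:
            heapq.heappush(heap, edge)
    return total

graph = {
    'a': [(1, 'b'), (4, 'c')],
    'b': [(1, 'a'), (2, 'c'), (7, 'd')],
    'c': [(4, 'a'), (2, 'b'), (3, 'd')],
    'd': [(7, 'b'), (3, 'c')],
}
print(prim_mst(graph))  # → 6  (edges a-b, b-c, c-d)
```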
STAT 538: Statistical Learning: Modeling, Prediction, and Computing II Reviews optimization and convex optimization in its relation to statistics. Covers the basics of unconstrained and constrained convex optimization, basics of clustering and classification, entropy, KL divergence and exponential family models, duality, modern learning algorithms like boosting, support vector machines, and variational approximations in inference. Prerequisite: experience with programming in a high level language.
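KL divergence, one of the information-theoretic quantities listed above, reduces to a short sum for discrete distributions (a minimal sketch with invented toy distributions):

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                           # convention: 0 * log(0/q) = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.5]
q = [0.9, 0.1]
d = kl_divergence(p, q)                    # positive, since p != q
```

Note the asymmetry: KL(p || q) and KL(q || p) generally differ, a point that matters for the variational approximations covered later in the course.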
CSE 546: Machine Learning Explores methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling; decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering. Prerequisite: either STAT 341, STAT 391, or equivalent, or permission of instructor.
CSE 547: Machine Learning for Big Data Machine learning and statistical techniques for analyzing datasets of massive size and dimensionality. Representations include regularized linear models, graphical models, matrix factorization, sparsity, clustering, and latent factor models. Algorithms include sketching, random projections, hashing, fast nearest-neighbor search, large-scale online learning, and parallel learning (MapReduce, GraphLab). Prerequisite: either STAT 535 or CSE 546. This course is cross-listed as STAT 548.
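Random projection, one of the dimensionality-reduction tools listed above, is just multiplication by a scaled Gaussian matrix; by the Johnson-Lindenstrauss lemma, pairwise distances are approximately preserved (an illustrative sketch; the sizes and seed are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(X, k):
    """Project rows of X from d dimensions down to k using a Gaussian
    random matrix scaled by 1/sqrt(k), so that pairwise Euclidean
    distances are approximately preserved."""
    d = X.shape[1]
    R = rng.normal(size=(d, k)) / np.sqrt(k)
    return X @ R

X = rng.normal(size=(100, 1000))     # 100 points in 1000 dimensions
Z = random_projection(X, 200)        # sketched down to 200 dimensions
```

The appeal for massive data is that the projection is data-independent and cheap, unlike PCA, which must first estimate a covariance structure.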