CS583: Probabilistic Graphical Models - Spring 2014

Course Description

This course covers probabilistic graphical models -- powerful and interpretable models for reasoning under uncertainty. We will discuss the generic families of models (directed graphs, undirected graphs, and factor graphs) as well as specific representations such as hidden Markov models and conditional random fields. The discussion will cover both the theoretical aspects of representation, learning, and inference, and their applications in fields such as computer vision, natural language processing, computational biology, and medical diagnosis.

Course Topics

For a tentative and partial list of the topics covered in the class, see the Tentative Schedule section below.

Course Information

Time and Location: Mon - Wed 1:50 - 3:05pm in Stuart Building 220
Instructor: Mustafa Bilgic
Office: Stuart 228C
Email: mbilgic AT iit.edu
Office Hours: Wed 11am - 12pm (other times by appointment)

Required background

Knowledge of probability and statistics is required. CS480 and CS584 are recommended but not required.

Course Format and Grading

In this course, the slides often serve only as a guide; I use the whiteboard heavily.

The evaluation will consist of a midterm, a semester-long project, and a final exam. The point breakdown is:

Code of academic honesty: Please read the procedures on academic honesty here. Violations of the academic honesty policy (such as unauthorized collaboration or cheating) can result, depending on the severity of the violation, in: i) a zero on the assignment in question, ii) expulsion from the course, iii) suspension of your enrollment at the university, or iv) expulsion from the university.

Course Project

There are three types of projects:

  1. Implement a project that I assign (Python is required).
  2. Make a contribution to an existing open-source probabilistic graphical models project.
  3. Propose your own project; you have complete freedom. This option is a good fit if you have a research project to which you want to apply PGMs.

Whichever type you choose, every project requires data, coding, experiments, analysis, and a report.
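To give a sense of the kind of coding involved (the network, its probabilities, and the function names below are illustrative, not part of any assignment), here is a minimal Python sketch of a three-node Bayesian network (Rain -> WetGrass <- Sprinkler) with a posterior computed by exhaustive enumeration over the joint distribution:

```python
from itertools import product

# Conditional probability tables (made-up numbers): P(var = True | parents)
P_RAIN = 0.2
P_SPRINKLER = 0.1
P_WET = {  # P(WetGrass = True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.90,
    (False, True): 0.80, (False, False): 0.05,
}

def joint(r, s, w):
    """P(Rain=r, Sprinkler=s, WetGrass=w) via the chain rule of the network."""
    p = (P_RAIN if r else 1 - P_RAIN) * (P_SPRINKLER if s else 1 - P_SPRINKLER)
    pw = P_WET[(r, s)]
    return p * (pw if w else 1 - pw)

def posterior_rain_given_wet():
    """P(Rain=True | WetGrass=True), summing out Sprinkler."""
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
    return num / den

print(round(posterior_rain_given_wet(), 3))  # -> 0.645 for these CPTs
```

Enumeration is exponential in the number of variables; the inference algorithms covered mid-semester (variable elimination, clique trees) exist precisely to avoid this blow-up.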

Course Material

There is a recommended textbook for this course:

Probabilistic Graphical Models, by Daphne Koller and Nir Friedman

There will be additional reading materials (mostly available on the web).

Tentative Schedule

Date    Topic                                            Reading
Jan 13  Syllabus & Introduction                          Ch. 1
Jan 15  Foundations                                      Ch. 2
Jan 20  Martin Luther King Day - No class
Jan 22  Bayesian networks                                Ch. 3
Jan 27  Bayesian networks (cont.)
Jan 29  Markov networks                                  Ch. 4
Feb 03  Markov networks (cont.)
Feb 05  Local probabilistic models                       Ch. 5
Feb 10  Local probabilistic models (cont.)
Feb 12  Template-based representations                   Ch. 6
Feb 17  Variable Elimination                             Ch. 9
Feb 19  Variable Elimination (cont.)
Feb 24  Clique Trees                                     Ch. 10
Feb 26  Clique Trees (cont.)
Mar 03  Approximate Inference                            Ch. 12
Mar 05  Approximate Inference (cont.)
Mar 10  Approximate Inference (cont.)
Mar 12  MAP Inference                                    Ch. 13
Mar 17  MAP Inference (cont.)
Mar 19  MAP Inference (cont.)
Mar 24  Learning overview                                Ch. 16
Mar 26  Parameter estimation                             Ch. 17
Mar 31  Parameter estimation (cont.)
Apr 02  Structure learning in Bayesian networks          Ch. 18
Apr 07  Structure learning in Bayesian networks (cont.)
Apr 09  Learning undirected models                       Ch. 20
Apr 14  Learning undirected models (cont.)
Apr 16  Collective classification
Apr 21  Collective classification (cont.)
Apr 23  Hidden Markov models
Apr 28  Conditional random fields
Apr 30  Conditional random fields (cont.)