CS583: Probabilistic Graphical Models - Spring 2013


Course Description

This course will cover probabilistic graphical models -- powerful and interpretable models for reasoning under uncertainty. Generic families of models such as directed, undirected, and factor graphs, as well as specific representations such as hidden Markov models and conditional random fields, will be discussed. The discussion will cover the theoretical aspects of representation, learning, and inference, as well as applications in fields such as computer vision, natural language processing, computational biology, and medical diagnosis.

Course Topics

The following is a tentative and partial list of topics that will be covered in the class: Bayesian networks, Markov networks, local probabilistic models, template-based representations, exact inference (variable elimination and clique trees), approximate inference, MAP inference, parameter estimation, structure learning, learning undirected models, collective classification, hidden Markov models, and conditional random fields.

Course Information

Time and Location: Mon & Wed 1:50 - 3:05pm in Stuart Building 220
Professor:
Mustafa Bilgic
Office: Stuart 228C
Email: mbilgic AT iit.edu
Office Hours: Wed 11am - 12pm (Other times by appointment)

Required background

Knowledge of probability and statistics is required. CS480 and CS584 are recommended but not required.

Course Format and Grading

In my course, the slides often serve only as a guide; I use the whiteboard heavily.

The evaluation will consist of a midterm exam, a semester-long project, and a final exam. The project will be on a topic of your choice (ideally a research project) and can be implemented in any programming language you like. The point breakdown is:

Code of academic honesty: Please read the procedures on academic honesty here. Violations of academic honesty (such as unauthorized collaboration, cheating, etc.) can, depending on their severity, result in i) a zero on the assignment in question, ii) expulsion from the course, iii) suspension of your enrollment at the university, or iv) expulsion from the university.

Course Material

There is a required textbook for this course:

Probabilistic Graphical Models, by Daphne Koller and Nir Friedman

There will be additional reading materials (mostly available on the web).

Tentative Schedule

Date(s)          Topic                                      Reading
Jan 14           Syllabus & Introduction                    Ch. 1
Jan 16           Foundations                                Ch. 2
Jan 21           Martin Luther King Day - No class
Jan 23, Jan 28   Bayesian networks                          Ch. 3
Jan 30, Feb 04   Markov networks                            Ch. 4
Feb 06, Feb 11   Local probabilistic models                 Ch. 5
Feb 13           Template-based representations             Ch. 6
Feb 18, Feb 20   Variable Elimination                       Ch. 9
Feb 25, Feb 27   Clique Trees                               Ch. 10
Mar 04, Mar 06   Approximate Inference                      Ch. 12
Mar 11           MIDTERM EXAM
Mar 13           MAP Inference                              Ch. 13
Mar 18, Mar 20   SPRING BREAK - No class
Mar 25           Learning - overview                        Ch. 16
Mar 27, Apr 01   Parameter estimation                       Ch. 17
Apr 03, Apr 08   Structure learning in Bayesian networks    Ch. 18
Apr 10, Apr 15   Learning undirected models                 Ch. 20
Apr 17, Apr 22   Collective classification
Apr 24           Hidden Markov models
Apr 29           Conditional random fields
May 01           REVIEW