In this course, we will cover probabilistic graphical models: powerful and interpretable models for reasoning under uncertainty. We will survey a family of models, including Bayesian networks, Markov networks, and factor graphs. The discussions will include both the theoretical aspects of representation, learning, and inference in graphical models, and their applications to many interesting fields such as computer vision, natural language processing, computational biology, medical diagnosis, and more. Time permitting, we will also discuss specific graphical model representations such as Hidden Markov Models, Probabilistic Relational Models, Conditional Random Fields, etc.
Time and Location: Tue - Thu 1:50 - 3:05pm in Stuart 225
Professor: Mustafa Bilgic
Office: Stuart 228C
Email: mbilgic AT iit.edu
Office Hours: Tue 3:30-4:30pm (other times by appointment)
The class will consist of a mixture of presentations and class discussions. As such, students are expected to read the required materials and participate in the discussions. Students are expected to submit a short write-up on each reading by 11:59pm the day before the lecture. The purpose of this class is to learn about graphical models and apply them to research/application problems of your choice. You will be expected to carry out a semester-long project. The course will be fun, a worthwhile learning experience, and hopefully useful for your research problems.
There is a required (see important note below) textbook for this course:
Probabilistic Graphical Models, by Daphne Koller and Nir Friedman
There will be additional supplemental materials (mostly available on the web).
Important Note: If the cost of the textbook is preventing you from registering for the course, please contact me. A copy of the book is also expected to be available on reserve in the library.
There is a course mailing list (pgm-f10@mailer). Announcements will be made to this list. Join using this link.
| Date | Topic | Notes and Readings |
| --- | --- | --- |
| 8/24 | Course logistics | Syllabus; Assignment 1 due 8/25, 11:59pm |
| 8/26 | Introduction | Chapter 1 |
| 8/31 | Background Material | Chapter 2; Assignment 2 due 9/1, 6:00pm |
| 9/2, 9/7, 9/9 | Representation - Bayesian networks | Chapter 3; Assignment 3 due 9/8, 6:00pm; Hugin file for the student example (Figure 3.4) |
| 9/14 | Representation - CPDs | Chapter 5 |
| 9/16, 9/21 | Representation - Markov networks | Chapter 4; Assignment 4 due 9/15, 6:00pm |
| 9/23 | Application showdown | Assignment 5 due 9/22, 6:00pm |
| 9/28, 9/30 | Inference - Variable elimination | Sections 9.1, 9.2, and 9.3; Assignment 6 due 9/29, 6:00pm |
| 10/5, 10/7, 10/12 | Inference - Message passing | Assignment 7, due in class on 10/5 |
| 10/14 | Inference - MAP inference | |
| 10/19 | Inference - Sampling | Sections 12.1, 12.2, 12.3; Project proposal due in class, 10/19 |
| 10/21 | Inference - Algorithm evaluation | |
| 10/26 | Proposal comments; my research area | Project proposal comments due |
| 10/28 | Learning - Overview | Chapter 16 |
| 11/02 | Learning - Bayesian networks | |
| 11/09 | Learning - Markov networks | |
| 11/16 | Template-based representations | |
| 11/23 | Template-based representations | |
| 11/30 | Project presentations | |