

The Content

In the information age, we are surrounded by data, much of which contains an element of randomness, or "noise". Taking advantage of this abundance of information requires the ability to make principled inferences and predictions under noisy conditions.

A wide variety of inference and prediction problems can be understood, in one way or another, as applications of conditional probability. In the first part of the course, we will develop the mathematical tools we need before we can study probability, and especially conditional probability, rigorously. These include proof by induction, combinatorics, and set algebra. In the middle section, we develop some of the theory of probability and discrete random variables, with an emphasis on conditional probability and Bayes' rule. In the final segment, we look at some of the ways conditional probability can be applied to study random environments. Topics in this section include basic information theory, discrete-state Markov chains (a kind of random process), and basic elements of Bayesian statistics and decision theory.
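As a small taste of the kind of reasoning the middle section emphasizes, here is a minimal numeric sketch of Bayes' rule for a binary hypothesis. The diagnostic-test numbers (base rate, sensitivity, false-positive rate) are hypothetical, chosen only for illustration:

```python
def bayes_posterior(prior, likelihood, likelihood_given_not):
    """P(H | E) via Bayes' rule for a binary hypothesis H and evidence E.

    prior:                P(H)
    likelihood:           P(E | H)
    likelihood_given_not: P(E | not H)
    """
    # Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical diagnostic test: 1% base rate, 95% sensitivity,
# 10% false-positive rate.
posterior = bayes_posterior(prior=0.01, likelihood=0.95,
                            likelihood_given_not=0.10)
print(round(posterior, 3))  # prints 0.088
```

Even with a fairly accurate test, the low base rate keeps the posterior probability under 10%, a pattern we will revisit repeatedly when we study conditional probability formally.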

The Where and When

Lecture Place: Social Sciences 312

Lecture Times: M and W, 10:30-11:45 A.M.

Discussion Place: Harvill 301

Discussion Times: F 11-11:50 OR 12-12:50

The Instructor

Name: Colin Reimer Dawson, Ph.D.

Email: cdawson@email.arizona.edu

Office: Gould-Simpson 850

Office Hours: M 2-3, W 1-2, and/or by appointment