
Bayesian networks

Chapter 15, Sections 1-2: Markov Models

Outline
- Probabilistic Inference
- Bayes Rule
- Markov Chains

Probabilistic Inference
- Probabilistic inference: compute a desired probability from other known probabilities (e.g. a conditional from the joint)
- We generally compute conditional probabilities:
  P(on time | no reported accidents) = 0.90
- These represent the agent's beliefs given the evidence
- Probabilities change with new evidence:
  P(on time | no accidents, 5 a.m.) = 0.95
  P(on time | no accidents, 5 a.m., raining) = 0.80
- Observing new evidence causes beliefs to be updated

Bayes Rule
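In the notation of the examples above, Bayes' rule computes a conditional probability from the reverse conditional and the priors:

P(x | y) = P(y | x) P(x) / P(y)

It is useful when the direction we want, such as P(cause | effect), is hard to specify directly but the other direction, P(effect | cause), is easy.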

Terminology

- Marginal distributions are sub-tables which eliminate variables
- A joint distribution over a set of random variables specifies a real number for each outcome (i.e. each assignment)
- Conditional distributions are probability distributions over some variables given fixed values of others

Inference by enumeration
P(sun)?

Inference by enumeration
P(sun | winter)?
P(sun | winter, hot)?

Inference by enumeration
Obvious problems:
- Worst-case time complexity O(d^n)
- Space complexity O(d^n) to store the joint distribution
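As a concrete illustration of inference by enumeration, here is a small Python sketch. The variables, outcomes, and joint probabilities are made up for this example, not taken from the lecture's tables.

# Toy joint distribution over three variables (illustrative numbers only).
joint = {
    ('sun', 'winter', 'hot'):  0.10,
    ('sun', 'winter', 'cold'): 0.20,
    ('sun', 'summer', 'hot'):  0.25,
    ('sun', 'summer', 'cold'): 0.05,
    ('rain', 'winter', 'hot'):  0.05,
    ('rain', 'winter', 'cold'): 0.20,
    ('rain', 'summer', 'hot'):  0.10,
    ('rain', 'summer', 'cold'): 0.05,
}
variables = ('weather', 'season', 'temp')

def enumerate_query(query_var, evidence, joint, variables):
    # Sum the joint entries consistent with the evidence, grouped by the
    # query variable's value, then normalize.
    totals = {}
    for outcome, p in joint.items():
        assignment = dict(zip(variables, outcome))
        if all(assignment[v] == val for v, val in evidence.items()):
            totals[assignment[query_var]] = totals.get(assignment[query_var], 0.0) + p
    z = sum(totals.values())
    return {val: p / z for val, p in totals.items()}

print(enumerate_query('weather', {}, joint, variables))                    # P(weather)
print(enumerate_query('weather', {'season': 'winter'}, joint, variables))  # P(weather | winter)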

The product rule
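The product rule writes a joint distribution as a conditional times a marginal:

P(x, y) = P(x | y) P(y) = P(y | x) P(x)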

The chain rule

Bayes Rule
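The chain rule factors any joint distribution into a product of conditionals:

P(x_1, x_2, ..., x_n) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2) ... P(x_n | x_1, ..., x_{n-1})

Bayes' rule, as stated earlier, then follows by equating the two product-rule factorizations of P(x, y) and dividing by P(y).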

Inference with Bayes Rule

Reasoning over Time or Space
- Often, we want to reason about a sequence of observations:
  - Speech recognition
  - Robot localization
  - User attention
  - Medical monitoring
- Need to introduce time (or space) into our models
- Markov Models (Markov Chains)

Markov Models (Markov Chains)

Conditional independence
- Basic conditional independence: past and future are independent given the present
- Each time step only depends on the previous
- This is called the (first order) Markov property
- Note that the chain is just a (growable) BN: we can always use generic BN reasoning on it if we truncate the chain at a fixed length
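Written out, the (first order) Markov property says the next state depends only on the current state, not on the earlier history:

P(X_{t+1} | X_1, ..., X_t) = P(X_{t+1} | X_t)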

Example: Markov Chain
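A minimal sketch of such a chain in Python. The two weather states follow the usual sun/rain example, but the transition probabilities below are illustrative placeholders rather than values taken from the slide.

# Illustrative two-state weather chain (placeholder numbers, not the slide's).
states = ('sun', 'rain')

P_init = {'sun': 1.0, 'rain': 0.0}      # P(X_1)

P_trans = {                             # P(X_t | X_{t-1}): outer key = previous state
    'sun':  {'sun': 0.9, 'rain': 0.1},
    'rain': {'sun': 0.3, 'rain': 0.7},
}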

Markov Chain Inference

Joint distribution of a Markov Model
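Combining the chain rule with the Markov property, the joint distribution of a Markov model factors into an initial distribution and one transition term per step:

P(X_1, X_2, ..., X_T) = P(X_1) P(X_2 | X_1) P(X_3 | X_2) ... P(X_T | X_{T-1})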

Markov Models Recap

Mini-Forward Algorithm
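A sketch of the mini-forward update in Python, reusing the illustrative P_init and P_trans defined above. Each pass pushes the belief over X_t one step forward.

def mini_forward(P_init, P_trans, steps):
    # Repeatedly apply P(x_{t+1}) = sum_x P(x_{t+1} | x) P(X_t = x).
    belief = dict(P_init)
    for _ in range(steps):
        belief = {x2: sum(P_trans[x1][x2] * belief[x1] for x1 in belief)
                  for x2 in belief}
    return belief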

Example Run of Mini-Forward Algorithm
From initial observations of sun:

From initial observations of rain:

Example Run of Mini-Forward Algorithm
From yet another initial distribution P(X_1):
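The behaviour these example-run slides plot can be reproduced with the sketch above: whichever P(X_1) we start from, repeated forward updates drift toward the same distribution (the numbers below come from the placeholder transition model, not the slide's charts).

print(mini_forward({'sun': 1.0, 'rain': 0.0}, P_trans, 50))   # started from sun
print(mini_forward({'sun': 0.0, 'rain': 1.0}, P_trans, 50))   # started from rain
print(mini_forward({'sun': 0.5, 'rain': 0.5}, P_trans, 50))   # another P(X_1)
# All three converge to (approximately) the same distribution.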

Stationary Distributions

Example: Stationary Distributions
Question: What's P(X) at time t = infinity?
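The stationary distribution P_inf is the fixed point of the forward update, i.e. a distribution that the transition model maps back onto itself:

P_inf(x) = sum_{x'} P(x | x') P_inf(x')

For the placeholder weather model used above, solving this together with P_inf(sun) + P_inf(rain) = 1 gives P_inf(sun) = 0.75 and P_inf(rain) = 0.25, which is what the repeated forward updates converge to.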

Application of Stationary Distribution: Web Link Analysis
- PageRank over a web graph
  - Each web page is a state
  - Initial distribution: uniform over pages
  - Transitions:
    - With prob. c, uniform jump to a random page (dotted lines, not all shown)
    - With prob. 1-c, follow a random outlink (solid lines)
- Stationary distribution
  - Will spend more time on highly reachable pages
  - E.g. many ways to get to the Acrobat Reader download page
  - Somewhat robust to link spam
  - Google 1.0 returned the set of pages containing all your keywords in decreasing rank; now all search engines use link analysis along with many other factors (rank actually getting less important over time)
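A toy sketch of PageRank as exactly this stationary distribution, computed by repeated forward updates. The four-page link graph and the jump probability c = 0.15 are made-up illustration values, and the sketch assumes every page has at least one outlink.

# Random-surfer chain: with prob. c jump uniformly, else follow a random outlink.
links = {            # hypothetical page -> list of pages it links to
    'A': ['B', 'C'],
    'B': ['C'],
    'C': ['A'],
    'D': ['C'],
}
pages = list(links)
c = 0.15             # probability of a uniform random jump

rank = {p: 1.0 / len(pages) for p in pages}
for _ in range(100):
    new_rank = {p: c / len(pages) for p in pages}              # random-jump mass
    for p, outlinks in links.items():
        for q in outlinks:
            new_rank[q] += (1 - c) * rank[p] / len(outlinks)   # follow an outlink
    rank = new_rank

print(rank)   # the most reachable page (here 'C') collects the most mass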

References
CSE473: Introduction to Artificial Intelligence, http://courses.cs.washington.edu/courses/cse473/
