
  • 1

    CS 4700: Foundations of Artificial Intelligence

    Bart Selman selman@cs.cornell.edu

    Module: Knowledge, Reasoning, and Planning

    Part 1

    Logical Agents R&N: Chapter 7

  • 2

    A Model-Based Agent

  • 3

    Knowledge and Reasoning

    Knowledge and Reasoning: humans are very good at acquiring new information by combining raw knowledge and experience with reasoning. AI slogan: "Knowledge is power" (or "Data is power"?)

    Examples:
    Medical diagnosis --- a physician diagnosing a patient infers what disease the patient has, based on the knowledge he/she acquired as a student, from textbooks, and from prior cases.
    Common-sense knowledge / reasoning --- common everyday assumptions / inferences, e.g., "lecture starts at four": infer pm, not am; when traveling, I assume there is some way to get from the airport to the hotel.

  • 4

    Logical agents: agents with some representation of the complex knowledge about the world / its environment, which use inference to derive new information from that knowledge combined with new inputs (e.g., via perception).

    Key issues:
    1. Representation of knowledge: What form? Meaning / semantics?
    2. Reasoning and inference processes: Efficiency.

  • 5

    Knowledge-based Agents

    Key issues: Representation of knowledge → knowledge base; Reasoning processes → inference/reasoning.

    Knowledge base = set of sentences in a formal language(*) representing facts about the world.

    (*) called a Knowledge Representation (KR) language

  • 6

    Knowledge bases

    Key aspects:
    How to add sentences to the knowledge base.
    How to query the knowledge base.
    Both tasks may involve inference, i.e., how to derive new sentences from old sentences.
    For logical agents, inference must obey the fundamental requirement that when one asks a question of the knowledge base, the answer should follow from what has been told to the knowledge base previously. (In other words, the inference process should not make things up.)

  • 7

    A simple knowledge-based agent

    The agent must be able to:
    Represent states, actions, etc.
    Incorporate new percepts.
    Update internal representations of the world.
    Deduce hidden properties of the world.
    Deduce appropriate actions.
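    A minimal Python sketch of such an agent's tell/ask loop, in the spirit of the generic knowledge-based agent of R&N Ch. 7 (the class and helper names here are illustrative placeholders, and ask() stubs out the inference step):

        # Sketch of a generic knowledge-based agent (illustrative only).
        class KBAgent:
            def __init__(self, background_knowledge):
                self.kb = list(background_knowledge)  # sentences in a KR language
                self.t = 0                            # time step counter

            def tell(self, sentence):
                self.kb.append(sentence)              # add a sentence to the KB

            def ask(self, query):
                # Inference goes here: any answer must follow from the KB.
                return None                           # stub: no inference yet

            def step(self, percept):
                self.tell(("percept", percept, self.t))     # incorporate new percept
                action = self.ask(("best-action", self.t))  # deduce appropriate action
                self.tell(("action", action, self.t))       # record the chosen action
                self.t += 1
                return action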

  • 8

    KR language candidate: a logical language (propositional / first-order) combined with a logical inference mechanism.

    How close to human thought? (mental models / Johnson-Laird). What is the language of thought? Why not use natural language (e.g., English)?

    We want clear syntax & semantics (well-defined meaning) and a mechanism to infer new information. Solution: use a formal language.

    Greeks / Boole / Frege --- Rational thought: Logic?

  • 9

    Consider: to-the-right-of(x,y)

  • 10

  • 11

  • 12

    Procedural style:

    Knowledge-based alternative:

    Modular: change the KB without changing the program.
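    A toy contrast, as a hedged sketch (my illustration, not the slides' code): the rule "squares adjacent to a pit are breezy" hard-wired into a procedure, versus stated once as a KB rule that a generic inference routine interprets.

        # Procedural style: the rule is baked into program logic.
        def breezy_procedural(square, pits):
            x, y = square
            return any(n in pits for n in [(x-1, y), (x+1, y), (x, y-1), (x, y+1)])

        # Knowledge-based alternative: facts and rules live in the KB;
        # infer() is generic and knows nothing about pits or breezes.
        def neighbors(x, y):
            return [(x-1, y), (x+1, y), (x, y-1), (x, y+1)]

        facts = {("pit", (3, 1))}
        rules = [lambda fs: {("breezy", n) for (kind, sq) in fs
                             if kind == "pit" for n in neighbors(*sq)}]

        def infer(facts, rules):
            derived = set(facts)
            for rule in rules:           # apply each KB rule once
                derived |= rule(derived)
            return derived

        print(("breezy", (2, 1)) in infer(facts, rules))   # True

    Changing the rules of the world now means editing the KB (the facts and rules), not the program.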

  • 13

  • 14

  • 15

    The symbol grounding problem.

  • 16

  • 17

    True!

  • 18

    Compositional semantics

  • 19

    I.e.: Models(KB) ⊆ Models(α)

    Note: KB defines exactly the set of worlds we are interested in.

  • 20

  • 21

  • 22

    Note: (1) This was Aristotle's original goal --- construct infallible arguments based purely on the form of statements, not on the meaning of individual propositions. (2) Sets of models can be of exponential size or worse, compared to symbolic inference (deduction).

  • 23

  • 24

  • 25

  • 26

    Illustrative example: Wumpus World

    Performance measure: gold +1000; death -1000 (falling into a pit or being eaten by the wumpus); -1 per step; -10 for using the arrow.

    Environment:
    Rooms / squares connected by doors.
    Squares adjacent to the wumpus are smelly.
    Squares adjacent to a pit are breezy.
    Glitter iff gold is in the same square.
    Shooting kills the wumpus if you are facing it.
    Shooting uses up the only arrow.
    Grabbing picks up the gold if in the same square.
    Releasing drops the gold in the same square.
    Randomly generated at the start of the game. The wumpus only senses the current room.

    Sensors: Stench, Breeze, Glitter, Bump, Scream [perceptual inputs]
    Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot

    (Somewhat whimsical!)
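    For concreteness, one plausible way to represent the Boolean percept vector in code (my representation, not the course's):

        from collections import namedtuple

        # Percept vector, in the order [Stench, Breeze, Glitter, Bump, Scream].
        Percept = namedtuple("Percept", ["stench", "breeze", "glitter", "bump", "scream"])

        p0 = Percept(False, False, False, False, False)  # nothing sensed in [1,1]
        p1 = Percept(False, True, False, False, False)   # breeze sensed in [2,1]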

  • 27

    Wumpus world characterization

    Fully observable? No --- only local perception.
    Deterministic? Yes --- outcomes are exactly specified.
    Static? Yes --- the wumpus and pits do not move.
    Discrete? Yes.
    Single-agent? Yes --- the wumpus is essentially a natural feature.

  • 28

    Exploring a wumpus world

    Boolean percept feature values [Stench, Breeze, Glitter, Bump, Scream]:
    [None, None, None, None, None]

    The knowledge base of the agent consists of the rules of the Wumpus world plus the percept "nothing" in [1,1].

  • 29

    Percepts at T = 0 [Stench, Breeze, Glitter, Bump, Scream]: [None, None, None, None, None]

    The KB of the agent consists of the rules of the Wumpus world plus the percept "nothing" in [1,1]. By inference, the agent's knowledge base also has the information that [2,1] and [1,2] are okay. Added as propositions.

    World known to agent at time T = 0.

  • 30

    Further exploration

    Percepts [Stench, Breeze, Glitter, Bump, Scream]:
    At T = 0, in [1,1]: [None, None, None, None, None]
    At T = 1, in [2,1]: [None, Breeze, None, None, None]

    @ T = 1: What follows? Pit in (2,2) or Pit in (3,1).

    [Grid diagrams for T = 0 and T = 1; legend: A = agent, V = visited, B = breeze, P? = possible pit]

    Where next?

  • 31

    Where is the Wumpus?

    Percepts at T = 3, in [1,2] [Stench, Breeze, Glitter, Bump, Scream]: [Stench, None, None, None, None]

    The wumpus cannot be in (1,1) or in (2,2). (Why? The agent started safely in [1,1], and there was no stench in [2,1].)
    Stench in (1,2) ⇒ wumpus in (1,1), (2,2), or (1,3) ⇒ wumpus is in (1,3).
    No breeze in (1,2) ⇒ no pit in (2,2); but we know there is a pit in (2,2) or (3,1) ⇒ pit in (3,1).

    [Grid diagram of the 4x4 board at T = 3: S = stench, P = pit, W = wumpus, P? = possible pit]

  • 32

    [Grid diagram: the world as deduced at T = 3, showing the pits (P) and the wumpus (W)]

    We reasoned about the possible states the Wumpus world can be in, given our percepts and our knowledge of the rules of the Wumpus world. I.e., the content of KB at T=3.

    Essence of logical reasoning: Given all we know, Pit_in_(3,1) holds.

    (The world cannot be different.)

    What follows is what holds true in all those worlds that satisfy what is known at time T = 3 about the particular Wumpus world we are in.

    Models(KB) ⊆ Models(P_in_(3,1))

    Example property: P_in_(3,1)

  • 33

    Formally: Entailment

    Situation after detecting nothing in [1,1], moving right, breeze in [2,1]. I.e. T=1.

    Consider the possible models for KB with respect to the cells (1,2), (2,2), and (3,1), i.e., with respect to the existence or non-existence of pits in those cells.

    3 Boolean choices ⇒ 8 possible interpretations
    (enumerate all the models, or possible worlds, with respect to pit locations)

    Knowledge Base (KB) in the Wumpus World = rules of the wumpus world + new percepts

    T = 1
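    A quick sketch (my illustration) that enumerates these 8 interpretations and keeps the ones consistent with the KB at T = 1 (no pit in (1,2), since [1,1] was breeze-free; a pit in (2,2) or (3,1), to explain the breeze in [2,1]):

        from itertools import product

        cells = [(1, 2), (2, 2), (3, 1)]
        models = [dict(zip(cells, bits))
                  for bits in product([False, True], repeat=3)]
        print(len(models))                            # 8 possible interpretations

        def satisfies_kb(m):
            # No breeze in [1,1] rules out a pit in (1,2);
            # breeze in [2,1] requires a pit in (2,2) or (3,1).
            return not m[(1, 2)] and (m[(2, 2)] or m[(3, 1)])

        print(sum(satisfies_kb(m) for m in models))   # 3 worlds remain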

  • 34

    KB = Wumpus-world rules + observations (T=1)

    Is KB consistent with all 8 possible worlds? No: some worlds violate KB (they are inconsistent with what we know).

    Q: Why does a given world violate KB?

  • 35

    Entailment in Wumpus World

    KB = Wumpus-world rules + observations
    α1 = "[1,2] has no pit"; KB ⊨ α1

    In every model in which KB is true, α1 is true (proved by model checking).

    Models of the KB and α1.

    So, KB defines all worlds that we hold possible. Queries: we want to know the properties of those worlds. That's how the semantics of logical entailment is defined.

    Note: α1 holds in more models than KB. That's OK, but we don't care about those worlds.

  • 36

    Wumpus models: KB = wumpus-world rules + observations.
    α2 = "[2,2] has no pit"; this is only true in some of the models in which KB is true, therefore KB ⊭ α2.

    Model Checking

    Models of α2.

    There is a model of KB where α2 does NOT hold!

  • 37

    Entailment via Model Checking

    Inference by model checking: we enumerate all the KB models and check whether α1 and α2 are true in all of them (which means we can only use this method when there is a finite number of models). I.e., using the semantics directly.

    Models(KB) ⊆ Models(α)
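    As a concrete sketch of this procedure (restricted, for illustration, to the three pit propositions above):

        from itertools import product

        cells = [(1, 2), (2, 2), (3, 1)]

        def kb(m):   # KB at T = 1, over these three propositions
            return not m[(1, 2)] and (m[(2, 2)] or m[(3, 1)])

        def entails(kb, alpha):
            # KB ⊨ alpha iff alpha holds in every model of KB,
            # i.e., Models(KB) ⊆ Models(alpha).
            models = (dict(zip(cells, bits))
                      for bits in product([False, True], repeat=3))
            return all(alpha(m) for m in models if kb(m))

        alpha1 = lambda m: not m[(1, 2)]   # "[1,2] has no pit"
        alpha2 = lambda m: not m[(2, 2)]   # "[2,2] has no pit"
        print(entails(kb, alpha1))         # True:  KB ⊨ alpha1
        print(entails(kb, alpha2))         # False: KB does not entail alpha2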

  • 38

    Example redux: More formal

    Percepts [Stench, Breeze, Glitter, Bump, Scream]:
    In [1,1]: [None, None, None, None, None]
    In [2,1]: [None, Breeze, None, None, None]

    [Grid diagram; legend: A = agent, V = visited, B = breeze, P? = possible pit]

    How do we actually encode background knowledge and percepts in formal language?

  • 39

    Wumpus World KB

    Define propositions:
    Let Pi,j be true if there is a pit in [i, j].
    Let Bi,j be true if there is a breeze in [i, j].

    Sentence 1 (R1): ¬P1,1 [Given.]
    Sentence 2 (R2): ¬B1,1 [Observation T = 0.]
    Sentence 3 (R3): B2,1 [Observation T = 1.]

    "Pits cause breezes in adjacent squares":
    Sentence 4 (R4): B1,1 ⇔ (P1,2 ∨ P2,1)
    Sentence 5 (R5): B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
    etc.

    Notes: (1) one such statement about breeze for each square; (2) similar statements about the wumpus and stench, and gold and glitter. (Need more propositional letters.)
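    A brute-force encoding of R1-R5 (my sketch; the proposition names are ad hoc), checking by model enumeration that the KB entails ¬P1,2:

        from itertools import product

        props = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

        def satisfies(m):
            iff = lambda a, b: a == b
            return (not m["P11"]                                         # R1
                    and not m["B11"]                                     # R2
                    and m["B21"]                                         # R3
                    and iff(m["B11"], m["P12"] or m["P21"])              # R4
                    and iff(m["B21"], m["P11"] or m["P22"] or m["P31"])) # R5

        models = [dict(zip(props, bits))
                  for bits in product([False, True], repeat=len(props))]
        kb_models = [m for m in models if satisfies(m)]
        print(all(not m["P12"] for m in kb_models))   # True: KB ⊨ ¬P1,2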

  • 40

    What about Time? What about
