CS 430 / INFO 430 Information Retrieval
Lecture 8: Query Refinement and Relevance Feedback

TRANSCRIPT

Page 1

CS 430 / INFO 430 Information Retrieval

Lecture 8

Query Refinement and Relevance Feedback

Page 2

Course Administration

Assignment Reports

A sample report will be posted before the next assignment is due.

Preparation for Discussion Classes

Most of the readings were used last year. You can see the questions that were used on last year's Web site:

http://www.cs.cornell.edu/Courses/cs430/2004fa/

Page 3

CS 430 / INFO 430 Information Retrieval

Completion of Lecture 7

Page 4

Search for Substring

In some information retrieval applications, any substring can be a search term.

Tries, using suffix trees, provide lexicographical indexes for all the substrings in a document or set of documents.

Page 5

Tries: Search for Substring

Basic concept

The text is divided into unique semi-infinite strings, or sistrings. Each sistring has a starting position in the text, and continues to the right until it is unique.

The sistrings are stored in (the leaves of) a tree, the suffix tree. Common parts are stored only once.

Each sistring can be associated with a location within a document where the sistring occurs. Subtrees below a certain node represent all occurrences of the substring represented by that node.

Suffix trees have a size of the same order of magnitude as the input documents.

Page 6

Tries: Suffix Tree

Example: suffix tree for the following words:

begin beginning between bread break

b
├─ e
│   ├─ gin
│   │    ├─ (null)    begin
│   │    └─ ning      beginning
│   └─ tween          between
└─ rea
    ├─ d              bread
    └─ k              break
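As a rough illustration (not from the lecture), the same structure can be sketched with nested dictionaries in Python. The slide's tree compresses chains of single-child nodes into edge labels such as "gin" and "rea"; this sketch keeps one node per character for simplicity:

```python
# Minimal prefix trie, a sketch of the structure drawn above.
# Each node is a dict from character to child node; the special
# key "$" marks the end of a complete word (the "null" branch).

def trie_insert(root, word):
    node = root
    for ch in word:
        node = node.setdefault(ch, {})
    node["$"] = True  # end-of-word marker

def trie_words_with_prefix(root, prefix):
    """Return all stored words that start with the given prefix."""
    node = root
    for ch in prefix:
        if ch not in node:
            return []
        node = node[ch]
    # Walk the subtree below this node: every path to a "$" is a word.
    results, stack = [], [(node, prefix)]
    while stack:
        node, word = stack.pop()
        for key, child in node.items():
            if key == "$":
                results.append(word)
            else:
                stack.append((child, word + key))
    return results

root = {}
for w in ["begin", "beginning", "between", "bread", "break"]:
    trie_insert(root, w)

print(trie_words_with_prefix(root, "be"))
# ['begin', 'beginning', 'between'] (order may vary)
```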

Page 7

Tries: Sistrings

A binary example

String: 01 100 100 010 111

Sistrings:

1  01 100 100 010 111
2  11 001 000 101 11
3  10 010 001 011 1
4  00 100 010 111
5  01 000 101 11
6  10 001 011 1
7  00 010 111
8  00 101 11
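As a hedged sketch (not part of the slides): the sistrings are simply the suffixes of the text, one per starting position, and sorting them lexicographically reproduces the ordering on the next slide. The example string and 1-based positions follow the slide; everything else is illustrative:

```python
# Generate the eight sistrings (suffixes) of the example string and
# sort them lexicographically, as on the "Lexical Ordering" slide.

text = "01100100010111"  # the example string with spaces removed

# (position, suffix) pairs; positions are 1-based to match the slide
sistrings = [(i + 1, text[i:]) for i in range(8)]

for pos, s in sorted(sistrings, key=lambda p: p[1]):
    print(pos, s)
# Prints positions in the order 7, 4, 8, 5, 1, 6, 3, 2
```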

Page 8

Tries: Lexical Ordering

7  00 010 111
4  00 100 010 111
8  00 101 11
5  01 000 101 11
1  01 100 100 010 111
6  10 001 011 1
3  10 010 001 011 1
2  11 001 000 101 11

The unique prefix of each sistring is shown in blue on the original slide.

Page 9

Trie: Basic Concept

[Figure: binary trie built from the eight sistrings. Edges are labeled 0 or 1; leaves are labeled with the sistring numbers 1-8.]

Page 10

Patricia Tree

[Figure: Patricia tree for the same eight sistrings. Edges carry bit strings (e.g. 00, 110); internal nodes are labeled with the number of the bit to test (2, 3, 4, 5); leaves are labeled with sistring numbers.]

Single-descendant nodes are eliminated.

Each internal node is labeled with the number of the bit to test.

Page 11

CS 430 / INFO 430 Information Retrieval

Lecture 8

Query Refinement and Relevance Feedback

Page 12

Query Refinement

[Figure: flowchart of the query refinement loop. A new query passes through query formulation to search; the retrieved information is displayed; the user then either exits or reformulates the query, and the reformulated query is searched again.]

Page 13

Reformulation of Query

Manual

• Add or remove search terms

• Change Boolean operators

• Change wild cards

Automatic

• Remove search terms

• Change weighting of search terms

• Add new search terms

Page 14

Manual Reformulation: Vocabulary Tools

Feedback

• Information about stop lists, stemming, etc.

• Numbers of hits on each term or phrase

Suggestions

• Thesaurus

• Browse lists of terms in the inverted index

• Controlled vocabulary

Page 15

Manual Reformulation: Document Tools

Feedback to user consists of document excerpts or surrogates

• Shows the user how the system has interpreted the query

Effective at suggesting how to restrict a search

• Shows examples of false hits

Less good at suggesting how to expand a search

• No examples of missed items

Page 16

Relevance Feedback: Document Vectors as Points on a Surface

• Normalize all document vectors to be of length 1

• Then the ends of the vectors all lie on a surface with unit radius

• For similar documents, we can represent parts of this surface as a flat region

• Similar documents are represented as points that are close together on this surface
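A minimal sketch of this normalization (illustrative, not from the slides; assumes numpy): once each document vector has length 1, cosine similarity is just a dot product, so similar documents become nearby points on the unit surface.

```python
import numpy as np

# Two toy term-frequency vectors (rows = documents, columns = terms).
docs = np.array([[3.0, 1.0, 0.0],
                 [6.0, 2.0, 0.0],
                 [0.0, 1.0, 4.0]])

# Normalize each document vector to length 1.
unit_docs = docs / np.linalg.norm(docs, axis=1, keepdims=True)

# Cosine similarity is now just the dot product.
print(unit_docs @ unit_docs.T)
# Documents 0 and 1 point in the same direction: similarity 1.0
```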

Page 17

Results of a Search

[Figure: the hits from a search shown as a cluster of points on the surface; each x is a document found by the search query.]

Page 18

Relevance Feedback (Concept)

[Figure: hits from the original search. x = documents identified by the user as non-relevant; o = documents identified by the user as relevant. The reformulated query moves away from the original query, toward the relevant documents.]

Page 19

Theoretically Best Query

[Figure: the full collection, with o = relevant documents and x = non-relevant documents. The optimal query sits in the middle of the cluster of relevant documents, as far as possible from the non-relevant ones.]

Page 20

Theoretically Best Query

For a specific query, q, let:

D_R be the set of all relevant documents

D_NR be the set of all non-relevant documents

sim(q, D_R) be the mean similarity between query q and documents in D_R

sim(q, D_NR) be the mean similarity between query q and documents in D_NR

The theoretically best query would maximize:

F = sim(q, D_R) - sim(q, D_NR)
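A small sketch of this objective (illustrative names; cosine similarity against unit-normalized document vectors is an assumption, not stated on the slide):

```python
import numpy as np

def mean_similarity(q, docs):
    """Mean cosine similarity between query vector q and a set of
    unit-normalized document vectors (one per row)."""
    q = q / np.linalg.norm(q)
    return float(np.mean(docs @ q))

def objective_F(q, relevant, non_relevant):
    # F = sim(q, D_R) - sim(q, D_NR)
    return mean_similarity(q, relevant) - mean_similarity(q, non_relevant)
```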

Page 21

Estimating the Best Query

In practice, DR and DN-R are not known. (The objective is to find them.)

However, the results of an initial query can be used to estimate sim (q, DR) and sim (q, DN-R).

Page 22

Rocchio's Modified Query

Modified query vector

= Original query vector

+ Mean of relevant documents found by original query

- Mean of non-relevant documents found by original query

Page 23

Query Modification

q1 = q0 + (1/n1) Σ_{i=1..n1} ri - (1/n2) Σ_{i=1..n2} si

q0 = vector for the initial query
q1 = vector for the modified query
ri = vector for relevant document i
si = vector for non-relevant document i
n1 = number of relevant documents
n2 = number of non-relevant documents

Rocchio 1971
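A compact sketch of this update (illustrative, assuming numpy; the α, β, γ weights anticipate the "Adjusting Parameters" slide below, and α = β = γ = 1 gives exactly the formula above):

```python
import numpy as np

def rocchio(q0, relevant, non_relevant, alpha=1.0, beta=1.0, gamma=1.0):
    """Rocchio query modification: move the query toward the mean of
    the relevant documents and away from the mean of the non-relevant
    ones. Documents are numpy arrays, one vector per row."""
    q1 = alpha * q0
    if len(relevant) > 0:
        q1 = q1 + beta * relevant.mean(axis=0)
    if len(non_relevant) > 0:
        q1 = q1 - gamma * non_relevant.mean(axis=0)
    return q1
```

In practice the modified query may pick up negative term weights; a common refinement (not shown on the slide) is to clip these to zero before rerunning the search.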

Page 24

Difficulties with Relevance Feedback

[Figure: o = relevant documents, x = non-relevant documents, with the original query, the reformulated query, and the optimal query marked.]

Hits from the initial query are contained in the gray shaded area

Page 25

Difficulties with Relevance Feedback

[Figure: the same collection of o = relevant and x = non-relevant documents, with the original query, the reformulated query, and a candidate optimal results set marked.]

What region provides the optimal results set?

Page 26

Effectiveness of Relevance Feedback

Best when:

• Relevant documents are tightly clustered (similarities are large)

• Similarities between relevant and non-relevant documents are small

Page 27

When to Use Relevance Feedback

Relevance feedback is most important when the user wishes to increase recall, i.e., it is important to find all relevant documents.

Under these circumstances, users can be expected to put effort into searching:

• Formulate queries thoughtfully with many terms

• Review results carefully to provide feedback

• Iterate several times

• Combine automatic query enhancement with studies of thesauruses and other manual enhancements

Page 28

Adjusting Parameters 1: Relevance Feedback

q1 = α q0 + (β/n1) Σ_{i=1..n1} ri - (γ/n2) Σ_{i=1..n2} si

α, β, and γ are weights that adjust the importance of the three vectors.

If γ = 0, the weights provide positive feedback, by emphasizing the relevant documents in the initial set.

If β = 0, the weights provide negative feedback, by reducing the emphasis on the non-relevant documents in the initial set.

Page 29

Adjusting Parameters 2: Filtering Incoming Messages

D1, D2, D3, ... is a stream of incoming documents that are to be divided into two sets:

R - documents judged relevant to an information need
S - documents judged not relevant to the information need

A query is defined as the vector in the term vector space:

q = (w1, w2, ..., wn)

where wi is the weight given to term i

Dj will be assigned to R if similarity(q, Dj) > t, for some threshold t

What is the optimal query, i.e., the optimal values of the wi?
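A sketch of such a filter (illustrative; cosine similarity and the threshold name t are assumptions):

```python
import numpy as np

def route_document(q, d, t):
    """Assign incoming document vector d to R (relevant) if its
    cosine similarity to query vector q exceeds threshold t,
    otherwise to S."""
    sim = np.dot(q, d) / (np.linalg.norm(q) * np.linalg.norm(d))
    return "R" if sim > t else "S"

q = np.array([0.0, 1.0, 1.0])   # weights w1..wn of the standing query
d1 = np.array([0.0, 2.0, 1.0])  # incoming document
print(route_document(q, d1, t=0.8))  # "R": similarity is about 0.95
```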

Page 30

Seeking Optimal Parameters

Theoretical approach

Develop a theoretical model

Derive parameters

Test with users

Heuristic approach

Develop a heuristic

Vary parameters

Test with users

Machine learning approach

Page 31

Seeking Optimal Parameters using Machine Learning

General:

• Input: training examples; a design space
• Training: automatically find the solution in the design space that works well on the training data
• Prediction: predict well on new examples

Example (text retrieval):

• Input: queries with relevance judgments; the parameters of the retrieval function
• Training: find parameters so that many relevant documents are ranked highly
• Prediction: rank relevant documents high, also for new queries

Joachims

Page 32

Machine Learning: Tasks and Applications

Task: Text Routing
Application: Help-Desk Support. Who is an appropriate expert for a particular problem?

Task: Information Filtering
Application: Information Agents. Which news articles are interesting to a particular person?

Task: Relevance Feedback
Application: Information Retrieval. What are other documents relevant for a particular query?

Task: Text Categorization
Application: Knowledge Management. Organizing a document database by semantic categories.

Page 33

Learning to Rank

Assume:
• a distribution of queries P(q)
• a distribution of target rankings for each query P(r | q)

Given:
• a collection D of documents
• an independent, identically distributed training sample (qi, ri)

Design:
• a set of ranking functions F
• a loss function l(ra, rb)
• a learning algorithm

Goal:
• find f ∈ F that minimizes l(f(q), r) integrated across all queries

Page 34

A Loss Function for Rankings

For two orderings ra and rb, a pair is:

• concordant, if ra and rb agree in their ordering. P = number of concordant pairs

• discordant, if ra and rb disagree in their ordering. Q = number of discordant pairs

Loss function: l(ra, rb) = Q

Example:
ra = (a, c, d, b, e, f, g, h)
rb = (a, b, c, d, e, f, g, h)

The discordant pairs are (c, b) and (d, b), so l(ra, rb) = 2.

Joachims
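A direct sketch of this loss (illustrative; orderings are given as sequences over the same set of items):

```python
from itertools import combinations

def ranking_loss(ra, rb):
    """Number of discordant pairs Q between two orderings."""
    pos_a = {item: i for i, item in enumerate(ra)}
    pos_b = {item: i for i, item in enumerate(rb)}
    return sum(
        1
        for x, y in combinations(ra, 2)
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0
    )

ra = ("a", "c", "d", "b", "e", "f", "g", "h")
rb = ("a", "b", "c", "d", "e", "f", "g", "h")
print(ranking_loss(ra, rb))  # 2: the discordant pairs (c, b) and (d, b)
```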

Page 35

Machine Learning: Algorithms

The choice of algorithms is a subject of active research, which is covered in several courses, notably CS 478 and CS/INFO 630.

Some effective methods include:

Naive Bayes

Rocchio Algorithm

C4.5 Decision Tree

k-Nearest Neighbors

Support Vector Machine

Page 36

Relevance Feedback: Clickthrough Data

Relevance feedback methods have suffered from the unwillingness of users to provide feedback.

Joachims and others have developed methods that use clickthrough data from online searches.

Concept:

Suppose that a query delivers a set of hits to a user.

If a user skips a link a and clicks on a link b ranked lower, then the user preference reflects rank(b) < rank(a).
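A sketch of extracting these preference pairs from a click log (illustrative names; result positions are 1-based):

```python
def preference_pairs(clicked_positions):
    """Infer preference pairs from clicks, following the rule above:
    for each clicked position b, every skipped (unclicked) higher
    position a < b yields the preference rank(b) < rank(a)."""
    clicked = set(clicked_positions)
    return [
        (b, a)
        for b in sorted(clicked)
        for a in range(1, b)
        if a not in clicked
    ]

# The example on the next slide: the user clicks on results 1, 3 and 4.
print(preference_pairs([1, 3, 4]))
# [(3, 2), (4, 2)], i.e. rank(3) < rank(2) and rank(4) < rank(2)
```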

Page 37

Clickthrough Example

Ranking Presented to User:

1. Kernel Machines
   http://svm.first.gmd.de/

2. Support Vector Machine
   http://jbolivar.freeservers.com/

3. SVM-Light Support Vector Machine
   http://ais.gmd.de/~thorsten/svm_light/

4. An Introduction to Support Vector Machines
   http://www.support-vector.net/

5. Support Vector Machine and Kernel ... References
   http://svm.research.bell-labs.com/SVMrefs.html

User clicks on 1, 3 and 4

Inferred preferences: rank(3) < rank(2) and rank(4) < rank(2)

Joachims