Networks and Natural Language Processing

Presented by: Ahmed Magdy Ezzeldin

DESCRIPTION

This presentation is a brief overview of a paper about Networks and Natural Language Processing. It describes several graph-based methods and algorithms that help in syntactic parsing, lexical semantics, and other applications.

TRANSCRIPT

Page 1: Networks and Natural Language Processing

Networks and NLP

Networks and Natural Language Processing

Presented by: Ahmed Magdy Ezzeldin

Page 2: Networks and Natural Language Processing

Graphs in NLP

● Graphs are used in many NLP applications, such as:
- Text summarization
- Syntactic parsing
- Word sense disambiguation
- Ontology construction
- Sentiment and subjectivity analysis
- Text clustering

● Associative or semantic networks are used to represent language units and their relations, where the language units are the vertices (nodes) and the relations are the edges (links).

Page 3: Networks and Natural Language Processing

Networks are graphs: nodes are vertices, links are edges.

- Nodes can represent text units: words, collocations, word senses, sentences, documents
- Graph nodes do not have to be of the same category
- Edges can represent relations: co-occurrence, collocation, syntactic dependency, lexical similarity

Page 4: Networks and Natural Language Processing

Outline

● Syntax
1- Dependency Parsing
2- Prepositional Phrase Attachment
3- Co-reference Resolution

● Lexical Semantics
1- Lexical Networks
2- Semantic Similarity and Relatedness
3- Word Sense Disambiguation
4- Sentiment and Subjectivity Analysis

● Other Applications
1- Summarization
2- Semi-supervised Passage Retrieval
3- Keyword Extraction

Page 5: Networks and Natural Language Processing

Syntax

Page 6: Networks and Natural Language Processing

1- Dependency Parsing

An approach to sentence parsing

The dependency tree of a sentence is a directed subgraph of the full graph connecting all the words in the sentence.

Since every word depends on exactly one head, this subgraph is a tree.

The root of the tree is the main predicate, which takes its arguments as child nodes.

Page 7: Networks and Natural Language Processing

● (McDonald et al., 2005) built a parser that finds the highest-scoring tree using the Chu-Liu/Edmonds (CLE) maximum spanning tree (MST) algorithm for directed graphs.

● Each node picks its highest-scoring incoming edge, which yields either a spanning tree or a graph containing cycles
● CLE collapses each cycle into a single node
● CLE runs in O(n^2)

Page 8: Networks and Natural Language Processing

● When no tree covers all nodes, the two closest nodes (forming a cycle) are collapsed into a single node

Page 9: Networks and Natural Language Processing

● We repeat this step until all cycles are collapsed; an MST is then constructed by reversing the procedure and expanding the collapsed nodes.

● McDonald et al. achieved excellent results on a standard English data set and even better results on Czech (a free word order language)
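To make the MST view concrete, below is a minimal sketch (not McDonald et al.'s trained parser) that runs networkx's Chu-Liu/Edmonds implementation on a toy sentence; the edge scores are invented for illustration.

```python
# Sketch: dependency parsing as a maximum spanning arborescence (Chu-Liu/Edmonds).
# The edge scores below are invented; a real parser would learn them from data.
import networkx as nx

G = nx.DiGraph()
# (head, dependent, score) triples for the toy sentence "John saw Mary"
G.add_weighted_edges_from([
    ("ROOT", "saw", 10), ("ROOT", "John", 3), ("ROOT", "Mary", 3),
    ("saw", "John", 9), ("saw", "Mary", 8),
    ("John", "Mary", 2), ("Mary", "John", 2),
])

# Chu-Liu/Edmonds: highest-scoring tree rooted at ROOT
tree = nx.maximum_spanning_arborescence(G, attr="weight")
for head, dep in tree.edges():
    print(f"{dep} <- {head}")
```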

Page 10: Networks and Natural Language Processing

2- Prepositional Phrase Attachment

● (Toutanova et al., 2004) A preposition like "with" is attached either to the main predicate (high verbal attachment) or to the noun phrase before it (low nominal attachment):
- "I ate pizza with olives." (nominal attachment)
- "I ate pizza with a knife." (verbal attachment)

● They proposed a semi-supervised learning process in which a graph of nouns and verbs is constructed, and two words that appear in the same context are connected with an edge
● A random walk is run until convergence

● Reached 87.54% classification accuracy, which is near human performance (88.20%)

Page 11: Networks and Natural Language Processing

3- Co-reference Resolution

● Identifying relations between entity references in a text
● References can be nouns or pronouns
● Approximate the correct assignment of references to entities in a text by using a graph-cut algorithm

Method:
● A graph is constructed for each entity
● Every entity is linked to all its possible co-references with weighted edges, where the weights are the confidence of each co-reference
● Min-cut partitioning separates each entity and its co-references (a toy sketch follows)
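As a rough illustration of the graph-cut idea (not the exact construction from the paper), the sketch below partitions a few hand-written mentions with networkx's global minimum cut; all mentions and confidence weights are invented.

```python
# Sketch: partitioning candidate mentions with a minimum cut over
# co-reference confidences. Mentions and weights are invented for illustration.
import networkx as nx

G = nx.Graph()
# weight = confidence that the two mentions co-refer
G.add_edge("Obama", "he", weight=0.9)
G.add_edge("Obama", "the president", weight=0.8)
G.add_edge("he", "the senator", weight=0.3)
G.add_edge("the senator", "McCain", weight=0.9)

# Global minimum cut (Stoer-Wagner) removes the weakest links between clusters
cut_value, (cluster_a, cluster_b) = nx.stoer_wagner(G)
print(cut_value)                # total weight of the removed edges
print(cluster_a, cluster_b)     # the two groups of mentions
```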

Page 12: Networks and Natural Language Processing

Lexical Semantics

Applications: semantic analysis, machine translation, information retrieval, question answering, knowledge acquisition, word sense disambiguation, semantic role labeling, textual entailment, lexical acquisition, semantic relations

Page 13: Networks and Natural Language Processing

1- Lexical Networks

a- Unsupervised lexical acquisition (Widdows and Dorow, 2002)

Goal: build semantic classes automatically from raw corpora

Method:
● Build a co-occurrence graph from the British National Corpus where nodes are words linked by conjunctions (and/or)
● Over 100,000 nodes and over half a million edges
● Representative nouns are manually selected and put in a seed set
● The word with the largest number of links to the seed set is added to the seed set

Page 14: Networks and Natural Language Processing

Result: 82% accuracy, which is far better than previous methods.
The drawback of this method is low coverage, as it is limited to words in a conjunction relation only.
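A toy sketch of the seed-growing step described above, with a few hand-written conjunction edges standing in for the BNC-derived graph:

```python
# Sketch of the seed-growing step on a toy conjunction co-occurrence graph.
# The corpus-derived graph is replaced by a few hand-written edges.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("apple", "orange"), ("apple", "banana"), ("orange", "banana"),
    ("banana", "mango"), ("mango", "orange"),
    ("car", "truck"), ("truck", "bus"),
])

seed = {"apple", "orange"}          # manually selected representative nouns
for _ in range(2):                  # grow the class for a few iterations
    candidates = set(G) - seed
    # pick the word with the most links into the current seed set
    best = max(candidates, key=lambda w: sum(1 for n in G[w] if n in seed))
    seed.add(best)

print(seed)   # e.g. {'apple', 'orange', 'banana', 'mango'}
```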

Page 15: Networks and Natural Language Processing

1- Lexical Networks [continued]

b- Lexical Network Properties (Ferrer-i-Cancho and Sole, 2001)

Goal:
● Observe the properties of lexical networks

Method:
● Build a co-occurrence network where words are nodes, linked by an edge if they appear in the same sentence at a distance of at most 2 words
● Half a million nodes with over 10 million edges

Result:
● Small-world effect: 2-3 jumps can connect any 2 words
● The distribution of node degrees is scale-free
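The two reported properties can be checked with standard graph statistics; the sketch below does so on a small synthetic scale-free graph, since the corpus graph itself is not available here.

```python
# Sketch: the two properties reported above, measured on a toy graph.
# A real experiment would use the half-million-node corpus graph instead.
import networkx as nx

G = nx.barabasi_albert_graph(1000, 3, seed=0)   # scale-free stand-in graph

# Small-world effect: average number of jumps between two nodes
print(nx.average_shortest_path_length(G))

# Degree distribution: a heavy tail indicates a scale-free network
degrees = [d for _, d in G.degree()]
print(max(degrees), sum(degrees) / len(degrees))
```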

Page 16: Networks and Natural Language Processing

2- Semantic Similarity and Relatedness

● Methods include metrics calculated on existing semantic networks such as WordNet, applying shortest path algorithms to identify the closest semantic relation between 2 concepts (Leacock et al., 1998)

● Random walk algorithm (Hughes and Ramage, 2007)
● PageRank computes the stationary distribution over WordNet nodes, biased towards each word of an input word pair
● The divergence between these distributions is calculated to measure the words' relatedness
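A minimal sketch of the shortest-path style metrics on WordNet, using NLTK (assumes the WordNet corpus has been downloaded); lch_similarity is the Leacock-Chodorow measure.

```python
# Sketch: WordNet-based shortest-path similarity with NLTK.
# Requires the WordNet data: nltk.download("wordnet")
from nltk.corpus import wordnet as wn

cat = wn.synset("cat.n.01")
dog = wn.synset("dog.n.01")

# Inverse of the shortest path length between the two senses
print(cat.path_similarity(dog))

# Leacock-Chodorow: shortest path scaled by the depth of the taxonomy
print(cat.lch_similarity(dog))
```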

Page 17: Networks and Natural Language Processing

3- Word Sense Disambiguation

a- Label Propagation Algorithm (Niu et al., 2005)

Method:
● Construct a graph of labeled and unlabeled examples for a given ambiguous word
● Word sense examples are the nodes; weighted edges are drawn by a pairwise similarity metric
● Known labeled examples form the seed set and are assigned their correct labels (manually)
● Labels are propagated through the graph along the weighted edges
● Labels are assigned with a certain probability
● The propagation is repeated until the correct labels are assigned

Result: Performs better than SVM when only a small number of examples is provided.
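A minimal sketch of the label-propagation idea using scikit-learn; the feature vectors below are invented, whereas a real system would encode each occurrence's context.

```python
# Sketch: label propagation over labeled and unlabeled examples
# (feature vectors are invented; real ones would encode each word's context).
import numpy as np
from sklearn.semi_supervised import LabelPropagation

X = np.array([[1.0, 0.1], [0.9, 0.2],    # labeled: sense 0 (e.g. "river bank")
              [0.1, 1.0], [0.2, 0.9],    # labeled: sense 1 (e.g. "money bank")
              [0.8, 0.3], [0.3, 0.8]])   # unlabeled occurrences
y = np.array([0, 0, 1, 1, -1, -1])       # -1 marks unlabeled examples

model = LabelPropagation(kernel="rbf").fit(X, y)
print(model.transduction_)               # propagated sense labels
```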

Page 18: Networks and Natural Language Processing

b- Knowledge-based word sense disambiguation (Mihalcea et al. 2004, Sinha and Mihalcea 2007)

Page 19: Networks and Natural Language Processing

Method:
● Build a graph for a given text with all the senses of its words as nodes
● Senses are connected on the basis of their semantic relations (synonymy, antonymy, ...)
● A random walk results in a set of scores that reflects the importance of each word sense

Result:
● Superior to other knowledge-based word sense disambiguation methods that did not use graph-based representations

Follow-up work:
● The follow-up work did not use semantic relations but instead weighted the edges using a measure of lexical similarity
● This brought generality, as the method can use any electronic dictionary, not just a semantic network like WordNet

Page 20: Networks and Natural Language Processing

c- Comparative Evaluation of Graph Connectivity Algorithms (Navigli and Lapata, 2007)

● Applied to word sense graphs derived from WordNet

● Found that the best measure to use is a closeness measure

Page 21: Networks and Natural Language Processing

4- Sentiment and Subjectivity Analysis

a- Using the min-cut graph algorithm (Pang and Lee, 2004)

Method:
● Draw a graph where sentences are the nodes and edges are drawn according to sentence proximity
● Each node is assigned a score showing the probability that its sentence is subjective, using a supervised subjectivity classifier
● Use the min-cut algorithm to separate subjective from objective sentences (a sketch of the construction follows)

Results:
● Better than the supervised subjectivity classifier alone

b- By assigning subjectivity and polarity labels (Esuli and Sebastiani, 2007)

Method:
● Random walk on a graph seeded with nodes labeled for subjectivity and polarity
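Below is a rough sketch of the source/sink min-cut construction for part (a); the sentences, subjectivity scores, and proximity weights are all invented for illustration.

```python
# Sketch of a source/sink min-cut construction in the spirit of Pang and Lee (2004):
# a source "subj" and sink "obj" connect to every sentence with capacities equal to
# the classifier's subjectivity/objectivity scores; nearby sentences share
# association edges. All numbers here are invented.
import networkx as nx

G = nx.DiGraph()
scores = {"s1": 0.9, "s2": 0.6, "s3": 0.2}      # P(sentence is subjective)
for s, p in scores.items():
    G.add_edge("subj", s, capacity=p)           # pull towards the subjective side
    G.add_edge(s, "obj", capacity=1.0 - p)      # pull towards the objective side

# Proximity-based association edges (both directions, so the cut treats them symmetrically)
for a, b, w in [("s1", "s2", 0.5), ("s2", "s3", 0.5)]:
    G.add_edge(a, b, capacity=w)
    G.add_edge(b, a, capacity=w)

cut_value, (subjective, objective) = nx.minimum_cut(G, "subj", "obj")
print(subjective - {"subj"})   # sentences labeled subjective
print(objective - {"obj"})     # sentences labeled objective
```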

Page 22: Networks and Natural Language Processing
Page 23: Networks and Natural Language Processing

Other Applications

Page 24: Networks and Natural Language Processing

1- Summarization

a- (Salton et al., 1994, 1997)
● Draw a graph of the corpus where every node is a paragraph
● Lexically similar paragraphs are linked with edges
● A summary is retrieved by following paths, defined by different algorithms, that cover as much of the content of the graph as possible

b- Lexical Centrality (Erkan and Radev, 2004; Mihalcea and Tarau, 2004)

Method:
● Sentences are the nodes of the graph
● A random walk identifies the most visited nodes as central to the documents
● Duplicates or near duplicates are removed
● Sentences are selected with maximal marginal relevance
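A minimal sketch of lexical centrality: build a sentence similarity graph with TF-IDF cosine similarity and rank sentences with PageRank. The toy sentences and the similarity threshold are invented; duplicate removal and maximal marginal relevance are omitted.

```python
# Sketch of lexical centrality: rank sentences by PageRank over a
# cosine-similarity graph (toy sentences; threshold chosen arbitrarily).
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "Graphs model relations between words.",
    "Random walks rank nodes in a graph.",
    "Word graphs support many NLP tasks.",
    "The weather was sunny yesterday.",
]

sim = cosine_similarity(TfidfVectorizer().fit_transform(sentences))
G = nx.Graph()
G.add_nodes_from(range(len(sentences)))     # keep isolated sentences in the graph
for i in range(len(sentences)):
    for j in range(i + 1, len(sentences)):
        if sim[i, j] > 0.1:                 # keep only sufficiently similar pairs
            G.add_edge(i, j, weight=sim[i, j])

scores = nx.pagerank(G, weight="weight")
summary = max(scores, key=scores.get)
print(sentences[summary])                   # most central sentence
```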

Page 25: Networks and Natural Language Processing

2- Semi-supervised Passage Retrieval

● Question-Biased Passage Retrieval (Otterbacher et al., 2005): answer a question from a group of documents

Method:
● Use a biased random walk on a graph seeded with positive and negative examples
● Each node is scored according to the percentage of random walks that end at this node
● The nodes with the highest scores are central to the document set and similar to the seed nodes (see the sketch below)
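A biased random walk can be sketched as personalized PageRank; the passage graph, similarity weights, and seed choice below are invented for illustration.

```python
# Sketch: biased random walk via personalized PageRank. The passage graph,
# similarity weights and seed node are invented for illustration.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 0.6), (1, 2, 0.5), (2, 3, 0.7), (0, 3, 0.2), (1, 3, 0.4),
])

# Bias the walk towards a passage known to be relevant to the question (node 0)
bias = {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}
scores = nx.pagerank(G, personalization=bias, weight="weight")
print(sorted(scores, key=scores.get, reverse=True))   # passages ranked by relevance
```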

Page 26: Networks and Natural Language Processing

3- Keyword Extraction

● A set of terms that best describes the document

● Used in terminology extraction and the construction of domain-specific dictionaries

Page 27: Networks and Natural Language Processing

● Mihalcea and Tarau, 2004

Method:
● Build a co-occurrence graph for the input text where the nodes are the text's words
● Words are linked by a co-occurrence relation limited by the distance between words
● Run a random walk on the graph
● Words ranked as important and found next to each other are collapsed into one key phrase

Result:
● A lot better than tf.idf
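A minimal sketch of the word co-occurrence graph and random-walk ranking described above; the toy text and window size are invented, and the part-of-speech filtering used in the original work is omitted.

```python
# Sketch: keyword extraction from a word co-occurrence graph (2-word window),
# ranked with PageRank. The toy text and window size are for illustration only.
import networkx as nx

text = ("graph based ranking algorithms draw links between words "
        "and rank words by random walks on the graph")
words = text.split()

G = nx.Graph()
for i, w in enumerate(words):
    for other in words[i + 1:i + 3]:     # co-occurrence within a 2-word window
        if other != w:
            G.add_edge(w, other)

scores = nx.pagerank(G)
top = sorted(scores, key=scores.get, reverse=True)[:5]
print(top)                               # candidate keywords
```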

Page 28: Networks and Natural Language Processing

References

Networks and Natural Language Processing (Mihalcea and Radev 2008)

Dragomir Radev, University of Michigan

[email protected]

Rada Mihalcea, University of North Texas

[email protected]