
Modeling defeasible reasoning for argumentation*

Vadim Vagin
Applied Mathematics Department
National Research University "Moscow Power Engineering Institute"
Moscow, Russian Federation
[email protected]

Oleg Morosin
Applied Mathematics Department
National Research University "Moscow Power Engineering Institute"
Moscow, Russian Federation
[email protected]

Abstract — This paper describes an argumentation system that uses a defeasible reasoning mechanism. The main idea and the key points are given, together with the main algorithms for detecting conflicts and finding the statuses of arguments. Solutions of some problems that are not solvable in classical logic are presented.

Keywords — argumentation; knowledge; inconsistency; defeasible reasoning.

I. INTRODUCTION

Decision support systems often contain inconsistent and conflicting information. The methods of classical logic cannot be applied to such knowledge bases. Argumentation is a good way to deal with them. Argumentation is usually considered as the process of constructing assumptions for solving the analyzed problem. Typically, this process involves the detection of conflicts and the finding of solutions. In contrast to classical logic, in the theory of argumentation there may be arguments "for" and "against" certain assumptions. To confirm an assumption, it is necessary to show that there are more arguments "for" than "against" it. Thus, one argument is not enough to call an assumption plausible, but the superiority of arguments over counterarguments already does this.

The need for the theory of argumentation arises from the inconsistency and uncertainty of the data and knowledge that an intelligent system works with. Three types of information are used in argumentation [1].

1. Objective information: information obtained from reliable sources, or which can be directly measured or validated. For instance, the statement "In the central part of Russia the day length increases in spring" is objective information, supported by science and our observations. Such information is generally used as facts and is considered to consist of indefeasible arguments.

2. Subjective information: information derived from less reliable sources. It may consist of assumptions or judgments, often formulated using the phrases "typically," "usually," "likely". Such information serves as a "source" of contradictions and conflicts.

3. Hypothetical information: information necessary for the construction of hypotheses. Hypothetical information need not be true; very often it is false and can be a priori incorrect. However, the arguments constructed from it may be useful for other considerations. For instance, it is unlikely that the sea level will rise by one meter within the next 10 years. However, this assumption may be useful when planning the development of coastal areas with the possibility of their flooding. Lack of information is the reason why hypotheses are built and attempts to prove them are made.

All the types of information mentioned above can be used for argumentative reasoning. Objective information is the source of facts, subjective information is the source of defeasibility, and hypothetical information helps to make assumptions.

* The paper has been written with the financial support of RFBR (No. 11-07-00038a).

There are several formalizations of the argumentation theory, for instance: the abstract argumentation system proposed by P.M. Dung [2], the argumentation system of F. Lin and Y. Shoham [3], G.A.W. Vreeswijk's system [4], and systems based on defeasible reasoning (J.L. Pollock) [5], among others. All of these approaches can be divided into three types [1].

1) Abstract systems were first proposed by Dung [2] and later developed by Prakken and Sartor [6]. In these systems, arguments are considered as elements of a set with the binary relation "attacks". The internal structure of arguments and the nature of the set of such arguments are not disclosed. In addition, there are usually no mechanisms for constructing new arguments.

2) Coherence systems. In such systems, the basic strategy for handling conflicts in a knowledge base is the identification of consistent and coherent subsets of all the information available in the knowledge base. Typically, these systems are based on classical logic [1], though others exist that use modal, temporal or description logics.

3) Defeasible reasoning. The common feature of these logics is the incorporation of a defeasible implication into the language. Arguments in such systems are presented as a series of reasons leading to a conclusion, each step of which may have counterarguments.

In this paper we consider the development of a defeasible argumentation system that uses first order logic as the underlying language and is based on the theory of defeasible reasoning proposed by John Pollock [5].


II. GENERAL CONCEPTS

First, it is necessary to introduce the basic definitions and notation.

Definition 1. An argument is a pair consisting of a set of premises and a conclusion [5]. Such pairs will be written as p / X, where p is the conclusion and X is the set of premises. For example, the argument (p ∨ q)/{~A, B} means that the premises ~A, B lead to the conclusion p ∨ q. In all illustrations, we will present arguments as ovals. For arguments with an empty set of premises (such arguments are called facts), we will write only the conclusion. For example, the claim that the Earth revolves around the Sun is a fact.

Definition 2. The inference graph is a graph that shows the way new arguments are built from already existing ones. The inference graph also shows the conflicts between arguments.

Definition 3. A deductive inference rule is a simple deductive rule, meaning that if Q is inferred from P and P is true, then Q is also true. Such rules are not defeasible. We will write them as P => Q. On the inference graph, we illustrate such rules as ordinary arrows (see the arguments P and Q in fig. 1).

Definition 4. Defeasible rules of inference can be obtained, for example, by induction or abduction. In this paper, we are not interested in a specific mechanism of inference, so these rules are given to the program input in a declarative way. Arguments obtained with the help of these rules will be called defeasible. We will write such rules as M |=> N. On the graph, we illustrate these links as dashed arrows, and arguments obtained with the help of defeasible rules as double-line ovals (see the arguments M and N in fig. 1).

The notion of a conflict is the basis of reasoning.

Fig. 1. Examples of inference rules
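For concreteness, the definitions above can be given a direct data representation. The following C# sketch (C# being the paper's implementation language; all type and member names here are illustrative assumptions, not the actual code of the system) captures arguments, their statuses and the inference graph:

```csharp
// Illustrative sketch only: names are assumptions, not the paper's code.
using System.Collections.Generic;

public enum DefeatStatus { Undefeated, Defeated, ProvisionallyDefeated }

// Definition 1: an argument is a pair (conclusion, set of premises).
public class Argument
{
    public string Conclusion;                 // e.g. "Credit_risks(Russia)"
    public HashSet<string> Premises;          // empty set => the argument is a fact
    public bool ObtainedByDefeasibleRule;     // drawn as a double-line oval (Def. 4)
    public List<Argument> Basis = new List<Argument>();  // arguments it was inferred from
    public DefeatStatus Status = DefeatStatus.Undefeated;

    // Definition 8 (see below): initial arguments have no ancestors.
    public bool IsInitial { get { return Basis.Count == 0; } }
}

// Definition 2: the inference graph stores arguments, inference links
// (deductive or defeasible, via Basis) and the attacks between arguments.
public class InferenceGraph
{
    public List<Argument> Arguments = new List<Argument>();
    public List<Argument[]> Attacks = new List<Argument[]>();  // [attacker, attacked]
}
```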

A. Conflicts

As was already mentioned above, conflicts appear because of inconsistent data in a knowledge base. The detection and solving of such conflicts is the main point of an argumentation system. We will consider two types of conflicts: rebutting and undercut.

Definition 5. Rebutting is a situation when one argument rebuts the conclusion of another argument. In other words, an argument A1 rebuts an argument A2 when the conclusion of A1 rebuts the conclusion of A2 (the formal conditions in terms of unifiers are given below). Rebutting is a symmetrical form of attack.

Definition 6. Undercut is an asymmetric form of attack in which one argument undercuts the link between the premises and the conclusion of another argument. It is necessary to note that, in contrast to rebutting, there are no mechanisms for constructing such conflicts automatically, so they must be formulated manually and entered by the user. It is therefore necessary to define rules for the formalization of this type of conflict.

Definition 7. Undercutting rules. These rules of inference describe a situation when some argument undercuts a defeasible link between other arguments. For example, there may be an argument E which undercuts the link between arguments C and D. Such a rule of undercut will be written as E => (C @ D).

On the graph, undercutting arguments and the arguments affected by them, as well as rebutting arguments, are joined with thick dashed arrows. Defeated arguments are marked with a dark gray color and a white font color (see the argument D in fig. 1).

A question may arise about the necessity of this type of conflict. The most interesting property of such conflicts is that they are asymmetrical and attack the link between premises and conclusion. Let us show some examples where such rules are extremely useful for describing situations. First of all, the standard Tweety example [7] can be formulated so that the fact that Tweety is a penguin is the reason why the conclusion "Tweety flies," derived from "Tweety is a bird," is invalid. Such situations are also common in judicial practice, when there are circumstances under which this or that law should not be applied.

The detection of conflicts is the most difficult part of defeasible reasoning. The main difficulties arise from the necessity of supporting first order logic. In propositional logic it is clear that rebutting is the situation when two contradicting arguments A and ~A exist. In first order logic the main idea is the application of a unification mechanism for finding rebutting arguments. A substitution U = {t1/x1, t2/x2, …, tn/xn}, where ti is a term and xi is a variable, is called a unifier for expressions W1 and W2 if W1·U = W2·U, where W1·U is the result of replacing the variables x1, x2, …, xn by the terms t1, t2, …, tn; it is assumed that W1 and W2 do not have variables with the same names (otherwise it is necessary to rename them) [8]. For example, the two expressions P(x1)&G(x2) → H(x1,x2) and P(f(a))&G(b) → H(f(a),b) are unifiable with the unifier U = {f(a)/x1, b/x2}, while the expressions P(x) → G(x) and P(a)&G(b) are not unifiable.

Two arguments A1 = p1/X1 and A2 = p2/X2 have a conflict of the type "rebutting" if there exists a unifier U such that:

1. ~p1·U = p2·U, where ~p1 is the negation of the conclusion p1;

2. X1·U ⊆ X2·U or X2·U ⊆ X1·U.

An argument A1 is an undercutting argument for arguments A2 and A3 if:

1. There is a defeasible link between A2 and A3.

2. There is an undercutting rule E => (C @ D) and there are unifiers U1 and U2 such that:

a. E·U1 = A1·U1;

b. (C·U1)·U2 = A2·U2;

c. (D·U1)·U2 = A3·U2.


The algorithms for unification and conflict detection are presented in section III.

B. Statuses of arguments

Now let us turn to the main problem: finding the statuses of arguments. At each step of reasoning, finding the status of an argument (defeated or not) plays the main role.

Definition 8. An argument is called initial if the set of its ancestors is empty, i.e. it is set initially.

Definition 9. An argument basis is the set of arguments involved in the inference of an argument.

Let us give the definition of a function that assigns the statuses defeated or undefeated to the graph nodes [9]. The function σ assigns a provisional status to arguments, giving the value defeated or undefeated to a subset of the nodes (arguments) of the graph in such a way that:

- σ assigns the status undefeated to all initial nodes;
- σ assigns the status undefeated to a node n iff σ assigns the status undefeated to all nodes of the basis of n and σ assigns the status defeated to all nodes that attack n;
- σ assigns the status defeated to a node n iff either some node of the basis of n has the status defeated, or the node n is attacked by some node with the status undefeated.

The status assigned by σ is the final status of an argument n if σ assigns a provisional status to n and σ is not involved in defining the statuses of the other arguments related to n. To sum up, there are three available status assignments: undefeated, defeated and provisionally defeated. The first status, undefeated, means that there are no reasons for not believing the argument. The second, defeated, means that there are valuable reasons for not believing it. The third status, provisionally defeated, means that at this stage of reasoning there are no significant data for believing or denying the argument. In particular, this means that there are two (or more) arguments attacking each other (a collective defeat in Pollock's terminology [9]) and there is no information to solve this conflict.

Let us illustrate such a situation with a simple example. Suppose the weather forecast on the radio says that it is raining. But you looked out the window and saw that the sun is shining and there are no clouds in the sky. You need to go outside. The question: "Should you take an umbrella or not?" It is difficult to decide whom to believe: your eyes or the weather forecaster. Without any additional information this conflict is not solvable, so both arguments, "take an umbrella" and "do not take an umbrella," are assigned the status provisionally defeated. We will represent such arguments as pale grey ovals with a dashed border. Such arguments are considered unreliable and cannot attack other arguments. All arguments obtained by using them are also considered defeated. The inference graph of this problem is represented in fig. 2.

Fig. 2. An example of a provisional status assignment.

It is necessary to say that the statuses are not constant: they can be reassigned when new data appear (or even when new arguments are constructed from already existing data).

C. Construction of new arguments

The construction of new arguments from existing ones is a very important task. We will consider two ways of constructing new arguments. The first is the application of deductive and defeasible rules of inference; the algorithm of this application is given in section III.B. The second way is to use a mechanism of monotonic reasoning. Any complete system of inference rules can be used. Pollock proposed to use the mechanism of natural deduction [10], and it was successfully implemented in our system. This monotonic mechanism is secondary with respect to defeasible reasoning, therefore its detailed description is not presented here. However, it is necessary to point out some key moments related to the support of first order logic, which greatly extends the scope of applications of the system.

One of the tasks here is to add the means of first order logic to the monotonic subsystem. To do this we need to add some new inference rules. In the rules presented below the following notation is used: {f1,…,fn} => {g1,…,gk}, where fi, i=1,…,n are formulas containing schematic variables and gi, i=1,…,k are conclusion formulas containing schematic variables. By a schematic variable we mean any well-formed formula. This means that if we have found an argument with a conclusion formula matching the left part of a rule, we should construct a new argument with a conclusion formula matching the right part of the rule. The premise set of the argument is left unmodified by these rules. The inference rules are the following [10]:

1. The negation of the existential quantifier: ~(∃x)p => (∀x)~p.

2. The negation of the universal quantifier: ~(∀x)p => (∃x)~p.

3. The conversion of the universal quantifier: (∀x)p => sb(p, x/x1), where sb(p, x/x1) is the result of replacing the variable x in the formula p by a free variable x1 with a unique name.

4. The conversion of the existential quantifier (skolemization): (∃x)p => sb(p, x/φ(y1,…,yn)), where sb(p, x/φ(y1,…,yn)) is the result of replacing x by a Skolem function over the free variables y1, y2, …, yn in p obtained with the help of rule 3. For n = 0 the Skolem function is just a constant.

These rules allow us to deal with quantified expressions. They can be applied to arguments and to deductive and defeasible rules of inference.
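To illustrate rule 4, here is a minimal sketch of the variable-replacement step of skolemization. The string-based representation, the "sk" naming scheme and the Skolemize helper are assumptions made for brevity; a real implementation would substitute over a term structure:

```csharp
using System;
using System.Collections.Generic;

public static class Skolem
{
    private static int counter = 0;

    // Rule 4: replace the existentially quantified variable x in the body p
    // by a fresh Skolem term over the free variables y1,...,yn introduced
    // by rule 3; for n = 0 the Skolem term is just a fresh constant.
    public static string Skolemize(string body, string x, IList<string> freeVars)
    {
        counter++;
        string skolemTerm = freeVars.Count == 0
            ? "sk" + counter
            : "sk" + counter + "(" + string.Join(",", freeVars) + ")";
        // Naive textual substitution, for illustration only: it would also
        // rewrite occurrences of x inside longer identifiers.
        return body.Replace(x, skolemTerm);
    }
}

// Example: Skolemize("P(x) & G(y1, x)", "x", new List<string> { "y1" })
// returns "P(sk1(y1)) & G(y1, sk1(y1))".
```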

III. ALGORITHMS

In this section the main algorithms are formulated. All of these algorithms were successfully implemented in C# (Visual Studio 2010) in the defeasible reasoning system that was developed. An example of an application of this system, with a step-by-step description of the reasoning process, is given in section IV.


A. Unification procedure: Unify(f1, f2, boolean unifiable)

Input data: formulas f1, f2.
Output data: unifier U, boolean unifiable.

Step 1. Set k = 0; Wk = W = {f1, f2}; σ0 = {} (the empty substitution).
Step 2. If Wk consists of only one element (meaning that f1 and f2 have become equal), then the unifier exists: unifiable = true. Else go to step 3.
Step 3. Each expression in Wk is represented as a chain of symbols, and the first subexpression that is not equal in all elements of Wk is taken. Let the unequal expressions be xk and tk. If xk is a variable and tk is a term, then move to step 4. Otherwise the expressions in Wk cannot be unified: unifiable = false.
Step 4. Set σk+1 = σk ∘ {tk/xk} and Wk+1 = Wk·{tk/xk}, where σk ∘ {tk/xk} is the composition of substitutions and Wk·{tk/xk} is the result of applying the substitution {tk/xk} to the expressions in Wk.
Step 5. Set k = k + 1; go to step 2.
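A possible implementation of this procedure over a minimal term type is sketched below as a recursive-descent variant of the same disagreement-pair idea. The Term representation and all names are our assumptions, not the paper's code:

```csharp
using System.Collections.Generic;
using System.Linq;

public class Term
{
    public string Name;                          // variable, constant or function symbol
    public bool IsVar;                           // true for variables
    public List<Term> Args = new List<Term>();   // empty for variables and constants
}

public static class Unifier
{
    // Apply a substitution (variable name -> term) to t, dereferencing
    // chains of bindings; corresponds to W·U in the paper's notation.
    public static Term Apply(Term t, Dictionary<string, Term> subst)
    {
        if (t.IsVar)
            return subst.ContainsKey(t.Name) ? Apply(subst[t.Name], subst) : t;
        return new Term
        {
            Name = t.Name,
            IsVar = false,
            Args = t.Args.Select(a => Apply(a, subst)).ToList()
        };
    }

    // Try to unify a and b, extending subst: find a disagreement pair
    // (x, t) and extend the substitution by {t/x} (steps 2-5 above).
    public static bool Unify(Term a, Term b, Dictionary<string, Term> subst)
    {
        a = Apply(a, subst);
        b = Apply(b, subst);
        if (a.IsVar && b.IsVar && a.Name == b.Name) return true;
        if (a.IsVar) return BindVar(a, b, subst);
        if (b.IsVar) return BindVar(b, a, subst);
        if (a.Name != b.Name || a.Args.Count != b.Args.Count) return false;  // step 3 failure
        for (int i = 0; i < a.Args.Count; i++)
            if (!Unify(a.Args[i], b.Args[i], subst)) return false;
        return true;
    }

    private static bool BindVar(Term v, Term t, Dictionary<string, Term> subst)
    {
        if (Occurs(v.Name, t, subst)) return false;  // the variable must not occur in t
        subst[v.Name] = t;
        return true;
    }

    private static bool Occurs(string var, Term t, Dictionary<string, Term> subst)
    {
        t = Apply(t, subst);
        if (t.IsVar) return t.Name == var;
        return t.Args.Any(a => Occurs(var, a, subst));
    }
}
```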

B. Procedure of inference rule application: apply_inference_rules(Inference_list)

Input data: list of inference rules Inference_list (and the arguments existing at the current reasoning moment).
Output data: inference graph.

Step 1. For each element inf_rule from Inference_list execute:
Step 2. If inf_rule is a defeasible rule p |=> q or a deductive rule p => q, then:
Step 2.1. Take the left part of the rule, p. For each argument Arg = a/{X} that exists at the current reasoning moment execute U = unify(a, p, out bool is_unifiable).
Step 2.2. For each argument Arg that can be unified with the left part of the rule (is_unifiable = true), create a new argument New_Arg = q·U/{X·U}.
Step 2.3. If inf_rule is a defeasible rule, create an inference link between Arg and New_Arg of the defeasible type; else create an inference link between Arg and New_Arg of the indefeasible type. Add New_Arg to the inference queue.
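A sketch of this procedure, reusing the Term and Unifier types from the unification sketch above. The InfRule and Arg shapes are illustrative assumptions, and the application of the unifier to the premise set X is glossed over:

```csharp
using System.Collections.Generic;

public class InfRule
{
    public Term Left;         // p
    public Term Right;        // q
    public bool Defeasible;   // true for p |=> q, false for p => q
}

public class Arg              // term-based variant of the Argument sketch above
{
    public Term Conclusion;
    public HashSet<string> Premises = new HashSet<string>();
    public List<Arg> Basis = new List<Arg>();
    public bool ViaDefeasibleRule;
}

public static class RuleEngine
{
    // Steps 1-2.3: unify the left part of every rule with each existing
    // argument's conclusion and, on success, derive the new argument q·U/{X·U}.
    public static List<Arg> ApplyInferenceRules(List<InfRule> rules, List<Arg> existing)
    {
        var derived = new List<Arg>();
        foreach (var rule in rules)
            foreach (var arg in existing)
            {
                var u = new Dictionary<string, Term>();
                if (!Unifier.Unify(arg.Conclusion, rule.Left, u)) continue;
                derived.Add(new Arg
                {
                    Conclusion = Unifier.Apply(rule.Right, u),  // q·U
                    Premises = arg.Premises,                    // X (U not applied here)
                    Basis = new List<Arg> { arg },              // the inference link
                    ViaDefeasibleRule = rule.Defeasible
                });
            }
        return derived;
    }
}
```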

C. Conflict-finding procedure: find_conflicts(Inference_graph, undercut_rules)

Input data: inference graph, list of undercutting rules undercut_rules.
Output data: inference graph.

Step 1. For each undercutting rule uc_rule of the type p => (q @ r) from undercut_rules execute:
Step 2. Take the left part of the rule, p. For each argument Arg = a/{X} that exists at the current reasoning moment try to find a unifier Uleft = unify(a, p, out bool is_unifiable1).
Step 2.1. For each argument Arg that can be unified with the left part of the rule (is_unifiable1 = true), execute the following steps:
Step 2.2.1. Apply the unifier Uleft to q and r (qu = q·Uleft and ru = r·Uleft).
Step 2.2.2. Build an array X = [{xq1, xr1}, {xq2, xr2}, …, {xqn, xrn}] consisting of the pairs of arguments connected by a defeasible link. For each element of this array do:
Step 2.2.2.1. Execute the unification procedure for qu and xqi: Uright = unify(qu, xqi, out bool is_unifiable2). If they are unifiable, check whether the found unifier Uright also unifies ru and xri. If so, add a new conflict to the inference graph.
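The rebutting half of conflict detection can be sketched as follows, again reusing the types introduced above; the Negate helper and the premise handling are simplifying assumptions, and the undercut half (steps 1-2.2.2.1 above) is analogous:

```csharp
using System.Collections.Generic;

public class Conflict { public Arg A; public Arg B; }

public static class ConflictFinder
{
    // Wrap a term in an explicit negation node ("~" used as a functor).
    private static Term Negate(Term t)
    {
        return new Term { Name = "~", IsVar = false, Args = new List<Term> { t } };
    }

    // Condition 1: ~p1·U = p2·U for some unifier U. Either conclusion may
    // carry the explicit negation, so both directions are tried.
    private static bool Rebut(Term p1, Term p2)
    {
        return Unifier.Unify(Negate(p1), p2, new Dictionary<string, Term>())
            || Unifier.Unify(p1, Negate(p2), new Dictionary<string, Term>());
    }

    public static List<Conflict> FindRebuttals(List<Arg> args)
    {
        var conflicts = new List<Conflict>();
        for (int i = 0; i < args.Count; i++)
            for (int j = i + 1; j < args.Count; j++)
                // Condition 2: one premise set must contain the other
                // (the application of U to the premises is glossed over).
                if (Rebut(args[i].Conclusion, args[j].Conclusion) &&
                    (args[i].Premises.IsSubsetOf(args[j].Premises) ||
                     args[j].Premises.IsSubsetOf(args[i].Premises)))
                    conflicts.Add(new Conflict { A = args[i], B = args[j] });
        return conflicts;
    }
}
```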

D. Status assignment procedure: recalculate_status(Inference_graph)

Input data: initial inference graph.
Output data: final inference graph.

Step 1. Build a queue Q = Arg1, Arg2, …, Argn consisting of all arguments in the inference graph.
Step 2. While Q is not empty, take the next Argi and do:
Step 3. Build the set of arguments Arg_basisi consisting of all arguments involved in the inference of Argi. If this set is empty, then Argi is an initially entered argument and is considered undefeated. Else, if Arg_basisi is not empty and all arguments in it already have an assigned status, go to step 4; otherwise put Argi at the end of Q and go to step 2.
Step 4. If there are defeated arguments in Arg_basisi, assign the status defeated to Argi and go to step 2. Else build the set C consisting of the arguments that attack Argi and go to step 5.
Step 5. If there are undefeated arguments in C and they are not attacked by Argi, then assign the status defeated to Argi and go to step 2. Else go to step 6.
Step 6. If all arguments in C are defeated and they are not attacked by Argi, then assign the status undefeated to Argi and go to step 2. Else go to step 7.
Step 7. Execute the procedure status = check_provisional_defeat(Argi, C). Assign the statuses obtained from check_provisional_defeat(Argi, C) to Argi and to those arguments in C which are attacked by Argi.

check_provisional_defeat(Arg, C)

Input data: argument Arg, set of arguments C.
Output data: status Arg_status.

Step 1. If all elements of C are attacked by Arg, go to step 3. Else construct the subset C2 consisting of the arguments that are not attacked by Arg and go to step 2.
Step 2. If there are undefeated arguments in C2, then Arg is assigned the status defeated and Arg_status = "defeated". Else assign the status undefeated to Arg and Arg_status = "undefeated".
Step 3. For each argument CArgi in C do:
Step 3.1. Construct the set AttackersCArgi of arguments attacking CArgi. If AttackersCArgi ⊆ C ∪ {Arg}, then go to the next CArgi. Else go to step 3.2.
Step 3.2. Postpone the calculation of the status of Arg, break the cycle, and return to the recalculate_status procedure.
Step 4. This situation means that there is a set of arguments C ∪ {Arg} each element of which attacks all other elements, and these arguments are not involved in any other conflicts. The status provisionally defeated is assigned to all elements of C ∪ {Arg}.
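The following sketch compresses recalculate_status and the pairwise case of check_provisional_defeat into one loop: an argument waits in the queue until its basis and attackers have statuses, and if a full pass makes no progress, the stuck, mutually attacking arguments are marked provisionally defeated. The attackers map is an assumed input, and larger attack-closed sets are left to the full procedure above:

```csharp
using System.Collections.Generic;
using System.Linq;

public enum Status { Unknown, Undefeated, Defeated, ProvisionallyDefeated }

public static class StatusAssigner
{
    public static Dictionary<Arg, Status> Recalculate(
        List<Arg> graph, Dictionary<Arg, List<Arg>> attackers)
    {
        var status = graph.ToDictionary(a => a, a => Status.Unknown);
        var queue = new Queue<Arg>(graph);
        int stall = 0;  // number of consecutive postponements

        while (queue.Count > 0)
        {
            if (stall > queue.Count)
            {
                // Step 7, simplified: no progress in a full pass, so look for
                // stuck arguments with resolved bases that are in a mutual
                // attack (pairwise collective defeat) and mark them.
                bool marked = false;
                foreach (var a in queue)
                {
                    if (a.Basis.Any(b => status[b] == Status.Unknown)) continue;
                    var attackersOfA = attackers.ContainsKey(a) ? attackers[a] : new List<Arg>();
                    if (attackersOfA.Any(x => attackers.ContainsKey(x) && attackers[x].Contains(a)))
                    { status[a] = Status.ProvisionallyDefeated; marked = true; }
                }
                if (!marked) break;  // give up on longer cycles in this sketch
                stall = 0;
                continue;
            }

            var arg = queue.Dequeue();
            if (status[arg] != Status.Unknown) continue;  // already resolved

            var atk = attackers.ContainsKey(arg) ? attackers[arg] : new List<Arg>();
            if (arg.Basis.Any(b => status[b] == Status.Unknown) ||
                atk.Any(a => status[a] == Status.Unknown))
            {
                queue.Enqueue(arg);  // step 3: postpone until statuses are known
                stall++;
                continue;
            }

            if (arg.Basis.Any(b => status[b] != Status.Undefeated))
                status[arg] = Status.Defeated;    // step 4: defeated basis
            else if (atk.Any(a => status[a] == Status.Undefeated))
                status[arg] = Status.Defeated;    // step 5: undefeated attacker
            else
                status[arg] = Status.Undefeated;  // step 6: all attackers defeated
            stall = 0;
        }
        return status;
    }
}
```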

IV. EXAMPLE

Consider the following statements:

1) If the country recently held elections, the political situation is unstable.
2) If the political situation is unstable, a revolution is possible.
3) When there is a revolution, the bank system is usually destroyed, therefore the credit risks are very high in such countries.
4) If the country exports oil, then it usually has a strong currency.
5) In countries with a strong currency, credit risks are minimal.
6) If at the election the ruling party gained the majority of votes, then the ruling party has remained the same.
7) If the ruling party has not changed, there are no reasons for an unstable situation.
8) In Russia there were elections recently.
9) The ruling party has received the majority of votes.
10) Russia exports oil.

Let us rewrite these statements formally using the first order logic notation.

Arguments:
A1: Election(Russia, recently)
A2: Majority(Russia, ruling_party)
A3: Export_oil(Russia)
A4: (Election(x, recently) & Majority(x, ruling_party)) → Ruling_party_not_changed(x)

Inference rules:
R1: (∀x)(Election(x, recently) |=> Political_situation(x, unstable))
R2: (∀x)(Political_situation(x, unstable) |=> Revolution(x, possible))
R3: (∀x)(Revolution(x, possible) |=> Credit_risks(x))
R4: (∀x)(Export_oil(x) |=> Strong_curency(x))
R5: (∀x)(Strong_curency(x) |=> ~Credit_risks(x))
R6: (∀x)(Ruling_party_not_changed(x) => (Election(x, recently) @ Political_situation(x, unstable)))

All these formal statements are entered into the program. Let us give a step-by-step solution.

1) From the argument A1: Election(Russia, recently), using the rule R1 and the unifier U = {Russia/x}, the argument A5: Political_situation(Russia, unstable) is derived.
2) From the argument A5, using the rule R2 and the unifier U = {Russia/x}, the argument A6: Revolution(Russia, possible) is derived.
3) From the argument A6 and the rule R3, using the unifier U = {Russia/x}, the argument A7: Credit_risks(Russia) is derived.
4) From the argument A3: Export_oil(Russia), using the rule R4 and the unifier U = {Russia/x}, the argument A8: Strong_curency(Russia) is derived.
5) From the argument A8, using the rule R5 and the unifier U = {Russia/x}, the argument A9: ~Credit_risks(Russia) is derived.
6) From the argument A4, by the universal quantifier conversion, the argument A10: (Election(_x, recently) & Majority(_x, ruling_party)) → Ruling_party_not_changed(_x) is derived (where _x denotes a free variable).
7) From the argument A10, by conjunction conversion, the argument A11: Election(_x, recently) → (Majority(_x, ruling_party) → Ruling_party_not_changed(_x)) is derived.
8) From the argument A11 and the argument A1: Election(Russia, recently), using the Modus Ponens rule and the unifier U = {Russia/_x}, the argument A12: Majority(Russia, ruling_party) → Ruling_party_not_changed(Russia) is derived.
9) From the argument A12 and the argument A2: Majority(Russia, ruling_party), using the Modus Ponens rule, the argument A13: Ruling_party_not_changed(Russia) is derived.

At this point the detection of conflicts starts:

10) Using the undercutting rule R6 and the unifier U = {Russia/x}, a conflict of the type "undercut" is detected, which attacks the argument A5: Political_situation(Russia, unstable).
11) A conflict of the type "rebutting" between the arguments A7: Credit_risks(Russia) and A9: ~Credit_risks(Russia) is detected.

Now let us see what statuses of defeat will be assigned.

12) The argument A5: Political_situation(Russia, unstable) is attacked by the undefeated argument A13, so the status defeated is assigned to it.
13) The arguments A6 and A7 have a defeated argument in their basis (the argument A5), so they are also assigned the status defeated.
14) The argument A9: ~Credit_risks(Russia) is attacked by the argument A7. The argument A9 obtains the status undefeated, because the only argument attacking it (the argument A7: Credit_risks(Russia)) is defeated.

The inference graph for this example is presented in fig. 3.


Fig. 3. Inference graph for the example problem.
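For illustration, the knowledge base of this example might be entered into such a system as follows; the ASCII syntax below ("(Ax)" for the universal quantifier, "|=>" for defeasible rules, "@" for undercut) mirrors the paper's notation, but the concrete input format is our assumption, not the system's actual one:

```csharp
public static class ExampleKnowledgeBase
{
    // Initial arguments A1-A4 (three facts and one material implication).
    public static readonly string[] Arguments =
    {
        "Election(Russia, recently)",      // A1
        "Majority(Russia, ruling_party)",  // A2
        "Export_oil(Russia)",              // A3
        "(Election(x, recently) & Majority(x, ruling_party)) -> Ruling_party_not_changed(x)"  // A4
    };

    // Defeasible rules R1-R5 and the undercutting rule R6.
    public static readonly string[] Rules =
    {
        "(Ax)(Election(x, recently) |=> Political_situation(x, unstable))",    // R1
        "(Ax)(Political_situation(x, unstable) |=> Revolution(x, possible))",  // R2
        "(Ax)(Revolution(x, possible) |=> Credit_risks(x))",                   // R3
        "(Ax)(Export_oil(x) |=> Strong_curency(x))",                           // R4
        "(Ax)(Strong_curency(x) |=> ~Credit_risks(x))",                        // R5
        "(Ax)(Ruling_party_not_changed(x) => (Election(x, recently) @ Political_situation(x, unstable)))"  // R6
    };
}
```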

This example presents a problem that is not solvable in classical logic. It contains two conflicts, of the types "rebutting" and "undercut". Different rules were used for constructing arguments: rules of natural deduction, rules that helped us to cope with quantified expressions, and defeasible rules of inference. These rules were built from the textual task description and entered into the program in a formal way. All statuses of defeat were assigned correctly, and the program made the right conclusions.

V. CONCLUSION

The main problem, the implementation of an argumentation system based on the defeasible reasoning theory, was successfully solved. The developed system was tested on the benchmark tasks proposed by Gerard A.W. Vreeswijk in "Interpolation of Benchmark Problems in Defeasible Reasoning" [7]. The support of first order logic opens a wide scope of applications. Such systems can be used in many intelligent systems that use first order logic as a method of knowledge representation. For instance, they can be applied in intelligent decision support systems, including expert systems.

Nevertheless, the problem of justification degrees remains open. The implementation of different mechanisms for calculating the justification degrees of arguments is a topic of future work.

REFERENCES

[1] Philippe Besnard and Anthony Hunter, "Elements of Argumentation", MIT Press, 2008, 298 p.

[2] Bondarenko A., Dung P.M., Kowalski R.A., Toni F. An abstract, argumentation-theoretic framework for default reasoning. // Artificial Intelligence 93(1-2), 1997, pp. 63-101.

[3] Lin F., Shoham Y. Argument systems. A uniform basis for nonmonotonic reasoning. // Proc. of the First Int. Conf. on Principles of Knowledge Representation and Reasoning. San Mateo, CA: Morgan Kaufmann Publishers Inc, 1989, pp. 245-355.

[4] Vreeswijk G.A.W. “Abstract argumentation systems”. // Artificial Intelligence 1997. V. 90, pp 225-279.

[5] John L. Pollock, “How to Reason Defeasibly,” // Artificial Intelligence 57, 1992, pp. 1-42.

[6] Prakken H., Sartor G. A system for defeasible argumentation with defeasible priorities // Proc. of Int. Conf. on Formal Aspects of Practical Reasoning, Bonn, Germany: Springer Verlag, 1996.

[7] Gerard A.W. Vreeswijk “Interpolation of Benchmark Problems in Defeasible Reasoning” WOCFAI 1995, pp. 453-468.

[8] Vagin V. N., Golovina E.Ju., Zagoryanskaya A.A., Fomina M.V.: Exact and Plausible Inference in Intelligent Systems, eds. V.N. Vagin and D.A. Pospelov. 2-nd Edition. Moscow, FIZMATLIT - 712p. (2008) (in Russian).

[9] John L. Pollock "Defeasible Reasoning. Reasoning: Studies of Human Inference and its Foundations”, eds. Jonathan Adler and Lance Rips, Cambridge University Press, 2006, pp. 31.

[10] John L. Pollock “Natural Deduction” Technical Report, Department of Philosophy, University of Arizona, Tucson, 1996, 35 p.
