

SOFTWARE TESTING, VERIFICATION AND RELIABILITY
Softw. Test. Verif. Reliab. 2005; 15:235–256
Published online 15 April 2005 in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/stvr.324

Tool support for executable documentation of Java class hierarchies

Daniel Hoffman1,∗,†, Paul Strooper2 and Sarah Wilkin3

1 Department of Computer Science, University of Victoria, P.O. Box 3055, STN CSC, Victoria, B.C., Canada V8W 3P6
2 School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, Queensland 4072, Australia
3 Apple Computer, Inc., 2 Infinite Loop, MS 302-2LF, Cupertino, CA 95014, U.S.A.

SUMMARY

While object-oriented programming offers great solutions for today’s software developers, this success has created difficult problems in class documentation and testing. In Java, two tools provide assistance: Javadoc allows class interface documentation to be embedded as code comments and JUnit supports unit testing by providing assert constructs and a test framework. This paper describes JUnitDoc, an integration of Javadoc and JUnit, which provides better support for class documentation and testing. With JUnitDoc, test cases are embedded in Javadoc comments and used as both examples for documentation and test cases for quality assurance. JUnitDoc extracts the test cases for use in HTML files serving as class documentation and in JUnit drivers for class testing. To address the difficult problem of testing inheritance hierarchies, JUnitDoc provides a novel solution in the form of a parallel test hierarchy. A small controlled experiment compares the readability of JUnitDoc documentation to formal documentation written in Object-Z. Copyright © 2005 John Wiley & Sons, Ltd.

KEY WORDS: documentation; extreme programming; inheritance; Java; object-oriented; specification; automated testing

1. INTRODUCTION

In 1968, McIlroy [1] proposed a software industry based on reusable components, serving roughly the same role that VLSI chips do in the hardware industry. After 35 years, McIlroy’s vision is becoming a reality. Components in the form of class libraries now exist for many object-oriented languages. While class libraries provide valuable solutions, their use gives rise to difficult problems in both documentation and testing.

∗Correspondence to: Daniel Hoffman, Department of Computer Science, University of Victoria, P.O. Box 3055, STN CSC, Victoria, B.C., Canada V8W 3P6.
†E-mail: [email protected]

Copyright © 2005 John Wiley & Sons, Ltd.
Received 10 June 2003

Accepted 6 February 2005


Class libraries and frameworks provide large and complex application programming interfaces (APIs), making effective documentation essential for successful use. While the method names and prototypes are typically expressed in the implementation language, the method behaviour must be documented as well. Typically, this is done with brief prose descriptions, focusing on the situations that commonly arise in API use. Such documentation is inevitably imprecise and incomplete, leading to costly misunderstandings between API implementors and API users. The main alternative comes from the formal methods community, which recommends mathematically precise specifications because they can be complete and unambiguous. Unfortunately, such specifications are expensive to write and maintain. Worse, few developers are willing or able to read formal specifications.

Traditionally, unit testing has focused on individual software functions. With the advent of object-oriented technology, the unit-under-test has become the class. Inheritance provides the developer with a powerful tool for factoring out redundant code: put the common code in a superclass and handle special cases in subclasses. The resulting class hierarchies make life difficult for the tester. Ideally, the tester would develop a test hierarchy paralleling the class hierarchy. Unfortunately, manual maintenance and execution of such hierarchies is difficult and there is no tool support available.

JUnitDoc provides a solution to the problems of class documentation and testing by integrating JUnit [2] and Javadoc‡. The underlying idea is simple: embed test cases in the documentation. Typically, there are a few cases for each likely question about API behaviour. In practice, the test cases serve roughly the same role that FAQs (‘frequently asked questions’) do on many Web sites. The test cases are embedded in the code as Javadoc comments. As shown in Figure 1, JUnitDoc extracts the test cases in two forms: (1) Javadoc HTML files for class documentation and (2) JUnit drivers for class testing. GenDoc inserts the embedded test cases in Javadoc HTML files. GenDriver builds a JUnit test driver based on the test cases. GenDriver provides support for inheritance testing by automatically traversing the class hierarchy and accumulating test cases for each method. The result is a test hierarchy which parallels the class hierarchy.

The ‘FAQ approach’ to using test cases for documentation has four main benefits.

1. Precise (though partial) documentation. The test cases contain both inputs and expected outputs in executable form. Therefore, they are formal specifications of required behaviour for selected inputs.

2. Guaranteed consistency of code and documentation. A single command can run all the test cases, automatically revealing inconsistencies between actual and documented behaviour.

3. Good fault detection. While the primary purpose of the FAQ test cases is communication, they are also useful for quality assurance and regression testing. For example, the test cases can provide the kind of unit tests advocated in extreme programming [3,4].

4. Helpful examples of use. When first using an API, programmers often spend a lot of time getting the first simple example to run. The test cases provide complete, executable examples suitable for copying and editing.

An early version of the FAQ approach was presented in [5,6], including two examples, initial prototype tool support, and an informal comparison with formal API specification. In this paper,

‡http://java.sun.com/j2se/javadoc.


[Figure 1. JUnitDoc system flowchart: from C.java, GenDoc produces C.html for viewing in a browser, GenDriver produces CTest.java (compiled by javac to CTest.class), and javac compiles C.java to C.class.]

the tool support has been integrated with the Javadoc documentation and JUnit testing frameworks, and, most importantly, extended to support the testing of inheritance hierarchies. This paper contains detailed material on the use of JUnitDoc in documenting and testing inheritance hierarchies. A sample class hierarchy is introduced in Section 2, the JUnitDoc features are presented in Section 3, with a detailed example in Section 4. Section 5 presents a small controlled experiment comparing the readability of JUnitDoc documentation with a formal specification written in Object-Z [7]. Related work is presented in Section 6.

2. THE List FAMILY

This section presents a small class hierarchy for use in illustrating the JUnitDoc features. Also included is a test driver developed manually and used for comparison with drivers developed with JUnitDoc support.

2.1. Three classes

In Section 4, the detailed application of JUnitDoc to three classes from the Java libraries§ is discussed. The AbstractList class specifies 34 methods which provide access to indexed collections.

§http://java.sun.com/.


public class AbstractList extends AbstractCollection implements List {
    void add(int index, Object element) {...}
    void add(Object element) {...}
    List subList(int fromIndex, int toIndex) {...}
    // ...
}

public class LinkedList extends AbstractList {
    public Object removeFirst() {...}
    // ...
}

public class ArrayList extends AbstractList {
    public void ensureCapacity(int minCapacity) {...}
    // ...
}

Figure 2. AbstractList, LinkedList, and ArrayList method prototypes.

Elements can be inserted at, or retrieved from, a particular position, or searched for by value. The examples use three methods from AbstractList, whose prototypes are shown in Figure 2: add(i, e) inserts element e at position i, shifting subsequent elements to the right, add(e) appends element e at the end of the list, and subList(fromIndex, toIndex) extracts the sublist from position fromIndex to position toIndex, leaving the original list unchanged. As in the C++ Standard Template Library [8], the element at fromIndex is included in the extracted sublist but the element at toIndex is not.

As its name suggests, AbstractList is an abstract class and cannot be instantiated. The LinkedList class extends AbstractList, using a linked list as the underlying data structure and providing six additional methods. The examples use one of these methods: removeFirst() removes and returns the first element from the list. The ArrayList class also extends AbstractList, using an array as the underlying data structure. One additional method is provided: ensureCapacity(n) increases the size of the underlying array, if necessary, to ensure that it can store at least n elements.
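The behaviour described above can be exercised directly against the standard java.util classes. The following minimal sketch (the class name ListFamilyDemo and the slice helper are ours, not from the paper) demonstrates the inclusive-from/exclusive-to convention of subList, plus the subclass-specific methods removeFirst and ensureCapacity:

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ListFamilyDemo {
    // Copies list[from, to) into a new list, mirroring subList's
    // inclusive-fromIndex / exclusive-toIndex convention.
    static List<String> slice(List<String> list, int from, int to) {
        return new ArrayList<>(list.subList(from, to));
    }

    public static void main(String[] args) {
        LinkedList<String> list = new LinkedList<>();
        list.add("a"); list.add("b"); list.add("c");   // ["a", "b", "c"]
        System.out.println(slice(list, 1, 3));         // [b, c]: element at fromIndex kept, index 3 excluded
        System.out.println(slice(list, 1, 1));         // []: empty when fromIndex == toIndex
        System.out.println(list.removeFirst());        // a (LinkedList-only method)
        ArrayList<String> array = new ArrayList<>(list);
        array.ensureCapacity(100);                     // ArrayList-only; contents unchanged
        System.out.println(array);                     // [b, c]
    }
}
```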

2.2. A simple driver

A simple Java driver for the LinkedList.subList method serves to motivate the JUnitDoc tool presented in the next section. Following the FAQ approach, the driver is organized around typical questions that users might have about the behaviour of subList:

1. What are the legal values for fromIndex and toIndex?
2. What sublist is extracted for the legal values?

The driver in Figure 3 creates a list with three elements and then uses eight test cases to answer the questions. For question 1, cases 1–4 show that fromIndex must not be negative, and can be less than or equal to toIndex, but must not exceed toIndex. Cases 5 and 6 show that toIndex may equal but must not exceed the list size. The second question is answered with two cases. Case 7 shows that, if fromIndex = toIndex, the empty list is extracted. Case 8 shows the normal case, where the extracted list includes the element at fromIndex but not the one at toIndex.

While the driver in Figure 3 is systematic, it is clumsy as a communication mechanism. The reader must jump back and forth between the driver code and the driver output to determine the behaviour for each case. The driver could be augmented to include code to perform additional checking, but this would make it bulky and even less suitable for documentation purposes. Finally, it is not clear how to add test cases for ArrayList without significant duplication, making driver maintenance expensive.

The next section shows how JUnitDoc can be used to overcome these problems.

3. THE JUnitDoc TOOL

JUnitDoc is based on Javadoc, a tool for generating class documentation from code comments. Javadoc comments begin with ‘/**’ (rather than just ‘/*’) and can precede each class, interface, field, and method declaration. The Javadoc tool generates HTML documentation containing the declarative portions of the code with the comments interleaved. Javadoc-generated HTML is now the accepted means for documenting both the Java standard libraries and custom code.

To control the appearance of the generated documentation, Javadoc comments typically contain HTML commands. Because these commands are often repetitive, Javadoc provides ‘tags’: a mechanism for generating the HTML for commonly occurring items. A tag is identified by a leading ‘@’. For example, the @param and @return tags generate HTML to format method parameter and return value comments. The Javadoc taglet API allows a user to introduce new tags and to write Java code to generate the desired HTML commands. By default, Javadoc generates HTML. For users who prefer a different format, e.g. XML, the doclet API gives users access to the class, field, and method declarations of a Java file, and to the Javadoc comments associated with each tag.
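For readers unfamiliar with the tag syntax, a conventional Javadoc comment using @param and @return looks like the sketch below (the Clamp class is a made-up illustration, not from the paper):

```java
public class Clamp {
    /**
     * Restricts a value to the range [lo, hi].
     *
     * @param value the value to clamp
     * @param lo    the lower bound (inclusive)
     * @param hi    the upper bound (inclusive)
     * @return lo if value is below lo, hi if value is above hi, otherwise value
     */
    public static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(value, hi));
    }
}
```

Running the javadoc tool over this file turns the comment into formatted HTML, with the @param and @return entries rendered as labelled sections.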

As summarized in Figure 1, JUnitDoc uses taglets and doclets to process test cases embedded in Javadoc comments. GenDoc is a taglet which inserts the test cases in Javadoc HTML files. GenDriver is a doclet which builds a JUnit test driver based on the comments. With both tools, the @testcases tag is used to identify the statements to be processed.

3.1. Test case templates

Test case templates provide a succinct way of specifying test cases and support the FAQ approach in two ways.

1. Each template contains both method calls and expected behaviour, improving readability and eliminating the need for a log file.

2. Test cases contain a lot of similar code, which is generated by JUnitDoc. Consequently, JUnitDoc templates are compact, usually ‘one-liners’.

Each template is embedded in a Javadoc comment, and identified by keywords preceded by the # character, tagged for manipulation by the JUnitDoc preprocessor. There are two types of templates:


import java.util.*;
public class SubListTest {
    public static void main(String[] args) {
        LinkedList a = new LinkedList();
        a.add("a"); a.add("b"); a.add("c");

        // ***** What are the legal values for fromIndex and toIndex?
        try { a.subList(-1,1); } // CASE 1
        catch (Exception x) { System.out.println("1: " + (x.getClass()).getName()); }

        try { a.subList(0,1); } // CASE 2
        catch (Exception x) { System.out.println("2: " + (x.getClass()).getName()); }

        try { a.subList(1,1); } // CASE 3
        catch (Exception x) { System.out.println("3: " + (x.getClass()).getName()); }

        try { a.subList(2,1); } // CASE 4
        catch (Exception x) { System.out.println("4: " + (x.getClass()).getName()); }

        try { a.subList(1,3); } // CASE 5
        catch (Exception x) { System.out.println("5: " + (x.getClass()).getName()); }

        try { a.subList(1,4); } // CASE 6
        catch (Exception x) { System.out.println("6: " + (x.getClass()).getName()); }

        // ***** What subList is extracted for the legal values?
        System.out.println("size: " + a.subList(1,1).size()); // CASE 7

        System.out.println("size: " + a.subList(1,3).size()); // CASE 8
        System.out.println("at position 0: " + a.subList(1,3).get(0));
        System.out.println("at position 1: " + a.subList(1,3).get(1));
    }
}

(a)

1: java.lang.IndexOutOfBoundsException
4: java.lang.IllegalArgumentException
6: java.lang.IndexOutOfBoundsException
size: 0
size: 2
at position 0: b
at position 1: c

(b)

Figure 3. subList driver source code (a) and output (b).


value-checking and exception-monitoring. The form of a value-checking test case is

#valueCheck actval # expval [# valueType] #end

where actval (actual value) and expval (expected value) are expressions of the same type. JUnitDoc generates code to compare actval and expval while monitoring the exception behaviour. An error message is displayed if actval and expval are different or if an exception is thrown during the comparison. In either case, driver execution continues. For example, the following template could replace case 7 in Figure 3:

#valueCheck a.subList(1,1).size() # 0 #end
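The paper shows generated code only for an #excMonitor template (Figure 4). By analogy, the code GenDriver emits for the #valueCheck template above might look roughly like the sketch below; the method and class names are ours, fail is a stand-in for JUnit's method, and the exact expansion is a guess modelled on Figure 4, not taken from the paper:

```java
import java.util.LinkedList;
import java.util.List;

public class ValueCheckExpansion {
    // Stand-in for JUnit's fail(); a real GenDriver driver would use JUnit's own.
    static void fail(String message) { throw new AssertionError(message); }

    // Guessed expansion of: #valueCheck a.subList(1,1).size() # 0 #end
    static void checkSubListSize(List<String> a) {
        try {
            int actual = a.subList(1, 1).size();
            if (actual != 0) {
                fail("Actual value: " + actual + " Expected value: 0");
            }
        } catch (Throwable exception) {
            fail("Unexpected exception: " + exception.getClass().getName());
        }
    }

    public static void main(String[] args) {
        List<String> a = new LinkedList<>(List.of("a", "b", "c"));
        checkSubListSize(a); // passes silently, as a real driver case would
    }
}
```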

To add flexibility, JUnitDoc supports user-defined comparison for #valueCheck templates. For a standard type, such as int or String, JUnitDoc uses its own comparison methods to compare actval and expval. The tester can override a comparison method, or provide a comparison method for a new type, by specifying a valueType field in a value-checking test case. The tester then has to provide, in class valueType, one method for comparing values of that type and another for printing an error message if the two values are not equal. This can, for example, be used to provide case-insensitive comparison for strings (the normal comparison is case-sensitive).
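As a concrete illustration, a valueType class for case-insensitive string comparison might look like the sketch below. The class and method names (IgnoreCaseString, isEqual, printError) and the template usage are our assumptions; the paper does not give the exact interface JUnitDoc requires.

```java
// Hypothetical valueType class, imagined for use in a template such as:
//   #valueCheck name.getLabel() # "total" # IgnoreCaseString #end
// (method names are assumed, not taken from the paper)
public class IgnoreCaseString {
    // Comparison method: values are equal if the strings match ignoring case.
    public static boolean isEqual(String actual, String expected) {
        return actual != null && actual.equalsIgnoreCase(expected);
    }

    // Error-reporting method, called when the comparison fails.
    public static void printError(String actual, String expected) {
        System.out.println("value check failed: actual \"" + actual
                + "\" != expected \"" + expected + "\" (ignoring case)");
    }
}
```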

To support the testing of exceptions, the general form of an exception-monitoring test case is

#excMonitor action [# expexc [# excType]] #end

where action is a fragment of Java code and expexc is a Java exception object. The generated code executes action while monitoring the exception behaviour. An error message is displayed if expexc is not thrown or if another exception is thrown. For example, the following template could replace case 1 in Figure 3:

#excMonitor a.subList(-1,1); # new IndexOutOfBoundsException() #end

In an exception-monitoring template, expexc can be omitted, in which case an error message is printed if any exception is thrown. JUnitDoc also supports user-defined comparison for #excMonitor templates. By default, two exceptions are considered equal if the exception objects are of the same type. The tester can override this comparison by providing custom comparison and error message methods, in class excType.

The code generated by JUnitDoc is straightforward but tedious to write and maintain manually. For example, Figure 4 shows the code generated by the #excMonitor test case just presented.

To support encapsulation, class designers typically do not allow class users, including class testers, to stimulate and observe object state directly. Instead, the state is accessible only using public methods. Occasionally special test methods or fields are supplied, allowing the tester to access the state directly. While this special code violates encapsulation, the white box testing it supports can be valuable. JUnitDoc supports such white box testing by allowing any accessible method or field to appear in test templates.

3.2. Inheritance

In object-oriented unit testing, most public methods will have an associated block of test cases. When the method is inherited, it is not obvious where to write the test cases. If the tests are


try {
    list.subList(-1,1);
} catch (Throwable exception) {
    if (BoundsException.getClass() != exception.getClass()) {
        fail("Actual exception: " + exception.getClass().getName()
            + " Expected exception: " + BoundsException.getClass().getName());
    }
}

Figure 4. JUnitDoc generated code for an #excMonitor template.

repeated in each subclass, the redundancy makes maintenance tedious and error-prone. For example, both LinkedList and ArrayList have an add method. Since the method specifications are identical, the add test cases must be duplicated. If instead the tests are written in the superclass, different problems arise. Because AbstractList is uninstantiable, no object is available for test execution. Furthermore, implementations in a subclass will not be tested by tests in the superclass. JUnitDoc solves these problems by helping the tester develop a test hierarchy paralleling the class hierarchy.

In JUnitDoc, test cases may appear above a method or a class, areas respectively named the method block and class block. A block can contain any legal Java statement, including methods and inner classes. The @testcases blocks are extracted for two purposes: documentation and testing. As documentation, they are transformed with the Javadoc tool and the GenDoc taglet into HTML. The contents of the test case blocks are restricted to the class they came from; inheritance plays no part here. For the test driver, all @testcases blocks are extracted with GenDriver into one test class. In other words, for each subclass the test driver contains the collected test code of all the superclasses of that class. The order of the code and visibility are dependent on the class hierarchy.

For each method m(paramList) with a test block, a test method called testm paramList is added to the driver. Extraction proceeds top down from the source, so the order of test methods follows that of the original structure. The concatenated test method name ensures uniqueness among overloaded class methods. GenDriver extracts class blocks to a location in the driver so that they are globally visible to the rest of the driver.

The power of JUnitDoc manifests itself with inherited statements. Consider a subclass Child and its superclass Parent. When GenDriver is run on Child, it will include every class and method block from both classes in the driver, in superclass–subclass order. For example, Parent’s class block immediately precedes Child’s. Similarly, for a method with signature m(paramList), the Parent method block precedes the Child method block in the generated test.

As the inherited behaviour is not always desired, JUnitDoc provides an option to suppress it for selected method blocks. For a method with signature m(paramList) and method blocks in both Child and Parent, only the Child method block is used. If the method is in the Parent but not the Child, or only the Parent has method blocks, it is still inherited.


4. BUILDING TEST HIERARCHIES WITH JUnitDoc

Inheritance allows a programmer to factor out common code and place it in a superclass. Inheritance can be applied to method implementations and also to method declarations. AbstractList, for example, contains the declarations for all but a few of the methods implemented in the LinkedList and ArrayList classes. The AbstractList developers determined the method prototypes and behaviour for these inherited methods. The LinkedList and ArrayList developers provided implementations to satisfy the AbstractList specifications, and added a few new methods as well.

With JUnitDoc, the tester builds a test hierarchy which parallels the inheritance hierarchy. As in the inheritance hierarchy, common test cases are factored out, typically appearing as high in the hierarchy as possible. As programmers familiar with inheritance, the current authors have found that the JUnitDoc methodology is easy to grasp.

4.1. Methodology

JUnitDoc test hierarchies have one node for each class in the inheritance hierarchy. For class C, the corresponding node in the test hierarchy consists of the class and method blocks in C.java. For each node, four kinds of code are considered.

• Test assisters. These are utility methods and variables shared between the test blocks. This code is usually written in the class block so that it is accessible in all the test blocks.

• Test cases for methods introduced by this class. These test cases thoroughly document and test the new methods. Careful attention is given to boundary conditions and exceptions.

• Test cases for methods inherited and modified. For subclass methods that add functionality to the superclass methods, additional test cases are written. For methods that change the functionality, it may be necessary to suppress the test method block from the superclass and override it in the subclass.

• Test cases for methods inherited without change. While the inherited tests are usually sufficient, sometimes new test cases are written to exercise a particular implementation. For example, suppose that LinkedList is implemented as a singly linked list. Then, because removing the last element in a linked list is tricky, test cases might be added to the remove method to focus on the last element.

While the methodology is described in terms of classes, JUnitDoc test cases can be attached to interfaces as well.

4.2. Test cases

AbstractList test cases

• Test assisters. As shown in Figure 5, the AbstractList class block introduces four variables. The first, list, is initialized to null because AbstractList cannot be instantiated. In each concrete subclass, list must be reassigned. The remaining three variables in the AbstractList class block create exception objects to improve readability in the test cases to follow.


 * ...
 * @testcases
 *   AbstractList list = null; // must be instantiated in concrete subclass
 *   Exception BoundsException = new IndexOutOfBoundsException();
 *   Exception NoElementException = new NoSuchElementException();
 *   Exception ArgumentException = new IllegalArgumentException();
 */
public abstract class AbstractList {
    ...
     * @testcases
     *   list.clear();
     *   // Where can we add?
     *   list.add("0"); list.add("1");
     *   #excMonitor list.add(-1,"a"); # BoundsException #end
     *   #excMonitor list.add(0,"a"); #end
     *   #valueCheck list.remove(0) # "a" #end
     *   #excMonitor list.add(list.size(),"a"); #end
     *   #valueCheck list.remove(list.size()-1) # "a" #end
     *   #excMonitor list.add(list.size()+1,"a"); # BoundsException #end
     */
    public void add(int index, Object element) { .. }

    ...
     * @testcases
     *   list.clear();
     *   list.add("a"); list.add("b"); list.add("c");
     *
     *   // What are the legal values for fromIndex and toIndex?
     *   #excMonitor list.subList(-1,1); # BoundsException #end
     *   #excMonitor list.subList(0,1); #end
     *   #excMonitor list.subList(1,1); #end
     *   #excMonitor list.subList(2,1); # ArgumentException #end
     *   #excMonitor list.subList(1,3); #end
     *   #excMonitor list.subList(1,4); # BoundsException #end
     *
     *   // What subList is extracted for the legal values?
     *   #valueCheck list.subList(1,1).size() # 0 #end
     *   #valueCheck list.subList(1,3).size() # 2 #end
     *   #valueCheck list.subList(1,3).get(0) # "b" #end
     *   #valueCheck list.subList(1,3).get(1) # "c" #end
     */
    public List subList(int fromIndex, int toIndex) { .. }
}

Figure 5. Test cases for AbstractList.


add

public void add(int index, Object element)

    Inserts the specified element at the specified position in this list (optional operation).
    Shifts the element currently at that position (if any) and any subsequent elements to the
    right (adds one to their indices).

    This implementation always throws an UnsupportedOperationException.

    Specified by:
        add in interface List

    Parameters:
        index - index at which the specified element is to be inserted.
        element - element to be inserted.

    Throws:
        UnsupportedOperationException - if the add method is not supported by this list.
        ClassCastException - if the class of the specified element prevents it from being added to this list.
        IllegalArgumentException - if some aspect of the specified element prevents it from being added to this list.
        IndexOutOfBoundsException - index is out of range (index < 0 || index > size()).

    Test Cases:

        list.clear();
        // Where can we add?
        list.add("0"); list.add("1");
        #excMonitor list.add(-1,"a"); # BoundsException #end
        #excMonitor list.add(0,"a"); #end
        #valueCheck list.remove(0) # "a" #end
        #excMonitor list.add(list.size(),"a"); #end
        #valueCheck list.remove(list.size()-1) # "a" #end
        #excMonitor list.add(list.size()+1,"a"); # BoundsException #end

Figure 6. Documentation for AbstractList.add.

• Test cases for methods introduced by this class. The test cases for two of the 34 AbstractList methods are shown in Figure 5. The generated documentation is shown in Figure 6. The cases for add focus on the index boundaries. First the list is cleared and two elements are added. Valid indexes for adding to a list start at 0, so by adding at −1, one past the boundary is tested. An #excMonitor template wrapping the call shows a BoundsException is expected. Next the extremes of valid add indexes are tested: 0 and list.size(). After the elements are added, the object is removed, in a #valueCheck template, to check it has the correct value. Adding at list.size()+1 is tested identically to the −1 test. The tests for subList are direct translations of cases 1–8 from the Java driver shown in Figure 3.


 * ...
 * @testcases
 *   list = new LinkedList();
 *   LinkedList llist = (LinkedList) list;
 */
public class LinkedList implements List {
    ...
     * @testcases
     *   llist.clear();
     *   // What can we legally remove?
     *   llist.add("0"); llist.add("1");
     *   #valueCheck llist.removeFirst() # "0" #end
     *   #valueCheck llist.removeFirst() # "1" #end
     *   #excMonitor llist.removeFirst(); # NoElementException #end
     */
    public Object removeFirst() {
        ...
    }
    ...

    ...
     * @testcases
     *   llist.clear();
     *   // Can null be added to the list?
     *   #excMonitor llist.add(0,null); #end
     */
    public void add(int index, Object element) {
        ...
    }
    ...
}

Figure 7. Test cases for LinkedList.

• Test cases for methods inherited and modified. There are no such methods.
• Test cases for methods inherited without change. There are no such methods.
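The add boundary cases described above can be sketched as plain Java, using java.util.LinkedList as a stand-in for the class under test (the paper's BoundsException corresponds to the standard IndexOutOfBoundsException here; class and method names in this sketch are illustrative, not JUnitDoc's generated code):

```java
import java.util.LinkedList;
import java.util.List;

public class AddBoundarySketch {
    // Returns true iff add(index, "a") throws IndexOutOfBoundsException.
    static boolean throwsBounds(List<String> list, int index) {
        try {
            list.add(index, "a");
            return false;
        } catch (IndexOutOfBoundsException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        List<String> list = new LinkedList<>();
        list.add("0"); list.add("1");

        // One past the lower boundary: must throw.
        if (!throwsBounds(list, -1)) throw new AssertionError("-1 accepted");

        // The extremes of the valid range, 0 and size(), must succeed;
        // removing the element confirms it landed where expected.
        list.add(0, "a");
        if (!list.remove(0).equals("a")) throw new AssertionError("front");
        list.add(list.size(), "a");
        if (!list.remove(list.size() - 1).equals("a")) throw new AssertionError("back");

        // One past the upper boundary: must throw.
        if (!throwsBounds(list, list.size() + 1)) throw new AssertionError("size+1 accepted");

        System.out.println("all add boundary cases passed");
    }
}
```

Each #excMonitor and #valueCheck template in Figure 6 expands to a check of roughly this shape.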

LinkedList test cases

• Test assisters. As shown in Figure 7, the list reference is initialized to point to a LinkedList object in the LinkedList class block. A second reference variable, llist, references the same object but as type LinkedList to avoid compiler errors.

• Test cases for methods introduced by this class. One method LinkedList adds to those inherited from AbstractList is removeFirst. To check that the object removed was the first in the list, two objects are added and removeFirst is called in a #valueCheck template. The call is repeated, testing removal of the remaining object. With the list empty, the third call to removeFirst should throw NoElementException.


• Test cases for methods inherited and modified. LinkedList.add supplements the cases from AbstractList.add by adding a test for null elements (null elements are illegal in AbstractList but permitted in LinkedList). The test shows that null can be added to a LinkedList by leaving out an expected exception in the #excMonitor; any thrown exception causes the case to fail.

• Test cases for methods inherited without change. The test cases for the remaining 33 AbstractList methods are left unchanged.
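The two LinkedList-specific cases above can be sketched as plain Java against java.util.LinkedList (NoElementException in the paper plays the role of the standard NoSuchElementException here; the sketch is illustrative, not JUnitDoc's generated driver):

```java
import java.util.LinkedList;
import java.util.NoSuchElementException;

public class LinkedListCasesSketch {
    // removeFirst must return elements in insertion order, then throw.
    static String removeFirstCase() {
        LinkedList<String> llist = new LinkedList<>();
        llist.add("0"); llist.add("1");
        if (!llist.removeFirst().equals("0")) return "wrong first";
        if (!llist.removeFirst().equals("1")) return "wrong second";
        try {
            llist.removeFirst();              // list is now empty
            return "no exception on empty list";
        } catch (NoSuchElementException expected) {
            return "ok";
        }
    }

    // The inherited-and-modified add case: null must be accepted silently,
    // mirroring an #excMonitor with no expected exception.
    static boolean nullAccepted() {
        LinkedList<String> llist = new LinkedList<>();
        try {
            llist.add(0, null);               // any thrown exception fails the case
            return llist.get(0) == null;
        } catch (RuntimeException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(removeFirstCase());
        System.out.println(nullAccepted());
    }
}
```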

ArrayList test cases

• Test assisters. As for LinkedList, the list reference is instantiated and a reference of type ArrayList is introduced.

• Test cases for methods introduced by this class. Test cases are added for the ensureCapacity method.

• Test cases for methods inherited and modified. There are no such methods.
• Test cases for methods inherited without change. The test cases for the 34 AbstractList methods are left unchanged.

4.3. Discussion

The benefits drawn from testing with JUnitDoc are similar to those gained in development when switching from a flat structure to a class hierarchy. A superclass provides tests that can be inherited, reducing duplicated code and improving maintainability. In addition, since the tests are specification based, they can often be written before the implementation. For example, an abstract method or interface method cannot have a method body. Full test code, however, can be written alongside the function prototype, supplementing the documentation and forcing implementors to provide the correct behaviour, not just the correct method prototype. This compliance can be checked automatically, for all the subclasses, just by generating and running the JUnitDoc driver.
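One common way to realize such a parallel test hierarchy is a test superclass that holds the specification-based cases and defers construction to a factory method, so each subclass test only supplies its own instance. A minimal sketch, with illustrative names (the paper's JUnitDoc generates the equivalent automatically):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

abstract class ListContractTest {
    // Each subclass test supplies the concrete instance under test.
    abstract List<String> createList();

    // Written once in the superclass, inherited by every subclass test.
    boolean passesAddGetContract() {
        List<String> list = createList();
        list.add("a");
        return list.size() == 1 && list.get(0).equals("a");
    }
}

class LinkedListTest extends ListContractTest {
    List<String> createList() { return new LinkedList<>(); }
}

class ArrayListTest extends ListContractTest {
    List<String> createList() { return new ArrayList<>(); }
}

public class HierarchySketch {
    public static void main(String[] args) {
        // The same inherited contract check runs against both subclasses.
        System.out.println(new LinkedListTest().passesAddGetContract());
        System.out.println(new ArrayListTest().passesAddGetContract());
    }
}
```

Running the inherited cases against every subclass is exactly the compliance check described above.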

When writing test cases, either JUnit methods or test templates may be used. From a coding perspective, the difference between writing a value check using templates or JUnit is negligible. For example, the following template

#valueCheck list.remove(0) # "a" #end

can be rewritten in JUnit syntax as

assertEquals(list.remove(0), "a");

JUnitDoc value check templates do provide one important benefit. JUnit's assert statements assume that all classes provide an equals method. Without one, objects are compared by reference. Even if provided, the default equals method in a class may be unsuitable for testing. With test templates, a custom comparison routine can be written for any class while leaving the original class unmodified.
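The problem and its template-side remedy can be sketched as follows: a class whose equals is inherited from Object compares by reference, and the test supplies its own comparison routine without touching the production class (the class and routine names here are hypothetical):

```java
public class CustomCompareSketch {
    // A production class that never overrides equals: comparisons fall
    // back to Object.equals, i.e. reference identity.
    static class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // The test-side comparison routine, kept out of the production class.
    static boolean samePoint(Point a, Point b) {
        return a.x == b.x && a.y == b.y;
    }

    public static void main(String[] args) {
        Point p = new Point(1, 2), q = new Point(1, 2);
        System.out.println(p.equals(q));     // false: reference comparison
        System.out.println(samePoint(p, q)); // true: field-by-field check
    }
}
```

An assertEquals(p, q) on such a class would fail spuriously; a value check template wired to samePoint would not.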

Exception checking is far simpler with JUnitDoc than with JUnit alone. Figure 8(a) shows the JUnit equivalent of the following template:

#excMonitor list.subList(-1,1); # BoundsException #end


try {
    list.subList(-1,1);
} catch (Throwable exception) {
    if (BoundsException.class != exception.getClass()) {
        fail("Actual exception: " + exception.getClass().getName() +
             " Expected exception: " + BoundsException.class.getName());
    }
}

(a)

try {
    list.subList(-1,1);
    fail("Should raise an IndexOutOfBoundsException");
} catch (IndexOutOfBoundsException e) {
    // pass
}

(b)

Figure 8. Testing for an exception in JUnit: (a) template equivalent; (b) JUnit FAQ suggested style.

At eight lines, the JUnit version is clumsy to maintain as test code and useless as documentation. The issue of exception testing has been addressed in the JUnit FAQ: 'How do I implement a test case for a thrown exception?' [9]. One proposed solution is shown in Figure 8(b). With this method, the failure text must be maintained to reflect the exception being checked, and the failure message omits the name of the actual exception thrown. Further, an #excMonitor template can use custom exception comparison. For example, sometimes it is necessary to check messages or other parameters passed with an exception. Also, in cases where a method ambiguously throws different exceptions, a custom comparison routine could accept any one of them as correct.
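The custom exception comparison idea can be sketched as a small helper that takes the call and a predicate over the thrown exception, so a test can match on type, message, or any of several permitted exceptions (a hypothetical helper; JUnitDoc's generated monitors differ in detail):

```java
import java.util.LinkedList;
import java.util.List;
import java.util.function.Predicate;

public class ExcMonitorSketch {
    // Runs the call; the case passes only if an exception is thrown
    // and the custom comparison accepts it.
    static boolean expectException(Runnable call, Predicate<Throwable> ok) {
        try {
            call.run();
            return false;                  // no exception: case fails
        } catch (Throwable t) {
            return ok.test(t);             // custom comparison decides
        }
    }

    public static void main(String[] args) {
        List<String> list = new LinkedList<>();
        // Rough equivalent of:
        //   #excMonitor list.subList(-1,1); # BoundsException #end
        boolean pass = expectException(
            () -> list.subList(-1, 1),
            t -> t instanceof IndexOutOfBoundsException);
        System.out.println(pass ? "pass" : "fail");
    }
}
```

Accepting any one of several exceptions is then just a disjunction in the predicate.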

This section has described the use of JUnitDoc for test cases that follow the FAQ approach: each test case is aimed at both documentation and quality control. It is important to note that JUnitDoc can be effective simply as a test tool. The tester gains significant power from the ability to build a test hierarchy supporting a simple kind of inheritance.

5. JUnitDoc VERSUS Object-Z: A CONTROLLED EXPERIMENT

While the previous section focused primarily on the use of JUnitDoc in testing, this section presents a small controlled experiment comparing the readability of FAQ documentation with an Object-Z [7] specification of a Dependency Management System. The two characteristics that are compared are accuracy (how well the readers understand the specification) and cost (how much time it takes the readers to understand the specification).


The experiment was designed and implemented as part of an honours project by a student who was not familiar with the FAQ approach prior to the project. Because the experiment was carried out before the JUnitDoc tool was fully developed, the documentation is presented as it was used in the experiment, which differs slightly from the format presented in the previous sections. The authors believe that the change in format would not significantly change the results of the experiment.

5.1. Experimental design

To limit the influence of the 'learning effect', the experiment used a post-test two-group randomized design [10]: the participants were randomly split into two groups and each group was presented with one of the two specifications and a list of questions to be answered. The questions were the same for both groups.

5.1.1. Hypotheses

The following two hypotheses were tested in the experiment.

• H1: the use of the FAQ approach will, on average, result in a higher level of performance than the use of the corresponding Object-Z specification.

• H2: the FAQ specification will require, on average, less time to comprehend than the corresponding Object-Z specification.

In each case, the null hypothesis is that there is no difference (in level of performance or time to comprehend the specification) between the two approaches.

5.1.2. Population

The participants in the experiment were 24 students in a fourth year computer science course on formal specification and testing at the University of Queensland. The formal specification part of the course taught the students Object-Z, and the experiment was conducted at the start of the lecture in which the FAQ approach was introduced. A brief questionnaire was included as part of the experiment to manage any potential variability between the two groups of participants. For example, participants were asked to rate their own confidence with Object-Z, and 92% of the participants rated themselves as having 'some confidence' or being 'confident' at reading Object-Z specifications. Only one student claimed any previous exposure to the FAQ approach. English was the first language for 67% of the participants.

5.1.3. Target module

A Dependency Management System (DMS) is a standard tool for managing dependencies of items. For example, it can be used to track dependencies between theorems in a theorem-proving tool to avoid circular dependencies. The version of the DMS used in the experiment is a simplified version of the DMS presented in [11]. Only one participant had seen a version of this case study previously.

The DMS manages a set of nodes and a set of dependencies between these nodes, and contains a number of operations that query the state of the DMS. The DMS specification was selected because it


DMS[X]

nodes : F X
ddo : X ↔ X
≼ : X ↔ X

ddo ⊆ nodes × nodes
≼ = ddo⁺
∀ x : X • ¬ (x ≼ x)

Operation Dependents returns all of the dependents of the specified node.

Dependents
x? : X
ns! : F X

x? ∉ nodes → NodeNotPresentException

x? ∈ nodes
ns! = {n : nodes | n ≼ x?}

Figure 9. Object-Z specification for Dependents.

is moderately complex, but the number of operations was reduced from fifteen to nine to reduce the time required for the experiment. An extended version of the Object-Z notation was used to support the specification of exceptions (following [12]).

To give an impression of the Object-Z specification and the FAQ documentation used in the experiment, Figures 9 and 10 show parts of these specifications for the Dependents operation. The state of the Object-Z specification consists of nodes, a finite set of nodes; ddo, a relation of direct dependencies; and the 'secondary state variable' ≼, which maintains the transitive closure of the dependency relation. The fact that ≼ is a secondary variable means that its value can be derived from the values of the primary state variables (as specified by the state invariant) and that updates of this variable do not need to be specified elsewhere. The state invariant, shown below the line, specifies that all elements that form part of ddo must be in nodes × nodes, that ≼ is the transitive closure of ddo, and that there are no circular dependencies.

The Dependents operation shows the extension to Object-Z for exceptions, which consists of an exception part between the variable declarations at the top and the operation predicate at the bottom. The operation has one input parameter, x?, representing a node for which the set of dependents must be returned in the output parameter ns!. The exception part specifies that the operation should signal


// **** Dependents ************************************************************
// ** Method Summary: public String Dependents(char n)
// ** Output: returns, as a string, all the nodes that directly or transitively
// **         depend on node n
// ** Exceptions: throws NodeNotPresentException if node n is not in the system.
// ****************************************************************************

// **** How do we find out the dependents of a node?

// case 1: node does not exist in graph
dms = new DMS();
#excMonitor dms.AddNode('a'); dms.AddNode('b');
            dms.AddDependence('a','b'); #end

#excMonitor dms.Dependents('c'); # NodeNotPresentException #end

// case 2: node exists in graph
dms = new DMS();
#excMonitor dms.AddNode('a'); dms.AddNode('b'); dms.AddNode('c');
            dms.AddDependence('a','b'); dms.AddDependence('b','c'); #end

#valueCheck dms.Dependents('c') # new char[] {'a','b'} # StringCompare #end
#valueCheck dms.Dependents('a') # new char[] {} # StringCompare #end

Figure 10. FAQ specification for Dependents.

the exception NodeNotPresentException if x? is not a current node. The predicate part specifies the normal behaviour of this operation, which is to return the current nodes n that depend on x? (directly or transitively).
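The Dependents semantics amount to a reachability search over the dependency relation: collect every node from which x can be reached through direct dependencies. A sketch in plain Java (the DMS interface itself is not shown in full in the paper, so the representation here is an assumption):

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class DependentsSketch {
    // dependsOn.get(a) contains b when a depends directly on b.
    static Set<Character> dependents(Map<Character, Set<Character>> dependsOn,
                                     char x) {
        Set<Character> result = new TreeSet<>();
        Deque<Character> work = new ArrayDeque<>();
        work.push(x);
        while (!work.isEmpty()) {
            char n = work.pop();
            for (Map.Entry<Character, Set<Character>> e : dependsOn.entrySet()) {
                // e.getKey() depends on n, so it transitively depends on x.
                if (e.getValue().contains(n) && result.add(e.getKey())) {
                    work.push(e.getKey());
                }
            }
        }
        return result;
    }

    // The dependency graph of Figure 10's case 2: a -> b -> c.
    static Set<Character> demo(char x) {
        Map<Character, Set<Character>> d = new HashMap<>();
        d.put('a', new HashSet<>(Arrays.asList('b')));
        d.put('b', new HashSet<>(Arrays.asList('c')));
        return dependents(d, x);
    }

    public static void main(String[] args) {
        System.out.println(demo('c'));  // both a and b depend on c
        System.out.println(demo('a'));  // nothing depends on a
    }
}
```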

Figure 10 shows the FAQ documentation for Dependents. For the first test case, the DMS is loaded with two nodes and one dependency; the test case checks that Dependents throws NodeNotPresentException when the parameter is not a current node. For the last two test cases, the DMS is loaded with three nodes and two dependencies; the test cases check that nodes with two and with zero dependents return the correct dependent nodes. Since Dependents returns a string of nodes, the last two test case templates use the StringCompare valueType to compare this string with an array of characters. The comparison is successful if the string contains the same characters the same number of times, though not necessarily in the same order, as the array.
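A comparison with those properties (same characters, same counts, order ignored) treats both sides as multisets; sorting both makes the order irrelevant. A minimal sketch of such a routine, with an assumed name (the paper does not show StringCompare's implementation):

```java
import java.util.Arrays;

public class StringCompareSketch {
    // True iff the string and the array contain the same characters
    // the same number of times, in any order.
    static boolean sameCharacters(String actual, char[] expected) {
        char[] a = actual.toCharArray();
        char[] b = expected.clone();
        Arrays.sort(a);
        Arrays.sort(b);          // sorting makes order irrelevant
        return Arrays.equals(a, b);
    }

    public static void main(String[] args) {
        System.out.println(sameCharacters("ba", new char[] {'a', 'b'})); // true
        System.out.println(sameCharacters("aa", new char[] {'a'}));      // false
    }
}
```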

5.1.4. Procedure

At the start of the lecture, before the experiment, the students were presented with a 20-minute introduction to the FAQ approach and to the method for specifying exceptions in Object-Z. The students were then randomly split into two groups of 12 and each group was given a handout containing a consent form, an instruction sheet, one specification (either Object-Z or JUnitDoc), and a questionnaire. In addition to the questions on the background of the students, the questionnaire


DMS dms = new DMS();

dms.AddNode('A'); dms.AddNode('B'); dms.AddNode('C');
dms.AddNode('D'); dms.AddNode('E'); dms.AddNode('F');
dms.AddNode('G'); dms.AddNode('I');

dms.AddDependence('A','B'); dms.AddDependence('A','C');
dms.AddDependence('F','B'); dms.AddDependence('C','F');
dms.AddDependence('F','I');

Question 1
The following call is made to the Dependency Management System above:

dms.Dependents('F');

What will be returned by executing this call?
1. nodes 'C' and 'A'
2. nodes 'I' and 'B'
3. node 'A'
4. none of the above

Question 8
List all the nodes n for which the call AddDependence('I',n) will raise a DependencyException.

Figure 11. Part of experiment questionnaire.

contained five multiple-choice questions and four more open-ended questions about the DMS system. The questions were the same for both groups of participants. All questions except one were related to one instance of the DMS, which was defined in the questionnaire as a sequence of calls, as shown in Figure 11. The figure also shows one multiple-choice question and one open-ended question from the questionnaire.

Timing of the experiment commenced after the consent forms had been collected and the instruction sheets had been read. Participants were instructed to signal the experimenter on completing the questionnaire so that the elapsed time could be recorded. The answers to the questions were marked on a pass/fail basis.

5.2. Results

On average, the Object-Z group answered 5.17 out of nine questions correctly with a standard deviation of 2.41, while the FAQ group answered 6.92 questions correctly with a standard deviation of 2.50. The statistical significance of this result was evaluated using the Mann–Whitney test [10] and was not found to be statistically significant at the 0.05 level. Thus, the null hypothesis that there is no difference in the performance between the two approaches cannot be rejected based on these data.

The Object-Z group outperformed the FAQ group on two questions, but the difference was not statistically significant in either case. The FAQ group outperformed the Object-Z group on the other


seven questions, and the results were statistically significant for three of these questions, including questions 1 and 8 shown in Figure 11. The most notable of these is question 8, which none of the Object-Z group answered correctly, whereas five out of twelve from the FAQ group answered it correctly.

Regarding the time to complete the questionnaire, the Object-Z group took 26.3 min on average with a standard deviation of 8.5, while the FAQ group took 25.8 min on average with a standard deviation of 5.4. In this case, the difference in time is not statistically significant at the 0.05 level, but it is interesting to note that there is much less variation in the time taken by the FAQ group.

It was also evaluated whether there was a positive relationship between the accuracy of the results and the time taken, for both groups, but no statistically significant relationship was found.

5.3. Validity of results

The following threats to the internal validity of the results were considered: maturation, instrumentation and assignment. The threat of maturation concerns a change in attitude of the participants; it was minimized by keeping the specification size and the number of questions relatively small. The threat of instrumentation concerns the quality of the experimental materials, which were analysed and reviewed by several people before the experiment. The threat of assignment was minimized by assigning the participants randomly to the two groups.

The threats to the external validity of the results that were considered are the generalizability of the setting and of the results. The results were obtained using a non-trivial software component in a university environment with undergraduate and masters students. Further experimentation would be needed to generalize these results, for example, to industrial software developers. However, since this group of students had quite recently been trained in the use of formal methods, it is likely that the results would be more favourable for the FAQ approach in an industrial setting.

6. RELATED WORK

The use of examples in documentation is an old idea. Today, use cases [13] are probably the best known technique for software documentation based on examples. While use cases are usually informal and not executable, they can be made executable, as research on SCR requirements documents has shown [14]. The JUnitDoc test cases can be thought of as executable API use cases.

Hsia et al. [15] present a systematic, formal method for scenario analysis that supports requirements analysis and change, and acceptance testing. The method is extended to serve as a starting point for a formal model for scenario-based acceptance testing [16,17]. The systematic approach allows a set of complete and consistent scenarios to be derived for acceptance testing. Similarly, Chang et al. [18] and Chen et al. [19] describe a method for generating test scenarios for integration and system testing from formal Object-Z specifications and usage profiles.

Briand and Labiche [20] present a scheme for deriving tests for object-oriented systems from design artifacts such as use cases. Class documentation prepared with JUnitDoc could be used as input in their approach.


Using test cases in documentation involves test case selection, a central topic in testing research [21–23]. The approach presented in this paper is also consistent with proposals for extreme programming [3,4], where API test cases [2] and the use of tests for documentation purposes play an important role. The JUnit testing framework [24] is in common use in the extreme programming community and supports the testing of Java classes. It has been applied in a number of application domains, including Enterprise JavaBeans [25].

In an approach similar to that of JUnitDoc, Deveaux et al. [26] combine embedded textual documentation and semi-formal specification to support self-testable classes in Java (the same approach has also been applied to Eiffel). The main difference between the two approaches is that the tests in the JUnitDoc approach are included mainly for documentation purposes (which means that readability is a prime concern), whereas in the approach of Deveaux et al. the tests are used primarily for verification and validation. Another difference is that the approach of Deveaux et al. is based around semi-formal specifications using design-by-contract [27,28].

Techniques for programming by example have long been studied in the artificial intelligence (AI) research community. For example, Winston [29] examines the importance of 'hit' and 'near miss' examples in machine learning. In this AI work, however, a machine generalizes from examples, while the goal with JUnitDoc is to get humans to generalize from examples.

Engelmann and Carnine [30] provide an extensive treatment of how to select examples and counter-examples to produce a chosen generalization in the mind of the reader. They emphasize efficiency (using as few examples as possible) and accuracy (choosing examples to minimize the probability of misunderstanding). Their work is directly relevant to the work described here because the goals are the same: precise communication to humans of a general rule through a small number of specific examples.

On the topic of testing inheritance hierarchies, Perry and Kaiser [31] show that code inherited from a superclass may need to be retested in the context of a subclass. Harrold et al. [32] extend this work by considering which member functions must be retested, based on how a subclass is derived from its superclass. Smith and Robson [33] use regression analysis to determine which member functions should be tested and then perform the tests guided by how the superclass was tested. Inherited routines that are not affected by the derived class are not retested. Both Fiedler [34] and Cheatham and Mellinger [35] discuss subclass testing, but neither approach tries to reuse the superclass's test suite. Several authors have proposed implementing a test hierarchy that parallels the implementation hierarchy; for example, both Binder [36] and Firesmith [37] propose test patterns that suggest this. Murphy et al. [38] describe the ACE tool, which generates drivers for C++ and Eiffel classes. ACE provides a mechanism for dealing with derived classes but does not embed test cases in code comments or use test cases as examples in documentation.

Although there is considerable argument as to whether formal methods require mathematical sophistication, very few experimental evaluations of the readability of formal specifications have been reported in the literature. Some argue that the mathematics for specification is easy [39], while others argue that this is not quite the case [40]. The only substantial experimental study that the current authors are aware of is that of Finney et al. [41], which evaluated the effects of natural language comments, variable naming, and structuring on the comprehensibility of Z specifications. Kneuper [42] correctly points out that it is not only the ability of the developers to use formal methods that needs to be considered, but also their willingness to do so. The present authors concur and note that, while it is unlikely that formal specifications will be used for API documentation in the next five to ten years,


the JUnitDoc test cases presented in this paper are formal, partial specifications that can be understood by developers without the need for further training.

7. CONCLUSIONS

This paper has described the JUnitDoc tool, which integrates Javadoc and JUnit to support Java API documentation and testing. The tool supports the testing of class hierarchies by building a test hierarchy that parallels the implementation hierarchy. A detailed example showed that JUnitDoc provides benefits for testing analogous to those gained in object-oriented development; JUnitDoc allows the tester to exploit commonality and thus reduce the amount of duplicated test code. A small controlled experiment showed that the JUnitDoc approach is at least as effective as a formal specification, even though the readers had only a very brief introduction to the JUnitDoc approach and notation, compared with more substantial training in the formal notation.

ACKNOWLEDGEMENTS

The authors thank Marissa Miller for carrying out all the preparation and the experiment described in Section 5, Tim Miller and Leon Moonen for their comments on an earlier version of the paper, and the anonymous referees for their thoughtful comments.

REFERENCES

1. McIlroy MD. Mass-produced software components. Software Engineering: Concepts and Techniques (Proceedings of the 1968 NATO Conference on Software Engineering), Buxton JM, Naur P, Randell B (eds.). Petrocelli-Charter: New York, 1976; 88–98.
2. Jeffries RE. Extreme testing. Software Testing & Quality Engineering 1999; March/April: 23–26.
3. Beck K. Embracing change with extreme programming. IEEE Computer 1999; 32(10):70–77.
4. Beck K. Extreme Programming Explained: Embrace Change. Addison-Wesley: Boston, MA, 1999.
5. Hoffman D, Strooper P. Prose + test cases = specifications. Proceedings of the 34th International Conference on Technology of Object-Oriented Languages and Systems (TOOLS'00). IEEE Computer Society Press: Los Alamitos, CA, 2000; 239–250.
6. Hoffman D, Strooper P. API documentation with executable examples. The Journal of Systems and Software 2003; 66(2):143–156.
7. Duke R, Rose G. Formal Object-Oriented Specification Using Object-Z. Palgrave Macmillan: Basingstoke, U.K., 2000.
8. Musser DR, Saini A. STL Tutorial and Reference Guide. Addison-Wesley: Boston, MA, 1996.
9. JUnit. JUnit frequently asked questions. http://junit.sourceforge.net/doc/faq/faq.htm [March 2005].
10. Wohlin C, Runeson P, Host M, Ohlsson MC, Regnell B, Wesslen A. Experimentation in Software Engineering: An Introduction. Kluwer Academic: Norwell, MA, 2000.
11. Carrington D, MacColl I, McDonald J, Murray L, Strooper P. From Object-Z specifications to ClassBench test suites. Software Testing, Verification and Reliability 2000; 10(2):111–137.
12. McDonald J, Strooper PA. Translating Object-Z specifications to passive test oracles. International Conference on Formal Engineering Methods (ICFEM'98). IEEE Computer Society Press: Los Alamitos, CA, 1998; 165–174.
13. Jacobson I. Object-Oriented Software Engineering: A Use Case Driven Approach. Addison-Wesley: Boston, MA, 1992.
14. Miller S. Specifying the mode logic of a flight guidance system in CoRE and SCR. 2nd ACM Workshop on Formal Methods in Software Practice. ACM Press: New York, 1998; 44–53.
15. Hsia P, Samuel J, Gao J, Kung D, Toyoshima Y, Chen C. Formal approach to scenario analysis. IEEE Software 1994; 11(2):33–41.
16. Hsia P, Gao J, Samuel J, Kung D, Toyoshima Y, Chen C. Behavior-based acceptance testing of software systems: A formal scenario approach. Proceedings of International Computer Software and Applications Conference. IEEE Computer Society Press: Los Alamitos, CA, 1994; 293–298.
17. Hsia P, Kung D, Sell C. Software requirements and acceptance testing. Annals of Software Engineering 1997; 3:291–317.
18. Chang KH, Liao S-S, Seidman SB, Chapman R. Testing object-oriented programs: From formal specification to test scenario generation. The Journal of Systems and Software 1998; 42(2):141–151.
19. Chen C-Y, Chang KH, Chapman R. Test scenario and test case generation based on Object-Z formal specification. Proceedings of the 11th International Conference on Software Engineering and Knowledge Engineering (SEKE'99). Knowledge Systems Institute, 1999; 207–211.
20. Briand L, Labiche Y. A UML based approach to system testing. The Journal of Software and Systems Modeling 2002; 1(1):10–42.
21. White LJ, Cohen EI. A domain strategy for computer program testing. IEEE Transactions on Software Engineering 1980; 6(3):247–257.
22. Weyuker EJ, Ostrand TJ. Theories of program testing and the application of revealing subdomains. IEEE Transactions on Software Engineering 1980; 6(3):236–246.
23. Richardson DJ, Clarke LA. Partition analysis: A method combining testing and verification. IEEE Transactions on Software Engineering 1985; 11(12):1477–1490.
24. Fowler M. Building tests. Refactoring: Improving the Design of Existing Code. Addison-Wesley: Boston, MA, 1999; chapter 4.
25. Nygard MT, Karsjens T. Test infect your Enterprise JavaBeans. Java World 2000; May.
26. Deveaux D, Frison P, Jezequel J-M. Increase software trustability with self-testable classes in Java. Proceedings of the 2001 Australian Software Engineering Conference. IEEE Computer Society Press: Los Alamitos, CA, 2001; 3–11.
27. Jezequel J-M, Meyer B. Design by contract: The lessons of Ariane. IEEE Computer 1997; 30(1):129–130.
28. Meyer B. Object-Oriented Software Construction (2nd edn). Prentice-Hall: Englewood Cliffs, NJ, 1997.
29. Winston P. The Psychology of Computer Vision. McGraw-Hill: New York, 1975.
30. Engelmann S, Carnine D. Theory of Instruction: Principles and Applications (2nd edn). ADI Press: Eugene, OR, 1991.
31. Perry DE, Kaiser GE. Adequate testing and object-oriented programming. Journal of Object-Oriented Programming 1990; 2(5):13–19.
32. Harrold MJ, McGregor JD, Fitzpatrick KJ. Incremental testing of object-oriented class structures. Proceedings of the 14th International Conference on Software Engineering. ACM Press: New York, 1992; 68–80.
33. Smith MD, Robson DJ. A framework for testing object-oriented programs. Journal of Object-Oriented Programming 1992; 5(3):45–53.
34. Fiedler SP. Object-oriented unit testing. Hewlett-Packard Journal 1989; 40(2):69–74.
35. Cheatham TJ, Mellinger L. Testing object-oriented software systems. Proceedings of the 1990 ACM 18th Annual Computer Science Conference. ACM Press: New York, 1990; 161–165.
36. Binder RV. Testing Object-Oriented Systems: Models, Patterns, and Tools. Addison-Wesley: Boston, MA, 1999.
37. Firesmith DG. Pattern language for testing object-oriented software. Object Magazine 1996; 5(9):32–38.
38. Murphy G, Townsend P, Wong PS. Experiences with cluster and class testing. Communications of the ACM 1994; 37(9):39–47.
39. Hall A. Seven myths of formal methods. IEEE Software 1990; 7(5):11–19.
40. Finney K. Mathematical notation in formal specification: Too difficult for the masses? IEEE Transactions on Software Engineering 1996; 22(2):158–159.
41. Finney K, Rennolls K, Fedorec A. Measuring the comprehensibility of Z specifications. The Journal of Systems and Software 1998; 42(1):3–15.
42. Kneuper R. Limits of formal methods. Formal Aspects of Computing 1997; 9(4):379–394.