
Tuesday Nov 16, 2004, 12:00 Noon
MSIS/IRIS Research Seminar Fall 2004
Romano – DFD Heuristic Evaluations with GSS
Spears School of Business

Improving design artifact reviews with group support systems and an extension of heuristic evaluation techniques

Nicholas C. Romano, Jr. ([email protected])
Oklahoma State University

Co-Researchers/Authors

Tom L. Roberts ([email protected])
University of Kansas

Paul Benjamin Lowry ([email protected])
Kevin and Deborah Rollins Center for e-Business, Marriott School
Brigham Young University


Talk Outline
• Evaluation Techniques
• Heuristic Evaluation (HE)
• Design Artifact Reviews
• Research Questions
• H-DFD Methodology
• Theory and Hypotheses
• Method and Procedures
• Results
• Discussion


Evaluation Techniques
• Heuristic Evaluation

• Cognitive Walkthroughs

• Formal Inspections

• Pluralistic Walkthroughs

• Structured Walkthroughs

• Published Guidelines


Heuristic Evaluation (HE)
• Usability engineering technique

• Analysts quickly evaluate interface usability based on a set of heuristics (Nielsen & Molich 1990)

• More effective (in time and number of bugs found) than other evaluation approaches (Jeffries et al. 1991, Nielsen & Molich 1990, Shaw 1993)

• Fits group work naturally (Nielsen & Molich 1990)

• Optimal HE team size is 3 to 5 (Nielsen & Landauer 1993)


HE Process

• Pre-HE: Adoption of (usability) heuristics
• Step 1: Each group member evaluates independently
• Step 2: Group members share their results and develop a final set of joint recommendations
• Post-HE: Results are communicated to the design and development team

Note: HE does not focus on finding every possible (usability) conflict; instead, HE focuses on finding the important conflicts that are most likely to affect end users.
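The two HE steps above can be sketched in a few lines. This is a minimal illustration, assuming each evaluator records flaws as (heuristic_id, description) tuples; the function and variable names are our own, not part of the talk.

```python
# Sketch of HE Step 2, assuming flaws are (heuristic_id, description) tuples.
# All names and example values here are hypothetical.

def step2_merge(individual_lists):
    """Step 2: pool individual results into one joint list, dropping duplicates."""
    joint, seen = [], set()
    for flaws in individual_lists:
        for flaw in flaws:
            if flaw not in seen:
                seen.add(flaw)
                joint.append(flaw)
    return joint

# Step 1: two evaluators work independently and happen to overlap on one flaw.
eval_a = [(2, "process P3 creates data"), (4, "flow F1 carries two data sets")]
eval_b = [(2, "process P3 creates data"), (6, "store D2 has no description")]

joint = step2_merge([eval_a, eval_b])  # three distinct flaws remain
```

In the real process, Step 2 is a group discussion rather than a mechanical merge; the sketch only captures the deduplication aspect.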


HE Literature Review
Several research streams have investigated improving HE:
• Design-oriented HE evaluations (Garzotto et al. 1995)

• HE with Cognitive Walkthroughs (Sears 1997)

• HE Web page evaluations (Levi & Conrad 1996)

• Functional-domain expert HE (Muller et al. 1998)

• Creating Website attractiveness heuristics (Sutcliffe 2001)

• Creating collaborative software heuristics (Baker et al. 2001)

• HE-based usability instrument (Agarwal &Venkatesh 2002)

• Improving HE with group support systems (Lowry & Roberts 2003)


Design Artifact Reviews
Given the significant promise of HE in software engineering, especially when combined with GSS, we believe HE can be extended to improve design artifact reviews, such as evaluation of data flow diagrams (DFDs).

This is important because flaws in DFDs and other design artifacts contribute to a significant number of defects over the life of a systems development project.

Artifact reviews are similar to HE for usability in that groups of analysts try to root out problems and improve the reviewed artifacts.


Design Artifact Reviews: Why DFDs?

• DFDs are still more commonly used, and students continue to prefer DFDs over UML use case diagrams (Nelson & Millet 2004)

• DFDs are typically done before data modeling (ERDs)

• Data flow problems are more problematic than other design flaws

• Heuristics can be defined relatively easily compared to ERDs


Research Questions
Overarching question: Can design artifact reviews of DFDs benefit from a heuristic-based methodology that leverages GSS?

Sub-questions:
1. Can useful DFD heuristics be defined?
2. Will DFD heuristics leveraging GSS add value to H-DFD reviews?
3. Can H-DFD with GSS be useful to distributed teams?


H-DFD Methodology
We propose a new methodology to improve the design artifact reviews commonly used to evaluate DFDs.

It is an extension of HE for usability evaluation that focuses on reviewing DFDs to identify design flaws.

We term this methodology Heuristic DFD (H-DFD).


DFD Heuristic Development

[Figure: DFD heuristics distilled from SAD textbooks & reference manuals and forty years of published DFD articles into an H-DFD review sheet]

1: Reduce complexity through decomposition
2: Processes don't consume or create data
3: No process-less data
4: One flow, one set of data
5: Data flows in one direction
6: Useful descriptions for all objects
7: Simplify diagram
8: Stop on singularity (when to stop)

So we believe the answer to our first sub-question ("Can useful DFD heuristics be defined?") is yes, as we have already begun to use these heuristics successfully in classes.
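Some of these heuristics are mechanical enough to check automatically. Below is a minimal sketch for heuristics 2 and 3, assuming a DFD is represented as a set of process names plus (source, target) flows; the representation and function names are our own illustration, not part of the talk.

```python
# Hypothetical machine-checkable versions of heuristics 2 and 3.
# A DFD is assumed to be: a set of process names, and flows as (source, target).

def check_heuristics(processes, flows):
    """Flag processes that only create or only consume data (heuristic 2),
    and flows that touch no process at all (heuristic 3)."""
    flaws = []
    for p in sorted(processes):
        has_input = any(t == p for _, t in flows)
        has_output = any(s == p for s, _ in flows)
        if not has_input:
            flaws.append((2, f"process {p} creates data (no inputs)"))
        if not has_output:
            flaws.append((2, f"process {p} consumes data (no outputs)"))
    for s, t in flows:
        if s not in processes and t not in processes:
            flaws.append((3, f"flow {s}->{t} touches no process"))
    return flaws

procs = {"P1", "P2"}
flows = [("Customer", "P1"), ("P1", "P2"), ("P2", "Orders"), ("Orders", "Report")]
flaws = check_heuristics(procs, flows)  # only the store-to-sink flow is flagged
```

Heuristics such as 1 (decomposition) and 6 (useful descriptions) involve judgment and are exactly why human H-DFD reviews are needed.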


Theory and Hypotheses

To answer the second and third sub-questions, we need to describe the H-DFD process and the target constructs and measures,

and also define conditions for the different levels of the control variables: GSS support and proximity.


[Figure: Generic DFD heuristic evaluation process. DFD heuristics (drawn from SAD textbooks & reference manuals and forty years of published DFD articles) are applied to a DFD design artifact. Step 1: each individual identifies and categorizes flaws, producing individual lists of flaws. Step 2: the team discusses the flaws, removes duplicates, and re-categorizes, producing a final group list of flaws. Step 3: the design team addresses the identified flaws.]


Constructs and Measures for H-DFD

Type of measure | Component | Description
Production | Gross production | The total number of reported flaws, regardless of legitimacy (correct or incorrect) and duplication.
Production | Net production | The total number of flaws reported by a group that are actual DFD flaws (no incorrect flaws included) and are not duplicates.
Production | Gross work | The amount of activity (work) that occurs in step two, based on the number of changes, additions, deletions, and re-categorizations from step one. Gross work is calculated by comparing the combined flaw logs from step one against the total items from step two.
Production | Real work | The amount of work in step two that is correct and useful. This measures net increases in flaw categorization, elimination of incorrect flaws, and elimination of duplicates.
Quality | Incorrect flaws (errors) | The total number of reported DFD flaws that turn out not to be actual DFD flaws, or that are incorrectly assigned to a heuristic category.
Quality | Duplicates | The total number of DFD flaws that are reported more than once.
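These production and quality measures reduce to simple counts once judges have labeled the flaw reports. A minimal sketch of that computation follows; the record fields are assumptions for illustration, not the paper's instrument.

```python
# Sketch of the production/quality measures, assuming each judged report
# carries 'correct' and 'duplicate' flags. Field names are hypothetical.

def production_measures(reports):
    """reports: list of dicts with boolean 'correct' and 'duplicate' keys."""
    gross = len(reports)                                    # gross production
    incorrect = sum(not r["correct"] for r in reports)      # quality: errors
    duplicates = sum(r["duplicate"] for r in reports)       # quality: duplicates
    net = sum(r["correct"] and not r["duplicate"] for r in reports)  # net production
    return {"gross": gross, "net": net,
            "incorrect": incorrect, "duplicates": duplicates}

reports = [
    {"correct": True,  "duplicate": False},   # one actual, unique flaw
    {"correct": True,  "duplicate": True},    # duplicate of the first
    {"correct": False, "duplicate": False},   # not an actual DFD flaw
]
measures = production_measures(reports)
```

Gross and real work for step two would be computed analogously, by diffing the step-one and step-two flaw logs.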


[Figure: Process and constructs for nominal group H-DFD evaluation. Step 1: each individual uses the DFDs and DFD heuristics to identify and categorize flaws, yielding gross flaws (actual, duplicate, and incorrect flaws). Step 2: the team discusses, removes duplicates, and re-categorizes; the changes, additions, deletions, and re-categorizations constitute gross work, of which the correct and useful portion is real work. Step 3: trained judges score the flaw reports.]


We posit that net production is critical in H-DFD.

The goal of DFD production is to model how data flows through the processes of a business as accurately as possible.

The higher the net production, the more accurately the model represents the business, and the more valuable the artifact review results will be to software design and development teams.


[Figure: Conceptual model of why unsupported H-DFD may reduce productivity. In traditional H-DFD, step 1 (individuals identify and categorize flaws in the design artifact) is done as a nominal group and step 2 (the group discusses, removes duplicates, and re-categorizes into a final group list of flaws) as a fully interactive group; trained judges then score the flaw reports. Group process losses increase (production blocking, conformance pressure, evaluation apprehension, reluctance, power distance), duplication is significant, and the work needed to remove duplication increases.]


Process Losses Explained
• Production Blocking: in an interacting group, only one member can speak at a time.

• Conformance Pressure: the feeling that one needs to conform to the group or to higher-status individuals rather than hold one's own position.

• Evaluation Apprehension: the fear that one will be evaluated based on the ideas one states.

• Reluctance: reticence to express ideas in a group setting.

• Power Distance: the perceived difference between higher- and lower-level individuals that affects whether one is willing to state ideas in a group setting, or may cause one to conform.


[Figure: Conceptual model of why H-DFD with GSS may enhance productivity. In GSS H-DFD, step 1 is done as a GSS-supported nominal group and step 2 as a GSS-supported interactive group; trained judges then score the flaw reports. Tacit group awareness through group memory minimizes duplication, and group process gains increase (interactivity, motivation, idea triggering, synergy, positive social facilitation, parallelism, anonymity), reducing the work needed to remove duplicates.]


Group Awareness and Memory Explained

• Group awareness, the ability to know what other team members are working on, forms a tacit kind of communication and coordination that improves interactivity and overall group results.

• Group memory is implemented by storing all of a group’s comments so that each individual can see the comments contributed by all group members.


Process Gains Explained
• Interactivity: the ability of the group to communicate efficiently and effectively.

• Motivation: an increase in participation due to the ability to see others' work and the removal of the possibility of negative feedback on contributions.

• Idea Triggering: when seeing another person's contribution triggers a new idea that would not have occurred without that seed idea.

• Anonymity: enables group members to contribute to group discussions and collaborations without being identified.


Process Gains Explained
• Synergy: the ability of a group to produce more and better ideas than would be produced by the sum of individual group members working alone.

• Positive Social Facilitation: when the work of others positively affects one's own contribution.

• Parallelism: the ability of group members to contribute information simultaneously.


Conditions to Compare Experimentally

Condition | Description | Step one comm. | Step two comm., mode & tools
(A) Control | Nominal groups; no direct communication and no group awareness provided. | FtF | FtF; Excel
(B) Treatment | No direct communication, but group awareness provided. | FtF | FtF; GS
(C) Treatment | No direct communication, but group awareness provided. | CMC | Distributed; GS; NetMeeting


Hypotheses about Production for Step 1 of H-DFD

GSS H-DFD groups should have more net production than nominal non-GSS H-DFD groups:

H1a: GSS groups will report fewer incorrect DFD flaws than nominal non-GSS groups.
H1b: GSS groups will report fewer duplicate DFD flaws than nominal non-GSS groups.
H1c: GSS groups will report more actual DFD flaws than nominal non-GSS groups.
H1d: GSS groups will report fewer gross flaws than nominal non-GSS groups.


Hypotheses about Proximity for Step 1 of H-DFD

Distributed GSS H-DFD groups should have more net production than nominal non-GSS H-DFD groups:

H2a: Distributed GSS groups will report fewer incorrect DFD flaws than nominal non-GSS groups.
H2b: Distributed GSS groups will report fewer duplicate DFD flaws than nominal non-GSS groups.
H2c: Distributed GSS groups will report more actual DFD flaws than nominal non-GSS groups.
H2d: Distributed GSS groups will report fewer gross flaws than nominal non-GSS groups.


Hypotheses about Proximity for Step 2 of H-DFD

Distributed GSS groups will have similar gross and net production to FtF GSS groups:

H3a: Distributed GSS groups will produce similar levels of gross work as FtF GSS groups.
H3b: Distributed GSS groups will properly categorize as many DFD flaws as FtF GSS groups.
H3c: Distributed GSS groups will properly eliminate as many incorrect DFD flaws as FtF GSS groups.
H3d: Distributed GSS groups will properly eliminate as many duplicates as FtF GSS groups.


Hypotheses about Production for Step 2 of H-DFD

Non-GSS groups will produce more gross work than FtF GSS groups:

H4a: FtF non-GSS groups will produce more gross work than FtF GSS groups.

Because GSS groups will have more time to do less work than non-GSS groups, they should be able to produce more value-added work. Therefore, in step two of H-DFD, non-GSS groups will have lower net production than FtF GSS groups:

H4b: FtF non-GSS groups will properly categorize fewer DFD flaws than FtF GSS groups.
H4c: FtF non-GSS groups will properly eliminate fewer incorrect DFD flaws than FtF GSS groups.
H4d: FtF non-GSS groups will properly eliminate fewer duplicates than FtF GSS groups.


Experimental Design (post-measure only)

(A) Control:        A1  OA1-4   A2  OA5-10
(B) Treatment:  XB  A1  OB1-4   A2  OB5-10
(C) Treatment:  XC  A1  OC1-4   A2  OC5-10

Legend
X = Treatment
O = Outcome Measurement
A = Activities


[Figure: Nominal group H-DFD experimental procedures and measures. Potential participants receive an H-DFD heuristics lesson with exercises, condition/tool training, and a task description; participants are randomly assigned to teams, and teams are randomly assigned to a condition. In step 1, each individual uses the H-DFD list and DFD heuristics to identify and categorize flaws in the flawed DFDs (measures O1-O4: gross flaws, actual flaws, duplicate flaws, incorrect flaws). In step 2, the group discusses the flaws, removes duplicates, and categorizes using H-DFD; the changes, additions, deletions, and re-categorizations constitute gross work, split into effective and ineffective work (measures O5-O10). Three trained judges then score the flaw reports.]

We followed a detailed script for the entire process.


[Figure: Lab seating layout (stations A-I, four seats each) and NetMeeting room assignments for the distributed groups 45-48 and their DFDs.]


Summary Results

Hypothesis | Measure | Support | Tukey (p) | μA Control (sd) | μB FtF GSS (sd) | μC Distributed GSS (sd)
1a: Control > FtF GSS | incorrect DFD flaws | Yes | 0.07 | 23.09 (7.5) | 16.67 (8.4) | —
1b: Control > FtF GSS | duplicate DFD flaws | Partial | 0.10 | 6.91 (2.8) | 5.20 (3.2) | —
1c: Control < FtF GSS | real DFD flaws | No | 0.40 | 11.09 (5.4) | 12.47 (3.7) | —
1d: Control > FtF GSS | gross DFD flaws | Yes | 0.00 | 41.64 (5.3) | 34.33 (7.6) | —
2a: Control > Distributed GSS | incorrect DFD flaws | No | 0.07 | 23.09 (7.5) | — | 18.47 (10.7)
2b: Control > Distributed GSS | duplicate DFD flaws | Yes | 0.02 | 6.91 (2.8) | — | 4.73 (4.3)
2c: Control < Distributed GSS | real DFD flaws | No | 0.44 | 11.09 (5.4) | — | 12.40 (4.9)
2d: Control > Distributed GSS | gross DFD flaws | Yes | 0.03 | 41.64 (5.3) | — | 35.60 (9.6)
3a: FtF GSS = Distributed GSS | gross work | Yes | 0.81 | — | 15.13 (12.3) | 16.73 (12.2)
3b1: FtF GSS = Distributed GSS | incorrectly categorized flaws | Partial | 0.10 | — | 3.13 (2.3) | 2.07 (1.8)
3b2: FtF GSS = Distributed GSS | correctly categorized flaws | Yes | 1.00 | — | 10.87 (3.3) | 10.87 (4.3)
3c: FtF GSS = Distributed GSS | incorrect DFD flaws | Yes | 0.34 | — | 13.27 (6.6) | 14.73 (9.7)
3d: FtF GSS = Distributed GSS | duplicate DFD flaws | Yes | 0.99 | — | 1.87 (1.7) | 1.93 (1.8)
4a: Control > FtF GSS | gross work | Yes | 0.00 | 28.09 (12.7) | 15.13 (12.3) | —
4b1: Control < FtF GSS | correctly categorized flaws | No | 0.26 | 10.64 (5.9) | 10.87 (3.3) | —
4b2: Control > FtF GSS | incorrectly categorized flaws | No | 0.97 | 14.9 (6.5) | 13.0 (4.8) | —
4c: Control > FtF GSS | incorrect DFD flaws | Yes | 0.03 | 17.82 (6.2) | 13.27 (6.6) | —
4d: Control > FtF GSS | duplicate DFD flaws | No | 0.83 | 2.18 (3.5) | 1.87 (1.7) | —

Note: α = 0.05, df(2, 39). 10 hypotheses supported, 2 partially supported, 6 not supported.
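The per-hypothesis comparisons above are one-way ANOVAs with df(2, 39). As an illustration of the procedure only, the F statistic can be computed by hand; the group values below are made up for the sketch and are NOT the study's data.

```python
# One-way ANOVA F statistic, computed from scratch. Example values are
# invented gross-flaw counts for three conditions, not the study's data.

def one_way_anova_f(*groups):
    """Compute the one-way ANOVA F statistic for k independent groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares, df = k - 1
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares, df = N - k
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

control  = [41, 44, 39, 42, 43, 40, 41, 45, 42, 40, 41]
ftf_gss  = [34, 36, 31, 35, 33, 37, 32, 34, 36, 33, 35]
dist_gss = [36, 35, 38, 34, 37, 35, 36, 33, 35, 37, 36]

f_stat = one_way_anova_f(control, ftf_gss, dist_gss)  # large F: group means differ
```

In practice one would obtain the p-value from the F distribution and follow up with Tukey's HSD for pairwise comparisons, as the table's Tukey (p) column does.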


Detailed Results for Supported Hypotheses

• Experimental results for H1a: comparing the number of incorrect DFD flaws reported in Step 1 by nominal groups using GSS with non-GSS nominal groups (proximate GSS vs. control).

• Results of one-way ANOVA: p = 0.07, df(2, 39) (α = 0.05); statistically significant.

• GSS: mean 16.67 incorrect flaws (SD 8.4); non-GSS: mean 23.09 incorrect flaws (SD 7.5).


Detailed Results for Supported Hypotheses

• Experimental results for H1d: comparing the number of gross DFD flaws reported in Step 1 by proximate nominal groups using GSS with non-GSS nominal groups (proximate GSS vs. control).

• Results of one-way ANOVA: p = 0.00, df(2, 39) (α = 0.05); statistically significant.

• GSS: mean 34.33 gross flaws (SD 7.6); non-GSS: mean 41.64 gross flaws (SD 5.3).


Detailed Results for Supported Hypotheses

• Experimental results for H2b: reported number of duplicate DFD flaws from Step 1 (distributed GSS vs. control).

• Results of one-way ANOVA: p = 0.02, df(2, 39) (α = 0.05); statistically significant.

• Dist. GSS: mean 4.73 duplicate flaws (SD 4.3); non-GSS: mean 6.91 duplicate flaws (SD 2.8).


Detailed Results for Supported Hypotheses

• Experimental results for H2d: reported number of gross DFD flaws from Step 1 (distributed GSS vs. control).

• Results of one-way ANOVA: p = 0.03, df(2, 39) (α = 0.05); statistically significant.

• Dist. GSS: mean 35.60 gross flaws (SD 9.6); non-GSS: mean 41.64 gross flaws (SD 5.3).


Detailed Results for Supported Hypotheses

• Experimental results for H3a: gross work from Step 2 (proximate GSS vs. distributed GSS).

• Results of one-way ANOVA: p = 0.81, df(2, 39) (α = 0.05); NOT statistically significant.

• Dist. GSS: mean 16.73 gross work (SD 12.2); GSS: mean 15.13 gross work (SD 12.3).


Detailed Results for Supported Hypotheses

• Experimental results for H3b1: incorrectly categorized DFD flaws from Step 2 (proximate GSS vs. distributed GSS).

• Results of one-way ANOVA: p = 0.10, df(2, 39) (α = 0.05); partial support.

• Dist. GSS: mean 2.07 incorrect categorizations (SD 1.8); GSS: mean 3.12 incorrect categorizations (SD 2.3).


Detailed Results for Supported Hypotheses

• Experimental results for H3b2: correctly categorized DFD flaws from Step 2 (proximate GSS vs. distributed GSS).

• Results of one-way ANOVA: p = 1.00, df(2, 39) (α = 0.05); NOT statistically significant.

• Dist. GSS: mean 10.87 correct categorizations (SD 4.3); GSS: mean 10.87 correct categorizations (SD 3.3).


Detailed Results for Supported Hypotheses

• Experimental results for H3c: incorrect DFD flaws from Step 2 (proximate GSS vs. distributed GSS).

• Results of one-way ANOVA: p = 0.34, df(2, 39) (α = 0.05); NOT statistically significant.

• Dist. GSS: mean 14.73 incorrect flaws (SD 9.7); GSS: mean 13.27 incorrect flaws (SD 6.6).


Detailed Results for Supported Hypotheses

• Experimental results for H3d: duplicate DFD flaws from Step 2 (proximate GSS vs. distributed GSS).

• Results of one-way ANOVA: p = 0.99, df(2, 39) (α = 0.05); NOT statistically significant.

• Dist. GSS: mean 1.93 duplicate flaws (SD 1.8); GSS: mean 1.87 duplicate flaws (SD 1.7).


Detailed Results for Supported Hypotheses

• Experimental results for H4a: amount of gross work from Step 2 (control vs. proximate GSS).

• Results of one-way ANOVA: p = 0.00, df(2, 39) (α = 0.05); statistically significant.

• Control: mean 28.09 gross work (SD 12.7); GSS: mean 15.13 gross work (SD 12.3).


Detailed Results for Supported Hypotheses

• Experimental results for H4c: incorrect DFD flaws from Step 2 (control vs. proximate GSS).

• Results of one-way ANOVA: p = 0.03, df(2, 39) (α = 0.05); statistically significant.

• Control: mean 17.82 incorrect flaws (SD 6.2); GSS: mean 13.27 incorrect flaws (SD 6.6).


Discussion

• For Step One:

• Results indicate that nominal non-GSS groups produced more incorrect flaw reports and gross flaws than FtF GSS groups did (H1a and H1d), and that non-GSS groups produced more duplicates and gross flaws than distributed GSS groups (H2b and H2d).


Discussion

• For Step Two:
• Results indicate that gross work and real work were similar between FtF and distributed GSS groups (H3a-H3d), and that non-GSS groups produced more gross work and more incorrect flaw reports than FtF GSS groups (H4a and H4c).

• Strikingly, almost TWICE as much gross work!


Key Findings

1: In step one, nominal non-GSS groups produced more gross flaws, more duplicates, and more incorrect flaw reports than FtF GSS groups…

suggesting that GSS improved group productivity in FtF groups


Key Findings
2: In step one, non-GSS groups produced more gross flaws and duplicates than distributed GSS groups…

suggesting that the GSS improvements were also gained by distributed groups.


Key Findings
3: Distributed GSS groups were just as effective in step two as their FtF GSS counterparts…

suggesting that distributed work with GSS did not introduce more process losses than seen in FtF groups


Key Findings
4: Both FtF GSS and distributed GSS groups had far less gross work in step two compared to the FtF non-GSS groups, which worked frenetically yet still produced more incorrect flaw reports in this step.


Key Findings
Results for duplicate DFD flaws are particularly interesting.

We gave no specific directions about avoiding duplicates to any subject in any treatment until step two, yet FtF GSS and distributed GSS groups intuitively avoided more duplicates than non-GSS groups.

We attribute this outcome to the increased group awareness afforded by GSS. Use of GSS significantly reduced the number of duplicates generated and thus, considerably reduced the tedium of sorting through duplicates during step two.


Key Findings
• Our findings illustrate that the introduction of GSS into H-DFD groups significantly improves outcomes as compared to non-GSS H-DFD.

• What is particularly notable about this finding is that none of the group members in step one were able to communicate directly with each other.

• Despite the lack of direct communication, GSS groups were able to build consensus much earlier in terms of reported DFD flaws and bug categories while simultaneously avoiding the high levels of duplication experienced by the nominal FtF groups.


Additional Contribution
• Another contribution of this research is that it provides a streamlined design methodology to improve the review of DFDs.

• Design artifact reviews are an important technique in improving systems development, but these reviews have traditionally been performed without heuristics and without GSS support.

• This research develops a useful set of understandable DFD heuristics and shows how the use of these heuristics in combination with GSS can improve the results of design artifact reviews of DFDs.


Limitations
• The primary limitation of this research is the use of novice student groups.

• However, one of the strengths of using heuristics is that they lend themselves to use by novices.

• We expect that experts using heuristics would provide even stronger results.

• For future research, these results should be tested using expert groups and larger group sizes.

• We also believe that it would be useful to create similar heuristic methodologies for other design artifact reviews, such as the production of use cases and sequence diagrams from UML.


Thank you for the opportunity to present to the faculty and students of the MSIS Department in the Spears School of Business at OSU.

Nicholas C. Romano, Jr.


References

R. Agarwal and V. Venkatesh, "Assessing a firm's web presence: A heuristic evaluation procedure for the measurement of usability," Information Systems Research (ISR), vol. 13(2), 2002, pp. 168-186.

K. Baker, S. Greenberg, and C. Gutwin, "Heuristic evaluation of groupware based on the mechanics of collaboration," Lecture Notes in Computer Science, vol. 2254, 2001, pp. 123-140.

F. Garzotto, L. Mainetti, and P. Paolini, "Hypermedia design, analysis, and evaluation issues," Communications of the ACM (CACM), vol. 38(8) August, 1995, pp. 74-86.

R. Jeffries, J. R. Miller, C. Wharton, and K. M. Uyeda, "User interface evaluation in the real world: A comparison of four techniques," in S. P. Robertson, G. M. Olson, and J. S. Olson (Eds.), Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Reaching Through Technology, New Orleans, LA, April 27 - May 2, 1991, ACM Press, pp. 119-124.

M. D. Levi and F. G. Conrad, "A heuristic evaluation of a world wide web prototype," Interactions, vol. 3(4), 1996, pp. 50-61.

P. B. Lowry and T. L. Roberts, "Improving the usability evaluation technique, heuristic evaluation, through the use of collaborative software," in D. Galletta and J. Ross (Eds.), Proceedings of the 9th Annual Americas Conference on Information Systems (AMCIS), Tampa, Florida, August 4-5, 2003, Association for Information Systems, pp. 2203-2211.

M. J. Muller, L. Matheson, C. Page, and R. Gallup, "Methods and Tools: Participatory heuristic evaluation," Interactions, vol. 5(5), 1998, pp. 13-18.

R. Nelson and I. Millet, "Data flow diagrams versus use cases: Student reactions," in Nicholas C. Romano, Jr. (Ed.) Proceedings of the 10th Annual Americas Conference on Information Systems (AMCIS), New York, New York, August 6-8, 2004, pp. 2888-2894

J. Nielsen and T. K. Landauer, "A mathematical model of the finding of usability problems," in S. Ashlund, K. Mullet, A. Henderson, E. Hollnagel, and T. White (Eds.), Proceedings of INTERACT '93 and CHI '93, Amsterdam, The Netherlands, April 24-29, 1993, ACM Press, pp. 206-213.

J. Nielsen and R. Molich, "Heuristic evaluation of user interfaces," in Proceedings of Computer Human Interaction (CHI), Seattle, WA, April 1-5, 1990, pp. 249-256.

T. L. Roberts, P. B. Lowry, and N. C. Romano, Jr., "Improving design artifact reviews with group support systems and an extension of heuristic evaluation techniques," in R. H. Sprague Jr. and J. F. Nunamaker Jr. (Eds.), Proceedings of the Thirty-Eighth Hawaii International Conference on System Sciences (HICSS), Waikoloa Village, Kona, Hawaii, USA, IEEE Computer Society Press, 2005, forthcoming.

A. Sears, "Heuristic walkthroughs: Finding the problems without the noise," Journal of Human-Computer Interaction, vol. 9(3), 1997, pp. 213-234.

D. Shaw, "CD-ROM interfaces for information retrieval: Heuristic evaluation and observations of intended users," in M. Williams (Ed.), Proceedings of the 14th National Online Meeting, New York, NY, May 3-5, 1993, Learned Information Inc.

A. Sutcliffe, "Heuristic evaluation of website attractiveness and usability," in Lecture Notes in Computer Science, vol. 2220, C. Johnson, Ed. Berlin: Springer-Verlag, 2001, pp. 183-198.