
ELSEVIER Decision Support Systems 13 (1995) 141-158


An empirical investigation into DSS structures and environments ☆

J. Michael Pearson a,*, J.P. Shim b

a Department of Management, Kansas State University, Kansas, USA b Department of Management and Information Systems, Mississippi State University, Mississippi, USA

Abstract

In order to ensure that a decision support system meets the needs of the user, it is necessary to understand the environment in which the DSS operates and the factors that make up that environment. This research identifies specific DSS structures and their relationship with key environmental factors. Five specific DSS structures were identified and tested against ten environmental factors identified in the literature as impacting DSS structures. All five of the DSS structures were significantly influenced by different combinations of the ten environmental factors tested. A framework for DSS development is also presented.

Keywords: Decision support systems; DSS development; Environmental factors

1. Introduction

The development of decision support systems (DSS) is a complex process that has been investigated by many researchers [3,7,8,20,24,25]. Many methods, including prototyping and the systems life cycle approach, have been suggested for the development of DSS. Each of these methods has strengths and weaknesses that make it appropriate for certain applications. Typically, these methods have not taken the environment or the role the DSS is to perform into consideration adequately [3].

☆ The authors gratefully acknowledge Professor Whinston and the anonymous referees for their helpful and constructive comments on earlier drafts of this paper.

* Corresponding author.

The primary purpose of this research is the identification of specific DSS structures. These structures, based on the capabilities provided by the DSS, provide the foundation from which a design-relevant taxonomy is developed. The taxonomy presented is similar to the one developed by Alter [1]. A second objective of this study is to identify the environment in which each DSS structure exists. This follows the "outside-in" approach suggested by Ariav and Ginzberg [3]. The third objective is to present a framework that can be used for the design of future DSS. The framework is based on the concepts of systems theory and requires the identification of key environmental factors prior to the development of the DSS.

0167-9236/95/$09.50 © 1995 Elsevier Science B.V. All rights reserved SSDI 0167-9236(93)E0042-C

2. Study background

During the early 1970's, decision support systems emerged as a practical approach for applying computers and information to the decision problems faced by management. These early decision support systems were different from earlier computerized systems in that emphasis was on decision making effectiveness rather than operational efficiency. The late 1970's saw the DSS movement emphasize interactive computer-based systems that helped decision makers utilize databases and models to solve semi-structured and unstructured problems. The emphasis was not so much on the decision process, but rather on the support of personal computing and the tools necessary for fast application development. The 1980's brought about computer-based systems employing a variety of new technologies to improve the effectiveness of managerial, organizational, and professional activities. A wide range of user-friendly software was produced under the DSS label. Emphasis shifted towards providing the decision maker with balanced support in the areas of decision making, design, and implementation of DSS [16].

During the past two decades, there has been considerable disagreement as to what specifically constitutes a DSS. This issue has been debated by many notable authors [2,6,12,15,18,23,25]. At this point in time, there seems to be a consensus that a DSS is composed of three interrelated parts: a data management component, a model management component, and a dialogue management component. Each component provides a set of capabilities to the decision maker and improves the effectiveness with which he/she works.

Previous research in DSS has typically focused on a single set of specific issues. For example, several studies have dealt with specific decision situations and the type of services provided by DSS [13,19]; others have examined the components, tools, and technologies needed to provide certain types of decision support [6]; and still another group of researchers emphasized the process of DSS design, implementation, and use [15,20]. Ariav and Ginzberg [3] have suggested that this segmented research path has not provided an adequate explanation of the relationship that exists between the structural aspects of DSS and the services it is expected to provide or the characteristics of a particular DSS environment.

[Fig. 1. Environmental factors that influence the structure of DSS components. The figure maps ten factors onto the data, model, and dialogue management components: (1) task structure, (2) management level supported, (3) decision phase supported, (4) number of problems supported, (5) usage pattern, (6) number of users supported, (7) computer skill of user, (8) expertise in problem area, (9) role of user in decision process, and (10) interaction with other CBIS.]

Ariav and Ginzberg [3] have also suggested that successful DSS development can occur only if the environment in which the system must function and the role the DSS is to perform are given primary consideration. They suggested that it is only after these two elements have been considered that the specific capabilities of the DSS can be identified. Ariav and Ginzberg [3] indicated that a description of a DSS environment should include only those items that impact system structure. They identified two critical dimensions of a DSS environment: task characteristics and access pattern. Task characteristics include the structure inherent to the task supported, the management level supported, the phase of the decision making process that is supported, and the number of applications supported. Access pattern includes the method of user interaction, the number of individuals supported by the DSS, the expertise of the user in computer usage, the expertise of the user in the problem area supported, the role of the user in the decision making process, and the relationship of the DSS to "neighbouring" information systems [3]. Fig. 1 provides a conceptual diagram of the relationship between the two dimensions (10 factors) identified by Ariav and Ginzberg [3] and the structure of a DSS.

3. Research hypotheses

Two hypotheses are addressed in this study. The first hypothesis implies that unique DSS structures exist and that it is possible to identify these structures. Identification of the DSS structures is based on the capabilities provided by the database, model, and dialogue management components.

HI: Unique DSS structures exist and can be identified.

The second hypothesis suggests that the environment in which a DSS structure exists is unique and can be described by the ten environmental factors identified by Ariav and Ginzberg [3].

H2: Each DSS structure exists within a unique environment.

4. Methodology

The study used a questionnaire (see Appendix A) developed from previous surveys and the DSS literature. The instrument was validated in several stages. The validated questionnaire was mailed to 1613 randomly selected nonacademic members of The Institute of Management Sciences (TIMS). This frame was selected because of the interdisciplinary nature of the group and their interest in computers and the decision making process. Academicians were not included because it was considered unlikely they were currently interacting with a DSS. It is believed that the selected frame provides an accurate representation of the population of interest. Each mailing included a cover letter, a definitions page, and the questionnaire. The definitions page contained words (including the term "DSS") that could possibly be misunderstood by the respondent. The questionnaire was composed of three parts. The first part contained 24 items designed to test the significance of the ten factors identified in the literature as comprising a DSS environment. The second part of the questionnaire contained 20 items that solicited information about the specific capabilities provided by the DSS used by each respondent. Each item in the first two parts of the questionnaire used a 7-point Likert scale ranging from "strongly disagree" to "strongly agree." The third part of the questionnaire contained 13 questions concerning the demographics of the respondent. These questions identified each respondent's experience with DSS, training received, participation in DSS development, and satisfaction with DSS.

5. Data analysis

In order to test the two hypotheses, a four-step process was used. The first step identified the demographics associated with the participants of this study. The second step tested the validity and reliability of the research instrument. The third step used a hierarchical clustering procedure to classify the respondents' DSS into specific DSS structures. The fourth step used multiple regression to determine which of the ten environmental factors identified by Ariav and Ginzberg [3] were significant within the specific DSS structures.
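The regression in the fourth step can be sketched as an ordinary least squares fit within one cluster. This is only an illustration under stated assumptions: the paper does not report its model specification, so the use of a summary capability score as the dependent variable, the simulated data, and all names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data for one DSS structure (cluster): 39 respondents,
# ten environmental-factor scores and a summary capability score.
n, k = 39, 10
factors = rng.normal(size=(n, k))  # standardized factor scores (invented)
capability = factors @ rng.normal(size=k) + rng.normal(scale=0.5, size=n)

# Ordinary least squares: capability ~ intercept + ten factor scores
X = np.column_stack([np.ones(n), factors])
coef, residuals, rank, _ = np.linalg.lstsq(X, capability, rcond=None)

intercept, factor_weights = coef[0], coef[1:]
```

In practice one would inspect the t-statistics of `factor_weights` to decide which environmental factors are significant within each structure, which is the question the fourth step answers.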

5.1. Descriptive statistics of study

Two hundred and seventy-three individuals participated in the study (16.93% response rate). Not all returned questionnaires were usable since many were from individuals who were not current or past users of DSS. The mailing provided 158 usable questionnaires. Non-response bias was examined by comparing results obtained from the survey with known values of the target population. The similarity between TIMS members and the respondents within this study provided evidence that the sampling distribution was homogeneous with the population distribution.

The demographics associated with the respondents are detailed in Table 1. The respondents were typically well-educated males who were currently working in middle and upper management. Approximately three-fourths of the respondents were satisfied with their DSS, while almost all of the respondents indicated they were generally satisfied with the results provided by their DSS.

Table 1
Demographics associated with study

Sample size: 158
Sex: Male 145 (91.7%); Female 13 (8.3%)
Education level: Doctorate 52 (32.9%); Masters 97 (61.4%); Bachelor 8 (5.1%); Other 1 (0.6%)
Primary work area: Marketing 32 (20.3%); MS/OR 41 (25.9%); Public programs 3 (1.9%); Production 5 (3.2%); Administrative 10 (6.3%); MIS/DSS 15 (9.6%); Finance 16 (10.1%); R and D 20 (12.6%); Other 16 (10.1%)
Management position: Upper 29 (18.4%); Middle 71 (44.9%); Supervisory 23 (14.5%); Not applicable 35 (22.2%)
DSS usage: Daily 42 (26.6%); Weekly 64 (40.5%); Monthly 52 (32.9%)
User participation in DSS development: Design only 11 (7.0%); Construction only 1 (0.6%); Implementation only 17 (10.8%); Design/construction 8 (5.1%); Design/implementation 13 (8.3%); Construction/implementation 7 (4.4%); All phases 80 (50.6%); Not involved 21 (13.2%)
DSS performance: Excellent 58 (36.7%); Satisfactory 91 (57.6%); Poor 5 (3.2%); Not applicable 4 (2.5%)
User satisfaction: All the time 13 (8.3%); Most of the time 108 (68.3%); Some of the time 30 (19.0%); Rarely 1 (0.6%); Never 6 (3.8%)


5.2. Tests of reliability and validity

Both parts of the questionnaire were tested for content and construct validity. Several individuals familiar with the DSS concept were provided a draft of the questionnaire and asked their opinion as to the coverage and clarity of the questionnaire. Evaluation of the instrument continued until these individuals and the researchers were satisfied the questionnaire provided a fair representation of factors relevant to the DSS area.

Factor analysis and inter-item correlations were used to determine how well the research instrument measured the dimensions developed in the pre-test. Analysis indicated that related items within the questionnaire were loading in the expected manner.

Cronbach's coefficient alpha was calculated for the ten environmental factors identified by Ariav and Ginzberg [3]. As shown in Table 2, eight of the ten factors proved to be reliable measures of the intended dimensions. The two unreliable factors (EXPERTISE IN FUNCTIONAL AREA and ROLE OF USER IN DECISION MAKING PROCESS) were deleted from further analysis. Cronbach's alpha was then recalculated for the remaining eight environmental factors. This provided a new overall alpha of 0.7161.

Table 2
Coefficient alphas for the dimensions tested

Item                                   Alpha
Task structure                         0.7899
Management level supported             0.7448
Decision phase supported               0.7754
Number of problems supported           0.8565
Usage pattern                          0.7050
Number of users supported              0.7292
Computer skill of user                 0.7530
Expertise in functional area           *
Role of user                           -0.3566
Interaction with other CBIS            0.7236
Reliability coefficient (24 items)     0.7161
Database management component          0.7699
Model management component             0.8458
Dialogue management component          0.7659
Reliability coefficient (20 items)     0.8463

Note: * Expertise in functional area had only one question on the questionnaire.

Table 2 also shows the Cronbach's coefficient alpha for each of the three factors tested in the second part of the questionnaire (database, model, and dialogue management components). The second part of the questionnaire proved to be a reliable measure of the intended dimensions. An overall alpha was also calculated for this part of the questionnaire and was 0.8463.
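For reference, Cronbach's alpha can be computed directly from the item variances and the variance of the summed scale. The sketch below is illustrative only; the response matrix is invented, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's coefficient alpha for an (n_respondents, k_items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses (5 respondents x 3 items)
responses = np.array([
    [7, 6, 7],
    [5, 5, 6],
    [2, 3, 2],
    [4, 4, 5],
    [6, 6, 6],
])
alpha = cronbach_alpha(responses)
```

Values above roughly 0.70, like those reported in Table 2, are conventionally taken as acceptable reliability for this kind of exploratory instrument.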

5.3. DSS structures

A hierarchical clustering procedure was used to group the respondents' DSS according to the capabilities provided by the database, model, and dialogue management components. Ward's minimum variance method was used to create the initial cluster solution. Analysis indicated that a five-cluster solution was appropriate; cluster solutions beyond this point were too small to provide meaningful results. Sample replication was used to test the internal validity of the five-cluster solution. Most (82 of 100) of the replicated items maintained their original clusters. This high level of agreement between replicated items and the original cluster solution provides evidence of the internal validity of the solution obtained using Ward's minimum variance method.
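The clustering step can be sketched with a standard hierarchical clustering routine. Everything below is an assumption for illustration: the data are simulated stand-ins for the 158 respondents' ratings of the 20 capability items, and the paper does not specify the software it used.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical stand-in for the 158 x 20 matrix of 7-point capability ratings
ratings = rng.integers(1, 8, size=(158, 20)).astype(float)

# Ward's minimum variance method on Euclidean distances
tree = linkage(ratings, method="ward")

# Cut the dendrogram into five clusters, as in the paper's solution
clusters = fcluster(tree, t=5, criterion="maxclust")
```

Each respondent's DSS then carries a cluster label from 1 to 5, and the cluster profiles (mean capability ratings per cluster) correspond to the five DSS structures described below.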

Multiple discriminant analysis was used to check the external validity of the cluster solution. The results indicated that the clusters differed significantly on variables such as identification of opportunities, whether the respondent was a direct user of DSS, whether the DSS supported multiple users, the respondent's skill using DSS, interaction with other computer systems, the number of problems supported, frequency of DSS usage, satisfaction with DSS performance, and the training received on the DSS. Differences between clusters on variables other than those used for cluster formation provide strong evidence of the external validity of the cluster solution.
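The external validity check can be approximated as follows. This is a hedged reconstruction, not the authors' procedure: the data are simulated, and the cross-validated linear discriminant classifier shown here is one common way to operationalize multiple discriminant analysis.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Hypothetical stand-ins: cluster labels from the five-cluster solution, and
# external variables (usage frequency, satisfaction, training, ...) that were
# NOT used to form the clusters. Separation is injected so clusters differ.
labels = rng.integers(1, 6, size=158)
external = rng.normal(size=(158, 9)) + labels[:, None] * 0.5

# If the clusters differ on variables they were not built from, a discriminant
# model should classify cluster membership better than chance (1/5 = 20%).
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, external, labels, cv=5).mean()
```

Classification accuracy well above the 20% chance rate on held-out respondents is the kind of evidence the paper cites for external validity.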

The demographics associated with the specific DSS structures are presented in Table 3. Analysis indicates that the respondents have similar demographic profiles but that they differ somewhat in the frequency with which they used their DSS and also in the satisfaction received from their DSS.

Table 3
Demographics associated with specific DSS structures (columns: DSS #1, DSS #2, DSS #3, DSS #4, DSS #5)

Structure size: 39; 23; 25; 39; 32
Sex:
  Male: 37 (94.9%); 21 (91.3%); 23 (92.0%); 35 (89.7%); 29 (90.6%)
  Female: 2 (5.1%); 2 (8.7%); 2 (8.0%); 4 (10.3%); 3 (9.4%)
Education level:
  Doctorate: 12 (30.8%); 8 (34.8%); 9 (36.0%); 15 (38.5%); 8 (25.0%)
  Masters: 26 (66.7%); 15 (65.2%); 14 (56.0%); 21 (53.8%); 21 (65.6%)
  Bachelor: 1 (2.6%); 0 (0.0%); 2 (8.0%); 2 (5.1%); 3 (9.4%)
  Other: 0 (0.0%); 0 (0.0%); 0 (0.0%); 1 (2.6%); 0 (0.0%)
Primary work area:
  Marketing: 4 (10.3%); 3 (13.0%); 4 (16.0%); 10 (25.6%); 11 (34.4%)
  MS/OR: 14 (35.9%); 4 (17.4%); 7 (28.0%); 11 (28.2%); 5 (15.6%)
  Public programs: 0 (0.0%); 0 (0.0%); 2 (8.0%); 0 (0.0%); 1 (3.1%)
  Production: 1 (2.6%); 1 (4.3%); 1 (4.0%); 2 (5.1%); 0 (0.0%)
  Administrative: 1 (2.6%); 4 (17.4%); 2 (8.0%); 2 (5.1%); 1 (3.1%)
  MIS/DSS: 0 (0.0%); 4 (17.4%); 4 (16.0%); 4 (10.3%); 3 (9.4%)
  Finance: 5 (12.8%); 1 (4.3%); 3 (12.0%); 2 (5.1%); 5 (15.6%)
  R and D: 8 (20.5%); 4 (17.4%); 0 (0.0%); 5 (12.8%); 3 (9.4%)
  Other: 6 (15.4%); 2 (8.7%); 2 (8.0%); 3 (7.7%); 3 (9.4%)
Management position:
  Upper: 7 (17.9%); 3 (13.0%); 4 (16.0%); 6 (15.4%); 9 (28.1%)
  Middle: 18 (46.2%); 12 (52.2%); 12 (48.0%); 17 (43.6%); 12 (37.5%)
  Supervisory: 5 (12.8%); 6 (26.1%); 5 (20.0%); 4 (10.3%); 3 (9.4%)
  Not applicable: 9 (23.1%); 2 (8.7%); 4 (16.0%); 12 (30.8%); 8 (25.0%)
DSS usage:
  Daily: 9 (23.1%); 5 (21.7%); 4 (16.0%); 11 (28.2%); 13 (40.6%)
  Weekly: 15 (38.5%); 11 (47.8%); 11 (44.0%); 13 (33.3%); 14 (43.8%)
  Monthly: 15 (38.5%); 7 (30.4%); 10 (40.0%); 15 (38.5%); 5 (15.6%)
User participation in DSS development:
  Design: 0 (0.0%); 1 (4.3%); 3 (12.0%); 4 (10.3%); 3 (9.4%)
  Construction: 1 (2.6%); 0 (0.0%); 0 (0.0%); 0 (0.0%); 0 (0.0%)
  Implementation: 4 (10.3%); 3 (13.0%); 2 (8.0%); 3 (7.7%); 5 (15.6%)
  Not involved: 9 (23.1%); 2 (8.7%); 3 (12.0%); 4 (10.3%); 3 (9.4%)
  Design/construction: 0 (0.0%); 2 (8.7%); 2 (8.0%); 3 (7.7%); 1 (3.1%)
  Design/implementation: 4 (10.3%); 1 (4.3%); 2 (8.0%); 4 (10.3%); 2 (6.3%)
  Construction/implementation: 0 (0.0%); 3 (13.0%); 2 (8.0%); 2 (5.1%); 0 (0.0%)
  All phases: 21 (53.8%); 11 (47.8%); 11 (44.0%); 19 (48.7%); 18 (56.3%)
DSS performance:
  Excellent: 9 (23.1%); 7 (30.4%); 6 (24.0%); 12 (30.8%); 24 (75.0%)
  Satisfactory: 26 (66.7%); 15 (65.2%); 18 (72.0%); 26 (66.7%); 6 (18.8%)
  Poor: 2 (5.1%); 1 (4.3%); 1 (4.0%); 0 (0.0%); 1 (3.1%)
  Not applicable: 2 (5.1%); 0 (0.0%); 0 (0.0%); 1 (2.6%); 1 (3.1%)
User satisfaction:
  All of the time: 2 (5.1%); 1 (4.3%); 4 (16.0%); 1 (2.6%); 5 (15.6%)
  Most of the time: 26 (66.7%); 16 (69.6%); 14 (56.0%); 29 (74.4%); 23 (71.9%)
  Some of the time: 8 (20.5%); 4 (17.4%); 7 (28.0%); 7 (17.9%); 3 (9.4%)
  Rarely: 1 (2.6%); 0 (0.0%); 0 (0.0%); 0 (0.0%); 0 (0.0%)
  Never: 2 (5.1%); 2 (8.7%); 0 (0.0%); 1 (2.6%); 1 (3.1%)

5.3.1. DSS structure #1

DSS structure #1 can be characterized as a model-based DSS. The majority of respondents within this structure indicated their DSS possessed strong modelling capabilities while the database and dialogue management components provided less developed capabilities. Several respondents (27 of 39) stated that their DSS allowed access to several models. Most of the individuals (76.9%) responded that models within their DSS were controlled by special model management software. Approximately one-half of the respondents indicated that models within their DSS could be integrated into other models. Two-thirds of the respondents felt these models supported strategic, tactical, and operational decisions. Two-thirds of the respondents also indicated that the model management component maintained a directory of models available to help the decision maker. A majority of the respondents (22 of 39) indicated their DSS allowed them to utilize model building tools and subroutines to develop new models.

A majority (67%) of the respondents indicated their DSS interacts with at least one database. Approximately one-half (46.2%) of the respondents felt the database accessed was exclusive to their DSS. The remainder of the respondents indicated the database could be accessed by several other sources within the organization. Most respondents (89.7%) stated that their DSS did not have access to a data dictionary. Many respondents (64.1%) also indicated that access to DSS databases was not controlled by a database management system.

Over 56 percent of the respondents indicated that the user interface was controlled by a dialogue management component. About 97.4% of the respondents indicated the dialogue management component was unable to support multiple dialogue styles. Although many respondents stated their DSS could not support multiple dialogue styles, one-third indicated their dialogue management component was flexible. Only six respondents stated their DSS was capable of tracking dialogue usage. Approximately three-fourths of the respondents (76.9%) indicated their DSS was not able to interact with the database management component and the model management component.

Analysis indicates that DSS structure #1 provides support primarily to mid-level management for semi-structured problems. While respondents within this structure stated their DSS aided them in problem recognition, opportunity recognition, and alternative selection, the DSS was most useful in analyzing alternative solutions. Typically, DSS structure #1 provided support to multiple users within the organization, but generally was considered to be problem specific. The typical respondent used the DSS directly and in an interactive mode. These individuals also considered themselves to be skilled computer and DSS users. Approximately one-half of the DSS accessed other computer systems within the organization, but only a small portion allowed access to computer systems outside the organization.

5.3.2. DSS structure #2

DSS structure #2 exhibits strong database management capabilities, moderate dialogue support, and a weak model management subsystem. All 23 respondents within this DSS structure stated their DSS interacts with a database. Approximately one-half of these respondents felt the database accessed was exclusive to their DSS. A majority of the respondents (60.9%) indicated their DSS could extract data from several different sources. The remainder (39.1%) responded that data could be accessed through one database only. Most respondents (19 of 23) indicated that database functions were controlled by a database management system. A majority of the respondents (56.5%) stated their DSS had access to a data dictionary.

All respondents within this DSS structure indicated their DSS had a dialogue management component that controlled the user interface. Many (17 of 23) responded that the dialogue management component was flexible and able to interact with the database management component and model management component. Despite indications of a well-developed dialogue management component, only 4 of 23 suggested their DSS was able to support multiple dialogue styles and only 21.7 percent thought their DSS was able to track dialogue usage.

The weakest component of DSS structure #2 was the model management component. Approximately one-half of the respondents did not believe their DSS could support multiple models. Many individuals (16 of 23) indicated their DSS did not allow them to use model building blocks and/or subroutines to develop more complex models. Only four respondents indicated their DSS was able to integrate multiple models. Most of the individuals (87%) responded that models within their DSS were not controlled by special model management software. All of the respondents indicated their DSS did not provide a directory of models within their system. The respondents (19 of 23) did indicate, however, that models within their DSS could interact with the data management component and that available models primarily supported tactical and operational decisions.

DSS structure #2 supports both mid- and upper-level management. Respondents indicated this DSS structure best supports semi-structured and unstructured decisions. Respondents also stated their DSS was most useful in the identification of problems and/or opportunities. Typically this DSS structure supported multiple users within the organization, but was generally considered problem specific. The typical respondent used this DSS structure directly and in an interactive mode. Although most of these individuals considered themselves to be skilled computer and DSS users, most (82.6%) of the respondents used staff intermediaries to access their DSS. Approximately three-fourths of the DSS allowed access to computer systems within the organization, while only 13 percent allowed access to computer systems outside of the organization.

5.3.3. DSS structure #3

DSS structure #3 is the weakest of the five DSS structures identified. Responses provided by this group indicated this DSS structure did not provide many of the capabilities commonly associated with DSS. Almost all of the respondents (92%) agreed that their DSS interacts with a database, and approximately one-half (56%) of the respondents indicated the database was exclusive to their DSS. Approximately one-third of respondents (32%) stated their DSS could access data from several different sources. The respondents within this group appeared to be equally split, however, as to whether database activities were controlled by a data management system and as to whether or not the DSS had access to a data dictionary.

All of the respondents indicated their DSS did not support multiple dialogue styles. Many respondents (72%) stated that their DSS did not provide a flexible user interface. Only one-half of the respondents indicated their DSS had any kind of dialogue management subsystem. Respondents were equally split over the ability of their DSS to interact with the model management component and the database management component. Nearly all of the respondents (23 of 25) indicated that their DSS did not track dialogue usage.

The weakest component of DSS structure #3 was the model management component. Many respondents (72%) suggested their DSS was unable to support or maintain multiple models. The respondents also indicated their DSS did not provide the capability to build new models through model building blocks and/or subroutines. All of the respondents stated there was not a model directory available within their DSS, and 92 percent of the respondents indicated it was not possible to integrate multiple models within their DSS. Models that existed within this DSS structure supported tactical (60%) and operational (52%) decisions. All of the respondents indicated their DSS did not have special model management software to control model operations.

DSS structure #3 provides support to lower- and mid-level management. Respondents within this structure were unable to clearly identify whether their DSS supported strategic, tactical, or operational decisions. Respondents did indicate, however, that their DSS was most useful in analyzing decision alternatives. Typically this DSS structure supported multiple users within the respondent's organization, but was considered problem specific. Slightly over one-half of the respondents used the DSS directly (56%) and in an interactive mode (56%). Although most of these individuals considered themselves to be skilled computer and DSS users, almost all (96%) of the respondents used staff intermediaries to access their DSS. Approximately one-half of the DSS accessed other computer systems within the organization, while only 12 percent allowed access to computer systems outside of the organization.

5.3.4. DSS structure #4

DSS structure #4 provides the user with moderate database capabilities, strong model capabilities, and strong dialogue capabilities. Most of the respondents (92.3%) indicated that their DSS interacted with at least one database. The majority of respondents (74.4%) stated that the database was not exclusive to their DSS. Several members of this group agreed that database management functions were handled by a database management system. Many (69.2%) of the same individuals indicated that the database management component provided a data dictionary capability. Numerous respondents (69.2%) also stated that data could be extracted from several sources through the data management component.

The capabilities provided by the model management component are comparable to those of the database component. Respondents (30 of 39) indicated their DSS allowed them to access several models. Individuals within this group suggested that the models supported strategic (89.7%), tactical (92.3%), and operational (74.4%) decisions. Several of the individuals (28 of 39) responded that models within their DSS were controlled by model management software. Many of the respondents (28 of 39) indicated they were able to use model building blocks and/or subroutines to develop more sophisticated models to support their decision making. Only six individuals indicated their DSS could not support the integration of multiple models. Several individuals (22 of 39) responded that the model management component maintained a directory of models available to help the decision maker. Two-thirds of the respondents indicated the model management component interacted with the database management component of their DSS.

Respondents within this structure indicated less satisfaction with the dialogue management component of their DSS. Several individuals (71.8%) stated that the dialogue management component provided a flexible user interface. Although most of the respondents indicated that their user interface was flexible, only 38.5 percent of the respondents indicated that their DSS could support multiple dialogue styles. Many of the respondents (76.9%) indicated that the dialogue management component interacted with the model management component and the database management component. Most of the respondents (34 of 39) within this DSS structure indicated that the user interface was controlled by a dialogue management subsystem. Approximately one-half (22 of 39) of the respondents were unsure whether the dialogue management component tracked dialogue usage.

Analysis indicated that DSS #4 primarily supports mid-level management for semi-structured problems. Respondents indicated their DSS aids them in problem recognition, opportunity recognition, alternative analysis, and alternative selection. The typical DSS within this structure supports multiple users and is generally considered problem specific. Approximately three-fourths of the respondents use the DSS directly and in an interactive mode. Almost all of the respondents indicated they were skilled in the use of their computer and DSS. Only one-third of these respondents indicated they used a staff intermediary to access their DSS. Most of the DSS allow access to other computer systems within the organization, while only approximately one-fourth allow access to computer systems outside of the organization.

5.3.5. DSS structure #5

DSS structure #5 is the most developed of the DSS structures identified. All three components were rated very highly by the respondents within this group. Almost all of the respondents (30 of 32) stated that their DSS interacts with a


Table 4
Summary of DSS environments and capabilities

                                        DSS #1   DSS #2   DSS #3   DSS #4   DSS #5
Task structure:
  Structured                            Limited  No       Limited  Limited  Yes
  Semi-structured                       Yes      Yes      Limited  Yes      Yes
  Unstructured                          No       Yes      No       Limited  No
Management level supported:
  Lower management                      No       No       Yes      Limited  Yes
  Middle management                     Yes      Yes      Yes      Yes      Yes
  Upper management                      No       Yes      Limited  No       No
Decision phase supported:
  Identify problems                     Yes      Yes      Limited  Yes      Yes
  Identify opportunities                Limited  Yes      No       Yes      Yes
  Analyze alternatives                  Yes      Limited  Yes      Yes      Yes
  Choosing alternatives                 Yes      Limited  Limited  Yes      Yes
Number of problems supported:
  General DSS                           No       No       No       No       No
  Problem specific DSS                  Yes      Yes      Yes      Limited  Yes
Usage pattern:
  Direct use of DSS                     Yes      Yes      Limited  Limited  Yes
  Interactive DSS                       Limited  Yes      Limited  Yes      Yes
  Staff intermediary                    No       Yes      Yes      No       No
Number of users supported:
  Supports single user                  No       No       No       No       Yes
  Supports multiple users               Yes      Yes      Yes      Yes      No
Computer skill of user:
  Skilled computer user                 Yes      Yes      Yes      Yes      Yes
  Skilled DSS user                      Yes      Yes      Yes      Yes      Yes
Interaction with other CBIS:
  Other internal computers              Limited  Yes      Limited  Yes      Yes
  Other external computers              No       No       No       No       Limited
Database management:
  Interacts with a database             Limited  Yes      Yes      Yes      Yes
  Database exclusive to DSS             Limited  Limited  Limited  No       No
  Functions handled by DBMS             No       Yes      Limited  Limited  Limited
  Query facility available              No       Yes      Limited  Limited  Yes
  Data dictionary present               No       Limited  Limited  Limited  Yes
  Extraction of data from several
    sources possible                    Limited  Limited  No       Limited  Yes
Model management:
  Several models available              Limited  Limited  No       Yes      Yes
  Support strategic decisions           Limited  No       No       Yes      Yes
  Support tactical decisions            Yes      Yes      Limited  Yes      Yes
  Support operational decisions         Limited  Yes      Limited  Yes      Yes
  Models serve as building blocks       Limited  No       No       Yes      Yes
  DSS has model directory               Limited  No       No       Limited  Limited
  Models can be integrated              Limited  No       No       Yes      Yes
  DSS has model management system       Yes      No       No       Yes      Limited
  Models interact with DSS database     Limited  Yes      No       Limited  Yes
Dialogue management (DMS):
  Dialogue management subsystem exists  Limited  Yes      Limited  Yes      Yes
  Flexible user interface               No       Yes      No       Yes      Yes
  Tracks user dialogue                  No       No       No       Limited  No
  Interacts with other DSS components   No       Yes      Limited  Yes      Yes
  Supports several dialogue styles      No       No       No       No       Limited

database. Many of these individuals (71.9%) indicated that this database was not exclusive to their DSS. The database could be accessed by other individuals within the organization. Most of the respondents (62.5%) suggested their database activities were controlled by a database management system (DBMS). The respondents (25 of 32) also indicated that the DBMS provided a query facility with which the respondents could access data within their database. A number of individuals (84.4%) stated that their DSS had access to a data dictionary. This differs from the respondents within the other DSS structures, as they were unable to identify a data dictionary capability. Many respondents also indicated that the database management component of their DSS allowed them to access data from several sources.

Almost all of the respondents (30 of 32) indicated that the model management component supported multiple models and that integration (29 of 32) of these models was possible. Access to multiple models and the ability to integrate models allowed the respondent to use the DSS in support of strategic (93.7%), tactical (87.5%), and operational (84.4%) decisions. Respondents (27 of 32) within this group indicated they were able to use model building blocks and/or subroutines to develop more complex models. All of the respondents stated that the model management component could interact with the available databases. A majority of the respondents (62.5%) was able to identify a formal model management system within their DSS. The respondents were split, however, as to whether a model directory was available within their DSS.

Most of the respondents (30 of 32) indicated their DSS provided a flexible user interface. All of the respondents within this DSS structure indicated that the user interface was controlled by a dialogue management component. Many respondents (90.6%) also indicated that the dialogue management component interacts with both the database management component and the model management component. Respondents appeared to be equally divided about whether or not their DSS supported multiple dialogue styles. One-half of the respondents (16 of 32) indicated that their DSS did not track dialogue usage; six individuals were not sure; and ten respondents stated that their DSS did track dialogue usage.

DSS #5 provides support to both lower-level and mid-level management. Analysis indicates the DSS supports structured and semi-structured decisions. Respondents stated their DSS was most useful in identifying problems and opportunities. The respondents also indicated their DSS provided support in the analysis and selection of alternative solutions. Typically, DSS structure #5 supported only one user and was generally considered problem specific. The typical respondent indicated that he or she was skilled in the use of computers and their DSS. They also used their DSS directly and in an interactive mode. Less than one-fourth of the respondents used a staff intermediary to access their DSS. Most of the DSS accessed other organizational computer systems, while over one-half allowed access to computer systems outside of the organization. Table 4 summarizes the different environments in which the DSS structures operate and the capabilities provided by each DSS structure.

5.4. Environmental factors and DSS structures

Multiple regression was used to identify which environmental factors were most significant within the five DSS structures identified in Step 3. Factor scores were used to create composite measures for both the dependent variable (DSS structure) and the independent variables (the eight reliable environmental factors).
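The regression setup just described can be sketched as follows. The data here are synthetic stand-ins for the survey's composite factor scores, and the function and variable names are illustrative, not the authors'; the sketch simply shows how per-factor significance statistics of this kind are obtained.

```python
import numpy as np

def ols_tstats(X, y):
    """OLS of y on X (intercept added); returns coefficient estimates
    and their t-statistics."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares fit
    resid = y - Xd @ beta
    dof = n - Xd.shape[1]                          # residual degrees of freedom
    sigma2 = resid @ resid / dof                   # residual variance estimate
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
    return beta, beta / se

# Synthetic stand-in: 158 respondents, 8 composite environmental factor
# scores, and a composite structure score built so that factors 0 and 7
# truly matter (loosely echoing DSS #1, where Interaction with Other CBIS
# and Number of Users Supported stood out).
rng = np.random.default_rng(0)
X = rng.standard_normal((158, 8))
y = 0.6 * X[:, 0] + 0.9 * X[:, 7] + rng.standard_normal(158)

beta, t = ols_tstats(X, y)
print(np.round(t[1:], 2))  # t-statistics for the eight factors
```

Comparing each t-statistic against the t-distribution with the residual degrees of freedom yields the p-values of the kind reported in Table 5.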

Table 5 presents the results of the multiple regression analysis. Each of the five DSS structures identified earlier in this study had a unique combination of environmental factors identified as significant. Analysis indicated that Interaction with Other CBIS (0.0009) was the most significant environmental factor within DSS structure #1, while Number of Users Supported (0.0547) was also moderately significant for this DSS structure. Other environmental factors were not significant for this particular DSS structure. The environment for DSS structure #2 was found to have two significant environmental factors: Management Level Supported (0.0005) and Computer Skill of User (0.0234). DSS structure #3 had the largest number of significant environmental factors: Management Level Supported (0.0030), Task Structure (0.0002), and Interaction with Other CBIS (0.0161). Usage Pattern (0.0591) was also found to be moderately significant within DSS structure #3. DSS structure #4 was found to have one significant environmental factor, Interaction with Other CBIS (0.0161), and one moderately significant environmental factor, Computer Skill of User (0.0515). No other factors were significant for this DSS structure.

DSS structure #5 was found to have two significant environmental factors: Management Level Supported (0.0232) and Number of Users Supported (0.0421). Two environmental factors were not found to be significant for any of the five DSS structures: Decision Phase Supported and Number of Problems Supported. Results of the cluster analysis and multiple regression support the hypotheses that distinct DSS structures exist and that unique combinations of environmental factors exist for the five DSS structures.

6. Findings and implications

This study addressed two hypotheses. The first hypothesis stated that unique DSS structures exist and that it is possible to identify these structures. Analysis of data indicated that five distinct DSS structures could be identified. Multiple discriminant analysis was used to verify that the five DSS structures differed significantly. As indicated earlier, each item used to ascertain the

Table 5
Significant environmental factors within DSS structures (significance level in parentheses)

Environmental factor           DSS #1     DSS #2     DSS #3     DSS #4     DSS #5
Management level supported      1.619      4.133 *    3.356 *    0.393     -2.371 *
                               (0.1142)   (0.0005)   (0.0030)   (0.6970)   (0.0232)
Decision phase supported       -0.132     -0.855      1.062      0.849     -0.065
                               (0.8954)   (0.4301)   (0.3008)   (0.4028)   (0.9428)
Task structure                 -0.321     -0.554      2.618 *    1.006     -1.255
                               (0.7498)   (0.5859)   (0.0002)   (0.3228)   (0.2177)
Usage pattern                   1.070      1.004      2.002      1.505     -0.146
                               (0.2916)   (0.3281)   (0.0591)   (0.1432)   (0.8851)
Number of problems supported    0.063     -0.334     -0.246     -0.726      1.082
                               (0.9504)   (0.7422)   (0.8082)   (0.4736)   (0.2868)
Computer skill of user         -0.259     -2.455 *   -0.486     -2.031     -0.052
                               (0.7969)   (0.0234)   (0.6324)   (0.0515)   (0.9585)
Number of users supported       1.986      1.157      0.490      0.723     -2.107 *
                               (0.0547)   (0.2615)   (0.6297)   (0.4756)   (0.0421)
Interaction with other CBIS     3.600 *    0.168      4.545 *    2.551 *    0.854
                               (0.0009)   (0.8683)   (0.0161)   (0.0161)   (0.3991)
F-statistic                    12.963      6.508     14.356      9.500      5.075
                               (0.0009)   (0.0161)   (0.0001)   (0.0013)   (0.0114)

* Significant factor at the 0.05 level.


structure of the respondents' DSS differed significantly across the five DSS structures. Therefore, this hypothesis is supported.
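The five structures themselves came out of a cluster analysis of the structure items; the paper does not reproduce that procedure, but the general idea can be sketched with a plain k-means grouping of respondents' structure-item scores. The data, the choice of k-means, and all names below are assumptions of this sketch, not the authors' method.

```python
import numpy as np

def kmeans(data, centroids, iters=20):
    """Plain k-means: repeatedly assign each row to its nearest centroid,
    then move each centroid to the mean of its members."""
    centroids = centroids.astype(float).copy()
    for _ in range(iters):
        # distance of every row to every centroid
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic stand-in for respondents' twenty 7-point structure items:
# five well-separated response profiles, 32 respondents drawn around each.
rng = np.random.default_rng(1)
profiles = rng.integers(1, 8, size=(5, 20)).astype(float)
data = np.vstack([p + rng.normal(0, 0.5, size=(32, 20)) for p in profiles])

# Seed one starting centroid inside each block of respondents.
labels, _ = kmeans(data, data[[0, 32, 64, 96, 128]])
print(np.bincount(labels))  # respondents per recovered cluster
```

With clear separation between the profiles, the recovered groups correspond to the generating profiles, which is the sense in which "distinct DSS structures" emerge from the item responses.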

The second hypothesis suggested that the environment in which a DSS exists is unique to that particular DSS structure. Multiple regression was used to test this hypothesis. Analysis indicated that five different combinations of environmental factors were significant for the five DSS structures. Therefore, this hypothesis is also supported.

Identification of these unique environments is an important step in the development of the DSS framework to be presented later in this paper. It is important to remember, however, not to misinterpret the significant environmental factors within a particular DSS environment. These variables represent factors that the respondents consider important for their particular DSS. They do not indicate causality for the existence of a particular structure and/or capability.

Two environmental factors did not significantly influence any of the five DSS structures. These were Decision Phase Supported and Number of Problems Supported. There are a number of possible reasons why these environmental factors were not significant within a specific DSS environment. One possible explanation has to do with the segmented research path cited by Ariav and Ginzberg [3]. Typically, each DSS study has taken one or two environmental factors and carefully examined their impact on DSS. It is possible that when these environmental factors are analyzed in isolation they have a significant impact on DSS structure. When considered in combination with other environmental factors, as in this study, they may not have the significance suggested by previous studies. Another possible explanation would be that the research instrument did not adequately measure these environmental factors. This is not considered likely, however, since statistical analysis indicated the research instrument was both valid and reliable when measuring the intended dimensions. The final possibility is that the DSS used by the respondents did not provide an accurate representation of the population. This explanation is also considered unlikely since special care was taken in the selection of the sampling frame. Comparison of sample demographics and population demographics also suggests that the sample closely corresponds to the intended population.
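The reliability assessment referred to above is based on coefficient alpha [10]. A minimal sketch of that computation, run here on synthetic 7-point responses rather than the study's data, is:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n respondents x k items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Synthetic stand-in: three 7-point items driven by one latent trait,
# so the scale should come out highly reliable.
rng = np.random.default_rng(2)
latent = rng.normal(4.0, 1.5, size=200)
items = np.clip(np.round(latent[:, None] + rng.normal(0.0, 0.7, size=(200, 3))), 1, 7)
print(round(cronbach_alpha(items), 2))
```

Alpha near 1 indicates that the items move together and plausibly measure one dimension; low alpha for a factor would support the "inadequate measurement" explanation considered above.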

7. Conclusion

The use of computer technology to improve decision making has been a topic of considerable interest during the past several decades. Research in this area has typically focused on how computer technology can improve the efficiency with which a manager makes a decision and the effectiveness of that decision. Much of this research has been segmented, as researchers look at one or two specific factors that could influence the composition of decision support technology. This segmented approach cannot provide an understanding of the complex relationships that often exist between a system and its environment.

As stated earlier, this study had three objectives. The first objective was to develop a design-relevant taxonomy. Analysis of data indicated that five specific DSS structures can be identified. Each structure provides the DSS user with a unique set of capabilities. Classifying DSS into this taxonomy can provide direction for future DSS research. The second objective was to identify environmental factors that were significant within each of the DSS structures identified in the first part of the study. Data analysis identified five sets of environmental factors that influenced the five DSS structures identified in this study. The third objective was to develop a framework that can be used for the design of future DSS. Identification of the five specific DSS structures and the environment in which each DSS structure exists provides the basis for this framework. The suggested developmental DSS framework requires that the DSS builder go through six specific steps. These steps are:

1. Identify the environment in which the DSS is to be built. As suggested by Ariav and Ginzberg [3], the DSS builder should identify the specific characteristics of the task(s) to be supported and the method in which the user will access the DSS. For example, the DSS builder should ascertain: Which level of management is being supported? Is the task structured or unstructured? How many users will the DSS support? What is the computer skill of the user(s)? All of these environmental factors must be considered before the DSS builder can begin the design and construction of the DSS.

2. Ascertain the role the DSS is to have in support of the decision-making process. The role or purpose of a DSS has been defined as providing support for the decision-making process. How this is accomplished and the type of support required of the DSS must be identified before the DSS builder can decide what capabilities must be built into the DSS. For example, is the DSS to be used primarily for data retrieval, or will the primary function be to provide decision-making models to the user(s)?

3. Identify the specific capabilities required to support the decision maker within the environment identified in Step 1. DSS design literature typically identifies three major components necessary for a DSS: a dialogue management component, a model management component, and a data management component. Once task characteristics, access method, and the role of the DSS have been determined, the specific DSS capabilities necessary to support the decision maker can be identified. For example, a user-friendly interface may be required because the user(s) has very low levels of computer expertise.

4. Develop a conceptual design of the DSS. Based on the information gathered in Steps 1 through 3, the DSS builder can begin the design of the DSS prototype. The DSS design can be based on the capabilities provided by the DSS structure whose environment most closely matches the environment in which the DSS is to be developed [see Table 4].

5. Based on the conceptual design, determine the resources required to build the DSS. Resources typically involved in a DSS include hardware, software, people, and data. The specific requirements for each of these areas must be determined and their usage secured before the DSS can be built. For example, if the proposed DSS needs access to databases that do not currently exist within the organization, steps must be taken either to create the necessary databases or to access them through an outside organization.

6. Build the DSS and provide for ongoing support. Based on the conceptual design developed in Step 4, the DSS builder can begin to develop a prototype of the DSS. As mentioned earlier, the development of DSS is an evolutionary process that requires the DSS to change as the needs of the decision maker change. The DSS builder must accommodate these changes by providing for the ongoing support of the DSS and the decision maker.

An individual who needs to build a DSS can utilize this framework as a starting point in the development process. By identifying environmental factors critical to the process being supported and comparing these with the results obtained in this study, the DSS builder will be able to determine the capabilities required within each DSS component to meet the needs of the DSS user. It is important to recognize that there may not be a perfect fit with the DSS structures and their environments identified within this study. When this occurs, the DSS builder should identify the DSS environment that most closely fits the environment in which the DSS is to be developed.
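The "closest fit" lookup recommended above can be sketched as a nearest-profile search over Table 4's environment rows. The profiles below excerpt only a few of Table 4's rows, and the Yes/Limited/No coding (2/1/0) and the city-block distance are assumptions of this sketch, not part of the paper.

```python
# Coding of Table 4 answers; the 2/1/0 scale is an assumption of this sketch.
SCORE = {"Yes": 2, "Limited": 1, "No": 0}

PROFILES = {  # environment rows excerpted from Table 4
    "DSS #1": {"structured": "Limited", "upper mgmt": "No",      "lower mgmt": "No",
               "single user": "No",     "staff intermediary": "No",  "internal CBIS": "Limited"},
    "DSS #2": {"structured": "No",      "upper mgmt": "Yes",     "lower mgmt": "No",
               "single user": "No",     "staff intermediary": "Yes", "internal CBIS": "Yes"},
    "DSS #3": {"structured": "Limited", "upper mgmt": "Limited", "lower mgmt": "Yes",
               "single user": "No",     "staff intermediary": "Yes", "internal CBIS": "Limited"},
    "DSS #4": {"structured": "Limited", "upper mgmt": "No",      "lower mgmt": "Limited",
               "single user": "No",     "staff intermediary": "No",  "internal CBIS": "Yes"},
    "DSS #5": {"structured": "Yes",     "upper mgmt": "No",      "lower mgmt": "Yes",
               "single user": "Yes",    "staff intermediary": "No",  "internal CBIS": "Yes"},
}

def closest_structure(environment):
    """Return the DSS structure whose Table 4 environment profile is
    nearest (city-block distance over the coded answers supplied)."""
    def dist(profile):
        return sum(abs(SCORE[profile[k]] - SCORE[v]) for k, v in environment.items())
    return min(PROFILES, key=lambda name: dist(PROFILES[name]))

# A builder supporting a single lower-level user on structured tasks:
print(closest_structure({"structured": "Yes", "lower mgmt": "Yes", "single user": "Yes"}))
# -> DSS #5
```

A builder would list every environmental factor from Step 1, not just the handful shown here, before selecting a starting structure.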

This study attempted to integrate past research in the area of decision support systems into a comprehensive framework that can be used by DSS researchers and DSS builders alike. Future DSS research should attempt to verify the existence of the structures identified within this study. Additional research is also needed to develop improved methods of testing how the environment impacts the development of DSS. Efforts in these directions will provide valuable information for future DSS research. The DSS builder should use the results of this study as a starting point for DSS development. Builders of DSS should identify the environment in which the DSS is to be operated, and then determine which of the five DSS structures is most appropriate.

Appendix A: Survey of factors influencing the structure of DSS

Please consider the following descriptive statements regarding you and DSS:


Strongly    Moderately               Neither agree             Moderately   Strongly
disagree    disagree     Disagree    nor disagree    Agree     agree        agree
   1            2            3             4            5          6            7
  (SD)         (MD)         (D)          (NN)          (A)       (MA)         (SA)

Each statement below is rated on this 1-7 scale.

1. The DSS with which you are most familiar supports highly structured decisions.
2. The DSS with which you are most familiar supports semi-structured decisions.
3. The DSS with which you are most familiar supports unstructured decisions.
4. The DSS with which you are most familiar provides support for upper management.
5. The DSS with which you are most familiar provides support for middle management.
6. The DSS with which you are most familiar provides support for lower management.
7. The DSS with which you are most familiar helps you identify potential problems.
8. The DSS with which you are most familiar helps you identify opportunities.
9. The DSS with which you are most familiar is useful for analyzing alternatives.
10. The DSS with which you are most familiar is useful for choosing among alternatives.
11. You are a direct user of DSS.
12. Your usage of DSS is interactive.
13. You use a staff intermediary to access the DSS with which you are most familiar.
14. You are the only user of the DSS.
15. The DSS with which you are most familiar supports more than one user.
16. You are skilled in the use of computers.
17. You are skilled in the use of DSS.
18. You are skilled in the area in which the DSS provides support.
19. The DSS supports an area in which you are the sole decision-maker.
20. The DSS supports an area in which you provide support to the decision-maker.
21. The DSS with which you are most familiar interacts with other computer systems within your company.
22. The DSS with which you are most familiar interacts with computer systems outside your company.
23. The DSS provides support for many diverse problem areas.
24. The DSS supports only one specific type of problem.

Survey of the structure of DSS

1. The DSS with which you are most familiar interacts with a database.
2. The database is exclusive to that DSS.
3. The storage, retrieval, and control of the database is handled by a database management system (DBMS).
4. A query facility exists that allows other components of the DSS to access data within the database.
5. The DSS with which you are most familiar has a data dictionary.
6. Extraction of data from several sources is possible.
7. The DSS with which you are most familiar has several basic models which you can use.
8. The models in the DSS support strategic decisions.
9. The models in the DSS support tactical decisions.
10. The models in the DSS support operational decisions.
11. The models in the DSS are model building blocks and subroutines.
12. The DSS with which you are most familiar has a model directory.
13. Models within the DSS can be integrated when necessary.
14. The DSS has a model management system.
15. Models within the DSS can extract data from the DSS database.
16. The DSS with which you are most familiar has a user interface subsystem.
17. The user interface subsystem is flexible.
18. The user interface subsystem tracks dialogue usage.
19. The user interface subsystem interacts with the database and the model base.
20. The user interface subsystem supports several different dialogue styles.



Level of DSS usage by respondent

1. I am a current/past user of DSS:
   ___ yes  ___ no
2. I have used a DSS within the past:
   ___ <1 year  ___ 1-<3 years  ___ 3-<5 years  ___ >5 years  ___ not applicable
3. I use the DSS:
   ___ daily  ___ weekly  ___ once a month  ___ never
4. The DSS I use was provided by in-house staff:
   ___ yes  ___ no  ___ not applicable
5. I was active in the following phases of DSS development:
   ___ design  ___ construction  ___ implementation  ___ not involved
6. The results provided by the DSS with which I work are:
   ___ excellent  ___ satisfactory  ___ poor  ___ not applicable
7. I am satisfied with the DSS:
   ___ all the time  ___ most of the time  ___ some of the time  ___ rarely  ___ never  ___ not applicable
8. I make more effective decisions because of the DSS:
   ___ all the time  ___ most of the time  ___ some of the time  ___ never  ___ not applicable
9. The training I received in how to use the DSS was:
   ___ excellent  ___ adequate  ___ poor  ___ not available
10. The primary functional area that I work in is:
    ___ Marketing  ___ Administrative  ___ Accounting  ___ Public Programs  ___ R and D  ___ Finance  ___ MS/OR  ___ MIS/DSS  ___ Production  ___ Other
11. My position within the organization:
    ___ top mgmt.  ___ middle mgmt.  ___ supervisory  ___ not applicable
12. I am:  ___ male  ___ female
13. Highest degree I earned was:
    ___ Doctoral  ___ Masters  ___ Bachelors  ___ Other

Appendix B. Specific DSS structure capabilities

                                                     DSS #1  DSS #2  DSS #3  DSS #4  DSS #5
Respondents per structure (n):                         39      23      25      39      32

Database management
1. Interacts with a database.                        66.6%  100.0%   92.0%   92.3%   93.7%
2. Database exclusive to that DSS.                   46.2%   52.2%   56.0%   25.6%   28.1%
3. Database functions handled by a DBMS.             35.9%   82.6%   60.0%   59.0%   62.5%
4. Query facility allows interaction by other
   DSS components.                                   28.2%   87.0%   44.0%   64.1%   78.1%
5. DSS has a data dictionary.                        10.3%   56.5%   44.0%   69.2%   84.4%
6. Extraction of data from several sources
   possible.                                         46.2%   60.9%   32.0%   69.2%   90.6%

Model management
7. Several models available for your use.            69.2%   47.8%   28.0%   76.9%   93.7%
8. Models support strategic decisions.               64.1%   39.1%   36.0%   89.7%   93.7%
9. Models support tactical decisions.                79.5%   91.3%   60.0%   92.3%   87.5%
10. Models support operational decisions.            69.2%   87.0%   52.0%   74.4%   84.4%
11. Models are building blocks and subroutines.      56.4%   30.4%   12.0%   71.8%   84.4%
12. DSS has a model directory.                       66.6%    0.0%    0.0%   56.4%   43.7%
13. Models within DSS can be integrated.             48.7%   17.4%    8.0%   84.6%   90.6%
14. DSS has a model management system.               76.9%   13.0%    0.0%   71.8%   62.5%
15. Models interact with DSS database(s).            43.6%   82.6%   28.0%   66.7%  100.0%

Dialogue management
16. DSS has dialogue management subsystem.           56.4%  100.0%   48.0%   87.2%  100.0%
17. User interface is flexible.                      33.3%   73.9%   28.0%   71.8%   93.7%
18. Interface tracks dialogue usage.                 15.4%   21.7%    8.0%   43.6%   31.3%
19. Interface interacts with other DSS components.   23.1%   73.9%   48.0%   76.9%   90.6%
20. Interface supports several dialogue styles.       2.6%   17.4%    0.0%   38.5%   50.0%


Appendix C. Specific DSS structure environment characteristics


                               DSS #1  DSS #2  DSS #3  DSS #4  DSS #5
n =                              39      23      25      39      32

Q1.  Structured decisions       66.7%   30.4%   52.0%   48.7%   75.0%
Q2.  Semi-struct. decisions     74.4%   82.6%   60.0%   82.1%   81.2%
Q3.  Unstructured decisions     39.9%   78.2%   40.0%   53.8%   37.5%
Q4.  Upper mgmt. support        30.8%   87.0%   52.0%   35.9%   34.4%
Q5.  Mid-mgmt. support          76.9%   91.3%   80.0%   84.6%   90.6%
Q6.  Lower mgmt. support        35.9%   21.7%   80.0%   64.1%   71.9%
Q7.  Identify problems          76.9%   91.3%   60.0%   76.9%   90.6%
Q8.  Identify opportunities     69.2%   87.0%   32.0%   79.5%   93.7%
Q9.  Analyze alternatives       92.3%   55.2%   76.0%   82.1%   93.7%
Q10. Choosing alternatives      76.9%   65.2%   54.0%   84.6%   90.6%
Q11. Direct user of DSS         74.4%   78.3%   56.0%   69.2%   82.1%
Q12. Interactive DSS            64.1%   87.0%   56.0%   76.9%   81.2%
Q13. Staff intermediary         25.6%   82.6%   96.0%   33.3%   15.6%
Q14. Support single user        20.5%   13.0%   16.0%   10.3%   81.2%
Q15. Support multiple user      79.5%   91.3%   80.0%   82.1%   21.9%
Q16. Skilled computer user      92.3%  100.0%   92.0%  100.0%   96.9%
Q17. Skilled DSS user           87.2%   87.0%   80.0%   92.3%   93.7%
Q21. Other computer systems     48.7%   73.9%   52.0%   71.8%   87.5%
Q22. Outside computer systems   17.9%   13.0%   12.0%   23.1%   59.4%
Q23. General DSS                17.9%    8.7%   12.0%   33.3%   18.7%
Q24. Problem specific DSS       87.2%   95.7%   92.0%   61.5%   90.6%

References

[1] S. Alter, A Taxonomy of Decision Support Systems, Sloan Management Review, 19, 1 (Fall 1977) 39-56.

[2] S. Alter, Decision Support Systems: Current Practices and Continuing Challenges, Massachusetts: Addison-Wesley, 1980.

[3] G. Ariav, and M.J. Ginzberg, DSS Design: A Systemic View of Decision Support, Communications of the ACM, 28, 10 (October 1985), 1045-1052.

[4] I. Benbasat, and Y. Wand, Command Abbreviation Behaviour in Human-Computer Interaction, Communications of the ACM, 27, 4 (May 1984), 376-382.

[5] R.W. Blanning, What is Happening in DSS? Interfaces, 13, 5 (1983), 71-80.

[6] R.H. Bonczek, C.W. Holsapple, and A.B. Whinston, The Evolving Roles of Models in Decision Support Systems, Decision Sciences, 11, 2 (1980), 337-356.

[7] C.H.P. Brookes, A Framework for DSS Development, Information Systems Forum Research Report, Depart- ment of Information Systems, University of New South Wales, Sydney, Australia, 1984.

[8] E.D. Carlson, An Approach for Designing Decision Sup- port Systems, Database, (Winter 1979), 51-63.

[9] C.W. Churchman, The Systems Approach, New York: Dell, 1968.

[10] L.J. Cronbach, Coefficient Alpha and the Internal Structure of Tests, Psychometrika, 16 (September 1951), 297-334.

[11] D.C. Eriksen, A Synopsis of Present Day Practices Concerning Decision Support Systems, Information and Management, 7, 5 (1984), 243-252.

[12] M.J. Ginzberg, and E. Stohr, Decision Support Systems: Issues and Perspectives, in Decision Support Systems: Proceedings, NYU Symposium on DSS, edited by M.J. Ginzberg et al., 9-32, New York, 1982.

[13] G.A. Gorry, and M.S. Scott Morton, A Framework for Management Information Systems, Sloan Management Review, 13, 1 (Fall 1971), 55-70.

[14] M.D. Goslar, G.I. Green, and T.H. Hughes, Decision Support Systems: An Empirical Assessment for Decision Making, Decision Sciences, 17, 1 (1986), 79-91.

[15] P.G.W. Keen, Interactive Computer Systems for Managers: A Modest Proposal, Sloan Management Review, 18, 1 (Fall 1976), 1-17.


[16] P.G.W. Keen, Decision Support Systems: The Next Decade, Decision Support Systems, 3, 3 (1987), 253-265.

[17] H.K. Klein, and R. Hirschheim, Fundamental Issues of DSS: A Consequentialist Perspective, Decision Support Systems, 1 (1985), 5-23.

[18] D.W. Kroeber, and H.J. Watson, Computer-based Information Systems: A Management Approach, 2nd ed., New York: Macmillan Publishing Company, 1986.

[19] J.D.C. Little, Models and Managers: The Concept of a Decision Calculus, Management Science, 16, 8 (April 1970), B466-B485.

[20] J.H. Moore, and M.G. Chang, Design of Decision Support Systems, Database, 12, 1 (Fall 1980), 8-14.

[21] J.C. Nunnally, Jr., Psychometric Theory, New York: McGraw-Hill Book Co., 1978.

[22] W. Remus, and J.E. Kottemann, Semi-Structured Recurring Decisions: An Experimental Study of Decision Making Models and Some Suggestions for DSS, MIS Quarterly, 11, 2 (1987), 233-243.

[23] M.S. Scott Morton, Management Decision Systems: Computer Support for Decision Making, Harvard Uni- versity Press, 1971.

[24] R.H. Sprague, Jr., A Framework for the Development of DSS, MIS Quarterly, 4, 4 (1980), 1-26.

[25] R.H. Sprague, Jr. and E.D. Carlson, Building Effective Decision Support Systems, New Jersey: Prentice Hall, 1982.

[26] C.B. Stabell, Decision Support Systems: Alternative Perspectives and Schools, Decision Support Systems, 3, 3 (1987), 243-251.

[27] E. Turban, Decision Support and Expert Systems: Managerial Perspectives, 2nd ed., New York: Macmillan Publishing Company, 1990.

[28] P.R. Watkins, Perceived Information Structure: Implications for Decision Support System Design, Decision Sciences, 13, 1 (1984), 38-59.


J.P. Shim is a professor of Management Science/Information Systems at Mississippi State University. He has been on the faculties of the University of Wisconsin and Georgia State University (Visiting Professor). Dr. Shim is the co-author of several books, including Micro Management Science and Micro Manager. He has published articles in Computers and Operations Research, Long Range Planning, Interfaces, OMEGA: The International Journal of Management Science, Journal of the Operational Research Society, Social Science Computer Review, Socio-Economic Planning Sciences, Research Advances in Computers and the Social Sciences, OR/MS Today, Collegiate Microcomputer, Ethical Issues in Information Systems, and numerous other professional journals. Dr. Shim has presented numerous papers at national meetings and served as track and session chairman at numerous national and international meetings. He has also served as a referee/editorial review board member for Management Science, MIS Quarterly, Journal of MIS, Decision Sciences, Naval Logistics Research Quarterly, Financial Management, Computers and Operations Research, European Journal of Operational Research, Computers and Education, Journal of Microcomputer Systems Management, and others. His primary teaching and research interests are in the areas of microcomputer applications of management science, Hypertext/Hypermedia, DSS/EIS, and MODM.

J. Michael Pearson is an assistant professor of Information Systems at Kansas State University. Dr. Pearson has served as a consultant to numerous business organizations throughout the United States. He has presented several papers at regional and national meetings, including the 1992 International Conference on Information Systems (ICIS). His primary teaching and research interests are in the areas of organizational planning, DSS/ES applications, and the management of quality.