
OMEGA, The Int. Jl of Mgmt Sci., Vol. 5, No. 5, 1977. Pergamon Press. Printed in Great Britain

The Value of Information Concept Applied to Data Systems¹

DONALD V MATHUSZ
Directorate of Combat Operations Analysis, US Army Combined Arms Combat Developments Activity

(In revised form March 1977)

¹ The contents of this paper represent the views of the author and should not be considered as having official Department of the Army approval, either expressed or implied.

Cost-benefit analysis has a considerable literature in which information systems have been patently ignored. This reflects the considerable difficulties of applying the theory to information systems, and the state-of-the-art remains relatively as Koopmans described it some 19 years ago (1957). A bar to further development would appear to be the lack of an applicable value-of-information concept. This paper seeks to clarify the issues and provide a robust theoretical and data analysis framework that will cover most situations. The approach here is to separate explicitly the dimensions of cost from those of information benefit, and examine the implications. The Null Information Benefit condition emerges as a special theoretical case, but potentially a most important one in applications. This case, together with the Pareto optimum, defines a large class of such problems that can be handled by the decision criteria and data analysis techniques tabulated and discussed here. The selection of input data techniques defines the limits of later project justification and may be crucial to the political viability of the project throughout its life. Finally, the general management vs information systems management relationships are discussed in terms of this situation.

1. INTRODUCTION TO THE STATE-OF-THE-ART

"I f the cost of information processing is to be balanced against its contribution to decisions, a more general method for quantifying the several relevant aspects of the intuitive notion of information is needed."

T. C. Koopmans--1957

THE THEORY of cost-benefit analysis is essentially incomplete in many respects, e.g. where it may involve a social welfare function [33]. The more common cost-benefit analysis of hardware systems is also by no means without its pitfalls [13, 15, 21, 22]. It seems only natural, then, that computers have been turned to for what aid and comfort they can yield, but with very limited success [31]. It can be argued that there is a considerable evaluation potential in the man-computer interaction schemes as demonstrated in MIT's Project MAC that could be applied to the cost-benefit problem, perhaps via on-line simulation [19, 27]. However, it is not the lack of computing power, but the lack of a framework in which to put the computing power, that is the essence of the problem.

In these analyses, the Management Information System (MIS) itself has been seriously neglected. Part of the reason may be the considerable confusion that exists as to how cost-benefit analysis is applicable to the MIS. Opinion ranges from advocating profit and loss (P&L) evaluations to not attempting any cost-benefit analysis at all. At least, this latter view is implied by detailed expositions of the potential of MIS without mention of quantitative benefits or costs [10, 11, 12]. McDonough advocates setting up successive implementation benchmarks versus expected costs, by which cumulative costs can be plotted against progress for management control purposes [30]. This still leaves unanswered the question of whether the project should have been undertaken at all.

Historically, justification of computers and their software as investments has followed two paths. In the first, the computer has been viewed as an overhead or general service to the organization. Its justification then was on the general grounds of added capability. This seems to have come about from the early wartime uses made of the computer by the scientific community. Under these circumstances, the only economic analysis required on a computer support proposal was that of examining the magnitude of the expense. Management then decided whether it should be funded or not (sometimes government contracts helped).

The Profit and Loss (P&L) approach evolved from the computer's substitution for some tangible clerical operation. The first electronic accounting machines were early examples of this evolution which has proceeded to the present large computer systems used for payroll, billing, etc. These computer systems essentially produce the same tangible product as previous clerical systems with, it is hoped, a lower cost. In this instance, the investment justification is some variant of P&L analysis that will estimate the expected return on investment, such as profitability accounting [17, 18].

The MIS cost-benefit problem then reduces to: do we avoid the problem by assuming it away, i.e. by simply calling the MIS an intangible overhead service? Then, if we do this, what do we have left in the way of a comparative selection basis between candidate MIS projects?

This paper seeks to establish a workable cost-benefit approach to the selection of candidate MIS projects, in which the methods and techniques are those that corporate management would ordinarily expect to see in project justifications. Aspects of the subject which might be theoretically useful but esoteric to most management, such as Bergsonian utility, have been omitted [2].


2. PRELIMINARY CONCEPTS

The cost-benefit analysis of MIS is by definition concerned with two streams of values,² i.e. the information stream and the costs stream. Because there are conceptual difficulties with the former, the entire analysis oftentimes falls by default into the costing area [23]. Here, frequently we can devise a computer system with net negative costs, i.e. cost avoidances which are greater than the implementation and operating costs together. This money savings is then considered the benefit in practice. While this is not entirely in error, in the sense that it is a portion of the benefit, it neglects the information-benefit side and implies certain assumptions about the information characteristics involved. How do we evaluate the situation, for instance, where we replace a manual once-a-year sales forecast with twelve computerized forecasts, one issued every successive month, for the next 12 months ahead? It should be obvious that the mechanized system represents a considerable increase in potential corporate capability. If the P&L analysis shows a loss, do we conclude that the project should be rejected on that basis?

² We use the word benefit in a very general sense and reserve the word value to imply that the benefit can be evaluated in a common measure, such as dollars in business or reaction time in the military.

2.1 Partition of information and cost-benefits

To carry the argument to its conclusion, if the new system will not improve the information in some way, e.g. the same reports will be generated with the same timeliness, then the marginal difference in information-benefit between the two systems is zero. We may then justifiably evaluate the system solely on the basis of comparative costs. It follows, then, that whenever we use costs without consideration of information benefits, we implicitly assume the information-benefit to be the same for all the systems and the information-benefit differences between systems to be zero (null).

2.2 The information-benefit concept

The information-benefit notion is: as a business (military) manager, what is the benefit of having particular information? To crystallize the concept, let us assume that there is $1000 buried in a box somewhere. How much benefit does knowing the location amount to? Clearly, this depends upon the recovery cost. If it is somewhere in our back yard, so that the recovery cost is negligible, the information value is $1000. In the same manner, any recovery that costs $800 reduces the information's value to $200, and so on.
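To fix ideas, the arithmetic of the example can be written as a simple net-value relation (a sketch in our own notation, not the paper's; a negative net is taken as zero, since one would simply decline to recover the box):

\[
V_{\text{info}} = \max\,(0,\; B - C_r), \qquad \text{e.g.}\quad V_{\text{info}} = \max\,(0,\; 1000 - 800) = \$200,
\]

where \(B\) is the amount in the box and \(C_r\) the recovery cost.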

The difficulty in applying this benefit notion to real MIS seems to be a compounding of two problems.

a. Each manager (decision maker) has an individual input information profile that corresponds to his observational and thinking habits [1].

b. Even if we accept subjective probability, there remains the problem which Vickrey defines as uncertainty as contrasted to risk. In the case of risk, we can assign probabilities to known outcomes, but in uncertainty not all the outcomes are known [40].

As previously stated, if we restrict the project selection criteria to cash flow, we have assumed the information benefit problem away, by assuming the candidate MIS projects all yield the same value of information. This, in effect, is a Null Information Benefit assumption. It should also be noted that the Null assumption does not necessarily require identical timeliness of information between systems.

It may be that the information is produced periodically, such that its usefulness is unaffected by the precise moment of its production within the period, provided only that it is available when conventionally needed, e.g. end of the month. The same argument holds for the information content, although this may be manager-dependent as noted by Ackoff [1]. Thus, the Null Information Benefit assumption may be valid under a far larger number of business applications than would seem likely in the first instance.

As an example let us consider the arbitrage situation. Here the objective is to buy and sell a given commodity in different markets, e.g. gold in Zurich and Hong Kong, and to profit by any price differences. The firm engaged in arbitrage operations does not necessarily wish to use the commodity itself, but profits by producing a price orderliness between markets (ideally). The value of the information is the potential profit implied in these price differences between markets. In this situation, the net potential profitability is affected by the cost of producing the information (which should be fairly constant), and the expected commodity volumes and prices involved (fairly volatile). Here, the information's value can be estimated in terms of expected dollars. The timeliness of the information affects the commodity volumes that can be arbitraged during the active trading day. When the markets close, however, the information on the final trading has constant value until the market opens again. Thus, in this simplified example, we would not pay more for a system that produced the final trading results seconds after the markets closed, as compared to one that yielded the same information an hour later, since both systems would allow us ample time to plan the next day's opening trades.
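To make the arithmetic concrete, here is a minimal sketch of the arbitrage case (our own construction, with invented figures; the function and its parameters are illustrative, not the paper's):

# Value of knowing both market prices for one commodity: the expected
# gross profit the information makes possible, net of the cost of
# producing the information.
def arbitrage_info_value(price_a, price_b, tradable_volume, info_cost):
    gross = abs(price_a - price_b) * tradable_volume  # buy low, sell high
    return gross - info_cost

# Hypothetical figures: a $2/oz spread on 500 oz, $300 of information cost.
print(arbitrage_info_value(602.0, 600.0, 500, 300.0))  # -> 700.0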

When the Null assumption clearly does not hold, the simplifying power of the Pareto optimum can be applied [2]. If one system has both lower costs and superior information benefits, it becomes the obvious choice. In the same manner, if both systems have equal costs, the one with superior information benefits is the obvious choice. This implies that systems with the same information benefit can be validly evaluated on costs alone. In some instances where neither costs nor benefits fit the Pareto criteria, it still may be possible to convert the information benefit into dollars. This is usually difficult for an MIS, but where it is possible, the evaluation is essentially then a costing process. In those instances where neither of the Pareto criteria is satisfied, nor is converting the information benefit into dollars possible, the final decision must be based on management experience, as in the past.
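The decision logic of this section can be collected into a short sketch (our formalization; the paper prescribes no algorithm). It assumes the benefits are already expressed in a comparable measure, e.g. dollars, or are known equal under the Null assumption:

def compare_systems(cost_a, benefit_a, cost_b, benefit_b):
    """Return 'A', 'B', or 'management decision' for two candidate systems."""
    if cost_a <= cost_b and benefit_a >= benefit_b:
        return "A"                    # A dominates (or ties) B in the Pareto sense
    if cost_b <= cost_a and benefit_b >= benefit_a:
        return "B"                    # B dominates (or ties) A
    return "management decision"      # neither dominates: no analytic answer

print(compare_systems(100, 10, 120, 10))  # equal benefit: cheaper system -> 'A'
print(compare_systems(100, 5, 120, 10))   # a trade-off -> 'management decision'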

2.4 Simulations--a robust approach to secondary effects

In the large firm there may well be expenses or savings in departments other than that of the MIS project's proponent. These secondary effects may diffuse throughout the entire organization in a complex way. A model or simulation may be needed that will represent these interactions and summarize the effects in terms of aggregate cash flows. In those circumstances, simulations by digital computer have been useful, at least since the time of Forrester's basic book [14]. At this point, we would direct the reader's attention to Bonini's work, "Simulation of Information and Decision Systems in the Firm" [5]. Bonini demonstrates that simulations of this type can be done with results that appear to be very reasonable by the criteria most businessmen would use.

However, it is only reasonable to add that simulations require a high level of professional competence as well as a good deal of work. Under these circumstances, preanalysis is desirable to verify the need for a simulation before any commitment is made to this approach [16, 32].
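As a toy illustration only (our own, and far simpler than anything Bonini builds), the sketch below aggregates a primary departmental saving and a lagged, decaying secondary effect elsewhere in the organization into per-period cash flows; all figures are invented:

def simulate(primary, secondary, lag, decay, periods):
    """Per-period aggregate cash flow: a steady primary saving in the
    proponent department, plus a secondary effect that appears in the
    other departments after `lag` periods and decays geometrically."""
    flows = []
    for t in range(periods):
        sec = secondary * decay ** (t - lag) if t >= lag else 0.0
        flows.append(primary + sec)
    return flows

print([round(f, 2) for f in simulate(primary=50.0, secondary=20.0, lag=2, decay=0.7, periods=6)])
# -> [50.0, 50.0, 70.0, 64.0, 59.8, 56.86]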

3. THE COST ANALYSIS

The first cost analysis of a project should serve only to establish the order of the costs involved [41]. We would include in this a feasibility investigation concerned with the most difficult implementation problems of the project and their impact on expected costs [25]. This aspect of the study might involve hardware capabilities and limitations, software availability, operations research to define the system, and data analysis to establish the input data requirements and their availability [26, 34, 37, 38, 39].

3.1 The going concern analysis as comparative statics

The going concern analysis is, by analogy, a kind of double snapshot comparison of the corporate economic situation. The question asked is: if the new system were fully installed and operating, what would the comparative cash flows have been for some prior year, both with and without the system? Conventionally, this comparison period is the most recent prior year for which data are available. In essence, we are contrasting two economic situations, each of which is static in the sense that the corporate (agency) organization is in equilibrium, once with the fully implemented system and once without it. The results of such an analysis might look like one period (year) of Fig. 1.

A natural extension of this approach is to include a number of future periods (period analysis) in the comparison, but based upon projections of future work volumes. Under these circumstances, total costs or savings may be separated into 'hard' savings based on current volumes and cost avoidances based on the projections of future increased volumes. Results of such an analysis might appear as Fig. 1b. It is worth noting that sampling techniques may be used to advantage to make the management information system relatively independent of increasing volume of input data for some applications [8, 24].

[Fig. 1. Estimated work volume, costs and savings, and cash flows: (a) forecasted work volume by year; (b) systems costs, operating costs, cost avoidances and P&L savings, $(000), by year; (c) cumulative cash flows, $(000), by year, with estimated payback point. Horizontal axis: years 76, 77, 78, 79, 80, 81, etc.]
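As a sketch of how such sampling can work (our own illustration, in the spirit of [8, 24], with invented figures), the period total is estimated from a fixed-size random sample of input records, so that processing effort stays roughly constant as input volume grows:

import random

def estimate_total(records, sample_size, rng):
    """Estimate sum(records) from a simple random sample of the records."""
    sample = rng.sample(records, sample_size)
    return (sum(sample) / sample_size) * len(records)  # scale mean to population

rng = random.Random(42)
# Hypothetical data: 100,000 invoice amounts between $10 and $500.
invoices = [rng.uniform(10, 500) for _ in range(100_000)]
print(round(estimate_total(invoices, 500, rng)), "vs true", round(sum(invoices)))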

3.2 System capitalization and cash flow dynamics

Our static comparison gave us the before and after views without having to consider the implementation cost. If the results of this going concern analysis are satisfactory, it still remains to be seen whether the development and implementation costs are justified.

The cash flow analysis is essentially a period analysis showing the combined results of the static and transient costs by period over time. Figures 1a, 1b and 1c show the development from data volumes into period costs and savings, and finally into the cumulative cash flow. In large, expensive systems, the cash flow by period is critical in itself. These cash flows are needed by both corporate planning and the controller for investment feasibility analysis, while the chief executive will be chiefly interested in the payback and profit rates [4].

The question of the proper rate of interest is always raised. The answer most often given is: if the funds are to be borrowed, then the loan rate is used; if internally financed, then the opportunity rate is used [29]. In practice, we suggest that three or four rates be evaluated that include both the above rates. Then, regardless of who challenges the rate ex post, we have figures, or at least approximations to them, immediately available. We can also use these results as a test of the sensitivity of the evaluation to the rate of interest, a feature often of interest to management.
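A sketch of this suggested practice (our own, with invented cash flows): evaluate the discounted value of the project's cash flow stream at several rates bracketing the loan and opportunity rates, together with the simple payback period:

def npv(rate, cash_flows):
    """Net present value of period cash flows; cash_flows[0] is period 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First period at which the running (undiscounted) total turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None  # never pays back within the horizon

# Hypothetical project: $250k implementation, then rising annual savings ($000).
flows = [-250, 40, 80, 110, 130, 140]
for rate in (0.06, 0.08, 0.10, 0.12):  # bracket the loan and opportunity rates
    print(f"rate {rate:.0%}: NPV = {npv(rate, flows):7.1f} (thousand $)")
print("payback in period:", payback_period(flows))  # -> 4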

4. MANAGEMENT VERIFICATION OF CASH FLOW ESTIMATES

The management of the project's cash flow should be based upon cash flow projections for which verifiable data can be obtained, i.e. data which can verify that the cash flow projections actually occur after the MIS project is implemented. This requires that certain actions be taken before and during the project. The advantages of this verifiable approach are twofold.

a. The system's manager is protected from arbitrary ex post judgments of what the system should have done and actually did do.

b. Top management may expect a distinct reasonableness of cash flow benefit claims.

This implies that the cost measurements will be taken with, or before, the systems analysis, and definitely before any system implementation. Some exception to this rule is possible, but only at an ultimately higher verification cost.

4.1 Data techniques for the cost analysis

Whenever we wish to change a system and measure the effects of the change on that system, we measure the system before (pre-state) and after the change (post-state). This concept underlies the theory involved in designed statistical experiments [9]. The techniques that are available to gather this measurement data may, for convenience, be divided into three classes: direct measurements, reconstructed data, and models that synthetically represent the system over time.

The most straightforward of these techniques are those that measure directly each of the static pre- and post-states. Because these direct measurements are the most valid and lowest in cost, it is important to recognize that if this strategy is to be taken, the pre-state measurements must be made before any changes are implemented. As soon as the implementation of changes starts, the system will begin reacting and either be destroyed or badly distorted for measurement purposes.

There is, however, another group of reasonably simple techniques which would allow us to synthesize the system as it existed before the change and make estimates of the pre-state system. These generally cost more than the direct measurements. Also, this group will be found less useful for attempting to measure the system dynamics, since each measurement point on the dynamic path will require a reconstruction of the system at that point. If we really must synthesize this path, simulation would be a more appropriate approach.

Finally, there are techniques of obtaining data that we may need ex ante, e.g. sales forecasts. These are not techniques in the comparative statics sense, but rather what Hicks calls "Pure Positive Theory" [20].

4.2 Requirements of a valid post-state

In order to make any valid post-state measurements, we must have a stable system after implementation. This may be more difficult in practice than would first appear. To have this system, a minimum of the following conditions must be met.

a. Programs fully debugged and procedures firm: we propose that a systems program should not be considered debugged until it has operated successfully with real data for a minimum of three complete processing cycles. (This is an arbitrary but useful rule of thumb.)

b. Files fully cleaned, corrected and controlled: quality control processing may be built into the programs as exception routines, and/or quality control exercised on new inputs.

c. Personnel experienced with the new debugged system over several full cycles of work, which may be in the order of a year or more, e.g. corporate financial reviews are usually done on a quarterly basis.

d. Stable organization: in order for procedures to be firm, the organization itself must be stable, with a minimum of personnel turnover and no reorganizations.

4.3 Reconstructed data techniques

a. MTM analysis (Personnel): MTM, or similar detailed analysis of the old vs new systems, to yield contrasting man-hours per unit workload [27].

b. Standard data (Personnel, Machines): Utilizing standard data on previous workload via Standard Data, MTM, Work Sampling or Position Spaces [27].

c. Vendor catalogues/contracts (Total jobs, Machines, Materials): A search of old vs new catalogue/contract prices to establish contrasting costs.

d. Expert opinion (Anything): While of doubtful validity, this approach may have the advantages of being the quickest and lowest in cost--thus best for some rough or unique political situations.

4.4 Direct measurement: techniques requiring a two-state analysis

For each technique (with the cost elements to which it applies), the pre-state and post-state analyses are as follows.

a. Vendor contract costs (Total jobs, Machines, Materials).
Pre-state: Establish requirements in detail such that contract bids can be sought at any time.
Post-state: Obtain bids on the old and new requirements to contrast the total and per unit work costs.

b. Earned standard hours (Personnel).
Pre-state: Record workload and earned hours [7].
Post-state: Measure unit workload and determine new standard and total and per unit costs.

c. Actual hours, recorded (Personnel, Machines).
Pre-state: Record workload and actual hours [7].
Post-state: Measure new actual hours and workload. Contrast total and per unit costs.

d. Actual hours, unrecorded (Personnel, Machines).
Pre-state: Work sample over a cycle and compute mean actual hours per unit workload and workload volume.
Post-state: Work sample or record actual hours per unit and work volume. Contrast total and per unit costs.

e. Overtime, for situations of uneven high-priority work (Personnel, Computer rent, Machine rents, Others).
Pre-state: Record overtime per workload volume and associated costs.
Post-state: Record overtime per workload volume and associated costs. Contrast the threshold work volume that produces overtime and unit and total costs.

f. Position levels, for situations of significant overhead cost changes (Personnel).
Pre-state: Record position structure as related to system workload and associated costs.
Post-state: Examine position structure as required by the new system. Contrast total and per unit costs.

g. Position spaces, for situations of so-called non-measurable workload (Personnel).
Pre-state: Record number of people (full-time activity) doing this work and the workload. Record associated costs.
Post-state: Examine number of people required by the new system. Contrast total and per unit costs.
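The common thread of these direct-measurement techniques is a contrast of total and per unit costs between the two states. A minimal sketch of that contrast (our own, with hypothetical figures):

def contrast(pre_cost, pre_units, post_cost, post_units):
    """Return (pre per-unit cost, post per-unit cost, saving per unit)."""
    pre_u, post_u = pre_cost / pre_units, post_cost / post_units
    return pre_u, post_u, pre_u - post_u

# Hypothetical: $120k for 30k documents before; $95k for 38k documents after.
pre_u, post_u, saving = contrast(120_000, 30_000, 95_000, 38_000)
print(f"pre ${pre_u:.2f}/unit, post ${post_u:.2f}/unit, saving ${saving:.2f}/unit")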


5. MANAGEMENT INVOLVEMENT IN THE MANAGEMENT INFORMATION SYSTEM AND INFORMATION-BENEFIT

It is one of the standard laments of systems management that senior management does not become sufficiently involved in the creation of the management information system that is being designed to serve them. The problem that must eventually arise under these circumstances involves the credibility both of the information-benefit and cost estimates. This can become critical to the system's survival under the impact of the unusual internal political manoeuvring provoked by the interdepartmental nature of such systems.

While we see no easy solutions, the situation can only be helped by any definitive clarification to top management of the problems involved and by stimulating their involvement in the system's development. Stedry has made the candid observation that a cost control variance report is only as meaningful as the management's insight and action that results from it [36]. We have already noted that input information profiles change with managements. This has led to the concept of growing an information system about management [3]. The relevant point of this is that the capability to act on any given information is inherent in the manager himself. He must analyze his own management style and the input information profile this requires. He must then become involved enough in the system to ensure that his needs are met. It is nearly impossible for a given analyst to know a priori exactly what the system should give a particular manager. The manager's involvement is not only necessary but mandatory if the management information system is to be at all meaningful.

5.1 Management risk and decision (with uncertainty)

Management decision takes over where analytics stop (as was always the case in a well-run organization). The technocrats, those in the Operations Research and Systems areas, must be candid enough to admit where the analysis ends, for any practical expenditure of time and money, and where management decision begins. To develop an honest rapport with higher management, this aspect should be explicitly analyzed and presented for each specific project.

The technocrats must understand that even should their analytics be correct, there is still uncertainty to be taken into account (as defined before). One way to view the uncertainty is to consider that no analysis explicitly includes all causal factors, only the most significant of these. But it is possible that the non-significant factors omitted now may contain one that will not only be significant, but even dominant, in the future. For example, new legal interpretations of old laws may place a restraint on the organization in the future, invalidating the entire post-state estimate. Many other surprises are possible, and it is here that analysis must always be, in a sense, incomplete and management ability becomes of paramount importance. Obviously, good analysis should decrease the surprise content of the future and become an adjunct to a well educated manager's decision criteria.

ACKNOWLEDGEMENT

The author is grateful for the encouragement and helpful suggestions of Dean K. Karger of Rensselaer Polytechnic Institute, who reviewed earlier drafts of this paper.

REFERENCES

1. ACKOFF RL (1962) Scientific Method: Optimizing Applied Research Decisions. Wiley, New York.
2. BAUMOL WJ (1965) Economic Theory and Operations Analysis. Prentice-Hall, New Jersey.
3. BENNETT E (1964) Military Information Systems. Praeger, New Jersey.
4. BEYER R (1963) Meaningful costs for management action. In New Decision-Making Tools for Managers (Eds BURSK EC and CHAPMAN JF). Harvard University Press, Massachusetts.
5. BONINI CP (1962) Simulation of Information and Decision Systems in the Firm. Prentice-Hall, New Jersey.
6. CAGAN C (1973) Data Management Systems. Melville, Los Angeles, California.
7. CARROLL P (1954) Time-Study for Cost Control. McGraw-Hill, New York.
8. CYERT R and DAVIDSON H (1962) Statistical Sampling for Accounting Information. Prentice-Hall, New Jersey.
9. DAVIES OL (1960) The Design and Analysis of Industrial Experiments. Hafner, London.
10. DEARDEN J (1965) How to organize information systems. Harv. Bus. Rev. 43(2), 67-73.
11. DESMONDE WH (1964) Real-Time Data Processing Systems. Prentice-Hall, New Jersey.
12. DUDLEY CL, JR (1965) Management audits of EDP installations. Banking 58(3), 115-118.
13. EDWARDS NP (1964) On the evaluation of the cost-effectiveness of command and control systems. In AFIPS Conference Proceedings. Cleaver-Hume, London.
14. FORRESTER JW (1962) Industrial Dynamics. Wiley, New York.
15. FOX PD and HANEY DG (1961) Some topics relating to military cost-effectiveness analysis. Paper presented at the 29th National Meeting of the Operations Research Society of America, Santa Monica, California.
16. GEISLER MA and STEGER WA (1962) Determining preferred manager techniques in new systems through game-simulation. RM-3066-PR, RAND Corporation, Santa Monica, California.
17. GRAHAM P (1965) Profit probability analysis of research and development expenditures. J. Ind. Engng 16(3), 186-191.
18. GRANT EL (1950) Principles of Engineering Economy. Ronald, New York.
19. GREENBERGER M (1965) The priority problem. MAC-TR-22, Sloan School of Management, Massachusetts Institute of Technology.
20. HICKS JR (1965) Capital and Growth. Oxford University Press, London.
21. KAZANOWSKI AD (1966) Cost effectiveness fallacies and misconceptions revisited. Paper presented at the 29th National Meeting of the Operations Research Society of America, Santa Monica, California.
22. KNORR K (1966) On the cost-effectiveness approach to military R&D. P-3390, RAND Corporation, Santa Monica, California.
23. KOOPMANS TC (1957) The construction of economic knowledge. In Three Essays on the State of Economic Science. McGraw-Hill, New York.
24. MATHUSZ DV (1968) Data reduction through sampling in computer systems. Software Age 2(4), 21-23.
25. MATHUSZ DV (1969) Software modularization. Software Age 3, 19-23 and 39-43.
26. MATHUSZ DV (1972) Simulation input data analysis: some considerations and definitions. ACM SIGSIM Q. 3(3).
27. MAYNARD, STEGEMERTEN and SCHWAB (1948) Methods-Time Measurement. McGraw-Hill, New York.
28. MEADOW CT (1973) The Analysis of Information Systems. Melville, Los Angeles, California.
29. MEADOW CT (1970) Man-Machine Communication. Wiley, New York.
30. McDONOUGH AM (1963) Information Economics and Management Systems. McGraw-Hill, New York.
31. NELSON AC et al. (1967) Evaluation of computer programs for system performance effectiveness. Report N00140 666 0499, Research Triangle Institute, Triangle Park, North Carolina.
32. QUADE ES (1964) Analysis for Military Decisions. R-387-PR, RAND Corporation, Santa Monica, California.
33. PREST AR and TURVEY R (1966) Theories of cost-benefit analysis. In Surveys of Economic Theory, Vol. III. Macmillan, London.
34. ROSEN S (1964) Programming systems and languages, a historical survey. In AFIPS Conference Proceedings, Vol. 25. Cleaver-Hume, London.
35. SOLOMON MB, JR (1966) Economies of scale and the IBM System/360. Communications of the ACM 9(6).
36. STEDRY AC (1960) Budget Control and Cost Behavior. Prentice-Hall, New Jersey.
37. TUKEY JW (1962) The future of data analysis. Ann. Math. Stat. 33(1), 1-67.
38. TUKEY JW (1970) Exploratory Data Analysis. Addison-Wesley, Reading, MA.
39. VAN HORN RL (1961) Systematic Methods for Programming Simplification. P-2447, RAND Corporation, Santa Monica, California.
40. VICKREY WS (1964) Microstatics. Harcourt, Brace & World, New York.
41. WEINBERGER AH (1963) Economic evaluation of R&D projects. J. Chem. Engng 17(5 & 6), 18(1-4).

ADDRESS FOR CORRESPONDENCE: Donald V Mathusz Esq., 12726 Ben Fry Drive, Chester, Virginia 23831, USA.
