
Chapter 13
Trends in Computer Technology

Contents

Introduction
Conclusions
    Hardware
    Software
    Human-Computer Interface
    Communication
Processors
Information Storage
    Fast Memory Storage
    Intermediate Memory Storage
    Mass Memory Storage
    Inexpensive Mass Storage
Software
    Limits
    Data Base Systems
    Languages
    Software Engineering
Input-Output Technology
    Graphics
    Voice Communication
    Image Recognition
Data Communication
    Digital Communication Technology
    Digital Communication as Part of the System
Security Capabilities
    Classifications of Computer Security
    Specific Techniques of Security
    Encryption
    Authorization
    Logging
    Operating Systems
    Data Base Security

LIST OF FIGURES

8. Projections of Logic Cost per Gate
9. Increase in Capability of Semiconductor Chips From 1956 to 1980
10. Drop in Average Computer System Cost per 100,000 Calculations From 1952 to 1980
11. Cost and Access Time of Memory Technologies
12. Projections for Memory Cost per Character

Chapter 13

Trends in Computer Technology

Introduction

Computer technology is advancing rapidly, even explosively. To assess the impacts of information systems, the current state of this technology and where it is heading must be understood. The capability that information technology puts in the hands of the computer user determines the nature of the applications that are developed.

Developmental trends are already shaping the nature of the information-based services that will be provided in the coming decade. New services that make use of the growing integration of telecommunication and information technology are being designed and implemented; and both industry and Government are looking at a wide range of innovative products and services that will be used by the in-home information system of the future.

The nature of the technology as it evolves will also influence the structure of the industry associated with it. Companies that specialized only in components are beginning to market computer systems; companies that sold only general purpose computer systems are beginning to market consumer products that incorporate these systems.

For the purposes of this study, emphasis has been placed on what information systems can do rather than on the fundamental nature of electronic technology. Many interesting developments now in the laboratory, such as Josephson junctions,* may not fundamentally change either the nature of the systems that are designed or the purposes to which they will be put, particularly over the next 10 years. Other developments, such as the microprocessor, are today revolutionizing the ways in which computers are used as well as the thinking about their potential social impacts. Overall, the anticipated changes from laboratory developments over the next several years will be more of degree than of form.

*A Josephson junction is a microscopic-size electronic logic device that operates at the temperature of liquid helium. It is very fast and uses very little power.

Conclusions

Several conclusions can be drawn from OTA’s analysis of the trends in the development of computer technology over the next 10 or 15 years. Spurred by both industrial and Government research and development (R&D) programs, information systems are undergoing a revolution that will be manifested by a proliferation of new products and services affecting all sectors of American society.

Hardware

● Computer electronics are experiencing an extraordinary drop in price, increase in power, and reduction in size.

In 1977, the Privacy Protection Study Commission estimated that the cost of computing would drop by a factor of more than 100 during the 20-year period from 1970 to 1990. This means that the million dollar computer of the 1960’s will cost less than a thousand dollars in the late 1980’s.

Concomitantly, during this same period calculating speed is expected to increase a hundredfold. In 1970, the largest processors performed 10 million operations per second; today they perform 100 million; and by 1990 there will be a processor that will perform 1 billion. In addition to greater speed, new designs can also greatly increase the power of future computer systems.

The large computer that occupied one or several rooms in the late 1960’s will fit in a desk drawer by the late 1990’s, and a medium-size computer will fit in a briefcase or even a coat pocket. These trends do not necessarily mean that in all cases the costs of purchasing, programing, and operating a large computer application system will decrease. Rather, more work will be done for the equivalent number of dollars.

● There will be a great expansion in the number of computers being used in business, education, and the home.

This effect is already being seen. The home computer boom, which was the first big stimulus for the computer retailing stores, has fallen off slightly, only to give way to a new marketing thrust aimed at small businesses. The hand calculator, which has become a ubiquitous tool, is already being supplanted. A small hand-held computer is now available in the consumer market, and electronic calculators are being built into wristwatches. Computers are also being used as part of office automation.

● Computers will be used as components in a wide range of consumer products.

With the advent of an integrated circuit microprocessor that will sell in mass quantities for $1 or less, the use of the computer for controlling everyday devices in the home and at work will become commonplace. Computers are already being used or designed to control such devices as clothes washers, sewing machines, home thermostats, automobile engines, sprinkler systems, typewriters, filing systems, electric lights, and cash registers.

While many applications will involve simply substituting electronic for mechanical control, the increased “intelligence” incorporated in the products will be used to provide such additional features as energy conservation or self-diagnosis of errors, and to give more flexible control to the user.

● New products and services based on computer and telecommunication technology will become available.

In addition to adding computer control to familiar products, the new computer technology will be used to provide a new range of products and services for the home and business. The video game and home computer are just the first of a long line of computer-based information products and services that will appear. (Electronic funds transfer and electronic mail, two examples of information services, are examined in separate OTA reports.)

● There will be a continuing, rapid increase in the power of very large computer systems.

Advances in speed, efficiency, and microelectronics coupled with new design concepts will produce computers in the 1980’s that are far more powerful than the biggest ones now available. This type of power is useful for a limited but important set of computational applications, e.g., improved systems for weather prediction. Furthermore, systems that manage very large data bases require very powerful computer processors, particularly when sophisticated searches and analyses must be made.

Software

● Software technology is expanding steadily, although not as rapidly as the hardware.

Computer programs are sets of basic instructions that tell the computer the steps to take in doing a task. Programs can contain millions of instructions, and their design is as varied as the number of ways computers are used. While computer scientists have been successful in developing theoretical models of computer hardware logic, efforts to build an equivalent theory of programs have not been rewarding to date. Thus, developing systematic techniques for creating, testing, and monitoring computer software has been a slow and tedious process. Some experts maintain that programing is still more of an art than a science.

The continuing R&D on programing languages and software engineering will provide a flow of improved techniques and software tools, but the rate of improvement will not match the explosive growth in hardware capability.

● New software techniques will allow computers to process a wider variety of data.

Traditionally, computers have processed either numerical or alphabetic data structured in very rigid formats. However, software for processing text, graphic images, and digitized voice is developing rapidly, in addition to software for processing data alone. The result will be new families of products and services affecting Government, industry, and the citizen at home.

● Software technology is the limiting factor in controlling the rate at which new applications appear.

The use of the new specialized hardware that is being designed is confined to very restricted purposes, or is merely putting existing software ideas into hardware form for increased efficiency. The software basis for most new computer applications in the 1980’s exists now. There does not appear to be much likelihood that a new concept of computer design will change the way computers are used in the next decade. Rather, the changes will be in the scale of their use and in who will be using them.

● The predominant cost of information systems will be for the software; the information industry will become increasingly labor intensive.

This conclusion follows directly from the last two statements coupled with the labor intensive nature of programing. This trend will influence the marketing practices of computer manufacturers, who will increasingly find profit in the sales of complete systems—combinations of hardware and software—rather than hardware by itself.

● Software reliability, security, and auditability will improve slowly.

Large systems, because they are complex and cumbersome, tend to suffer from the kinds of reliability problems that are not solved by building more reliable hardware. The problems of assuring that a system is actually performing as intended and cannot be compromised, accidentally or deliberately, are inherent in the complexity of software design.

Furthermore, although computer software experts are improving their understanding of how to increase the reliability of programs, they are unable to keep pace with the growth in the size and complexity of the systems being designed. Recently, system designers have become more aware of the need to design secure and auditable applications, and users have become aware that they can demand such capabilities from the producers. Thus, although some improvement is taking place, substantial progress will depend on more R&D.

● New data base techniques will allow massive data banks that serve multiple uses.

Data bases will grow; some will contain trillions of units of information. At the same time, people will want to work with the data in more varied ways. Today, sophisticated programing is often required in order to handle efficiently each different type of query a user might want to make of the data base.

However, researchers are developing methods to improve the efficient use of large data bases and to make them serve multiple needs. This is being done through the development of more powerful query languages, new ways of organizing the data within the machine, and new hardware designs.

Human-Computer Interface

People communicate with computers for three basic reasons: to describe the task to be done, to enter data for processing, and to derive results. Improvements in this technology will result not only in less costly systems, but also in a vast expansion of information systems capabilities and in their more efficient use.

● There will be an increase in the direct use of computers by nonexperts.

Improvements in programing languages will allow users to communicate more easily with the computer. Historically, programing and system control languages have been complicated and time-consuming to learn. They often require understanding how a computer operates. New, easy-to-learn but powerful languages will increase the number of people who will use computers directly without recourse to an intermediary expert. In addition, the proliferation of introductory computer courses in high schools will increase the number of people who have a basic knowledge of computer systems.

This trend will allow many more simple applications to be developed by users individually or in modest organizational settings. However, in larger organizations, or for applications that require integration with other systems, it will mean that much of the programing for modern small systems will be done by end users in industry and in the home who are not subject to control by central data processing management. This may lead to such problems as misuse, poorly functioning systems, or incompatibility.

● More data will be captured by computers and stored in machine-readable form.

The distribution of computing power through the use of microprocessors linked together over communication lines will increase the amount of data captured at the source and stored in computer-readable form. Some types of information are captured deliberately, as in the case of the computerized register in a retail store. Other types, which are captured only as a byproduct of automation, may be found to be useful enough to keep. For example, the system data collected by word processing systems may be considered useful by managers in monitoring secretarial efficiency. The proliferation of capturing such data may raise serious policy issues.

● Output will be organized to present information that is more directly useful to the user.

It has been known from the earliest days of computing that the form in which the results of computations are presented can determine, in great part, whether they are actually used or whether the answer being sought is found. Advances in both the hardware and programing for image display and speech are being brought out of the laboratory and into the commercial market.

Research in information display is discovering how to filter significant information from insignificant data and present it to the user in the most efficient way. There is now a new, burgeoning interest in the use of color graphics display, a technology long considered the domain of the computer research laboratory, but too expensive for general use.

● There will be increased interaction with information systems by consumers in their homes and employees in their offices.

Many systems designed for entertainment, education, information retrieval, and computational services are beginning to be offered through a combination of evolving television sets and telephone instruments because of easy-to-use software, data banks, and the distribution medium provided by cable television (CATV). As a result, there is a possibility that society as a whole will be substantially affected, but to what extent is presently unknown.

Communication

The rapidly increasing availability of inexpensive digital data communication through specialized networks such as Telenet and Tymnet, coupled with the trend of manufacturers to design systems with elaborate communication hardware and software built in, is motivating growth in the use of distributed, communication-based systems. New satellite-based data communication services will further stimulate this trend.

● The use of distributed data bases and distributed processing will grow rapidly.

Rather than centralizing data collection and processing as in the past, the most efficient procedure in the future will be to localize these functions at the point where the data are originally captured. Organizations will have computational capacity distributed among all of their offices. All computer-based devices, even word processors, will be linked into the central system.

The problems in managing such a distributed system will be a major concern of large organizations. In particular, procedures for controlling access and for ensuring data quality will be more difficult in a distributed environment. However, dealing with them effectively will be crucial to the successful operation of communication-based systems.

● There will be increased availability of computer services to the home and business over communication lines.

Many homes will have small computers or computer terminals, and those that do not will likely contain telephones and television sets that have been equipped with computer control. All of these devices will provide links from the home and office to a multitude of information services provided over a variety of communication channels such as television broadcast, telephone, or cable lines.

Processors

The 1970’s have seen continual dramatic improvements in the characteristics of the components from which computers are made. It is expected that this trend will continue through the 1980’s, with computing hardware becoming remarkably inexpensive and efficient.

The decline in cost per logic function from 1960 projected to 1990 is shown in figure 8. In 1960, the price of a logic gate ranged from $1 to $10 per unit, depending on speed. By 1990, that price is expected to range from a few thousandths of a cent to a few tenths of a cent per gate. This continuing decline is based in large part on the dramatic increase in capability of semiconductor chips, as illustrated in figure 9.

There has been a parallel increase in the speed of processing. In 1960, the fastest machine executed about 1 million instructions per second. By 1990, there probably will be computers that will execute a billion or more instructions per second, a thousandfold increase in speed.

This combination of increased speed and decreased cost for logic components results in a steady decline in the cost of computation. The drop in the costs of computing on IBM systems that are roughly equivalent, over the period 1952 through 1980, is shown in figure 10.

These gains have all been due to progress in integrated circuit technology, the process by which electronic components are printed on small chips of silicon. Using these chips as components has resulted in a steady shrinkage of the size of computers from assemblages that filled several rooms to the current desk-top versions. Mass production techniques have replaced the hand-wiring of a decade ago, speeding up the manufacturing phase. Energy consumption, both to operate the computer system directly and for the necessary system cooling, has dropped enormously.

Figure 8.—Projections of Logic Cost per Gate. [Chart omitted: cost per gate plotted from 1960 to 1990.] SOURCE: Office of Technology Assessment and Privacy Protection Study Commission.

Figure 9.—Increase in Capability of Semiconductor Chips From 1956 to 1980. [Chart omitted.] SOURCE: Institute of Electrical and Electronics Engineers, IEEE Spectrum, vol. 17, June 1980, p. 48. U.S. manufacturers of semiconductor chips include firms such as Intel, Motorola, Texas Instruments, Rockwell, National Semiconductor, and Zilog.

Figure 10.—Drop in Average Computer System Cost per 100,000 Calculations From 1952 to 1980. [Chart omitted: sample values include $1.26 (1952), $0.26 (1958), and $0.12 (1964).] Cost per 100,000 calculations is based on data for the following IBM computer systems (with year in parentheses): 701 (1952), 7090 (1958), 360/50 (1964), 370/168 (1970), 3033 (1976), 4300 (1980). SOURCE: Office of Technology Assessment and President’s Reorganization Project, Federal Data Processing Reorganization Study: Basic Report of the Science and Technology Team, Washington, D.C., June 1978, pp. 29-30.

The financial implications of these latter trends are significant in that they are stimulating a rapid growth in computer applications that will accelerate beyond their current high rate in the 1980’s. Facility costs have historically been a major expense to users installing computers. Now, many systems take up only a desk top, require no specialized environment control, and plug into a normal electrical wall socket. This flexibility means that many computer systems can be installed at any site, from the home to the business, with little or no added cost for preparing the physical site.


Inexpensive very large-scale integration (VLSI) based computer hardware will also lead to lower maintenance costs in the 1980’s. Maintenance currently is estimated to cost about 5 percent of the computer’s purchase price per year for medium-size computers. The figure is higher for smaller machines. A reduction in these costs will result from lower failure rates, easier maintenance by replacing larger modular units of the system, and built-in hardware redundancy and fault diagnosis.

Implications: These trends have several implications for computer hardware. In the first place, as illustrated in figure 10, there has been and will continue to be a steady drop in the cost of computing on general purpose, medium- to large-scale computing systems.

Second, small, inexpensive personal desk-top computers have appeared on the market in the last few years. These “microcomputers,” while modest by present standards, are quite powerful relative to the standards of only a decade or two ago. The price of these systems will drop rapidly, and their capacity will grow over the next decade. These small systems will likely drive the development of a true mass market for computers, a market analogous to the current one for hand calculators.

In addition to making the microcomputer possible, the ability to mass-produce chips and to custom design them at the same time, using the automated production machines, means that there will be a proliferation of special-purpose computers that are custom-made for specific applications. One of the motivations for the development of the so-called “general purpose” computer was that a computer system was expensive and difficult to design and build. Thus, many different user markets had to be identified for a single design in order to provide a sufficient customer base to justify its production.

This view of the manufacturers was reproduced in miniature within each computer center, which accumulated as many different types of applications as possible in order to justify acquiring a large computing machine, and thereby benefit from economies of scale.

These days, however, it is feasible to build special-purpose machines, because the specialized markets have grown and because the cost of designing and marketing custom tailored systems has dropped. As an example, a variety of firms (some quite small) are marketing so-called “array processors,” computers designed to calculate the solutions to systems of mathematical equations that display certain special characteristics. These processors are designed to be connected to a general purpose computer, and to be called on only for the specific calculations at which they excel. In the jobs for which they are intended, these array processors, which cost about as much as a large computer, are hundreds of times more powerful than the biggest computers available. The market for this machine exemplifies the increasing ability of small firms to enter the computer hardware business and carve out a niche for themselves.

Basic R&D in hardware design is picking up again after a hiatus. It moved out of the academic and pure research laboratory in the 1960’s due to the high costs of building hardware and the lack of enthusiasm for new design concepts on the part of manufacturers. Now, the costs and feasibility of experimental research have improved dramatically. The result should be a proliferation in the 1980’s of small specialized computers, tailored to particular needs. This trend will reduce computing costs even more than would result from the drop in component costs alone.

In general, experts expect that a continuing trend will be seen toward modular computer architecture. Logical functions will be distributed over the entire system. The central processor will be a traffic director controlling the flow of work among the various units. There will be specialized computation units like the array processor discussed above, file management processors for handling data bases, communications control units, specialized language processors, and others. Because of widespread high-speed digital communications, these various components will not need to be in the same room or even the same city to be part of the system.

Information Storage

A computer memory comes in a wide variety of sizes, speeds, and types. Any particular system uses an assortment of memories to support its operation. Its most significant design characteristics are retrieval time and capacity.

Retrieval time is the time it takes to get a segment of information from the memory. The technology currently available provides a range of times from 1 nanosecond (1 billionth of a second) to a minute or more. Retrieval time includes the time both to find and to read the information.

Capacity is the amount of information that can be stored conveniently and economically. It is measured in bits (the smallest fundamental unit of information) or bytes, an 8-bit grouping roughly equivalent in information content to a single alphabetic character or two numerical digits.

There is a distinct tradeoff between speed and capacity. Therefore, in connecting a computer that performs millions of instructions in a second with a world that operates in a much slower timeframe, a hierarchy of memory types is used. The current selection of memory technology available to system designers is shown in figure 11, and the projected drop in cost for information storage through 1990 is shown in figure 12.

Figure 11.—Cost and Access Time of Memory Technologies. [Chart omitted: cost per bit plotted against access times ranging from 10 nanoseconds to 100 seconds.] SOURCE: Institute of Electrical and Electronics Engineers, IEEE Spectrum, vol. 18, March 1981, p. 41.

Figure 12.—Projections for Memory Cost per Character. [Chart omitted: projected cost declines from 1960 to 1990 for solid-state random-access memories (bubbles, semiconductors), conventional magnetic tape, and video disk.] SOURCE: Office of Technology Assessment and Privacy Protection Study Commission.
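The speed-capacity tradeoff just described can be made concrete with a small simulation. The sketch below is written in Python, a modern language used here purely as executable pseudocode; the capacities and retrieval times are invented round numbers, not values taken from the figures above.

    # Illustrative two-level memory hierarchy: a small fast store in
    # front of a large slow store. All sizes and times are hypothetical.
    FAST_CAPACITY = 4            # entries the fast store can hold
    FAST_TIME_NS = 100           # invented fast retrieval time
    SLOW_TIME_NS = 10_000_000    # invented slow retrieval time (10 ms)

    fast_store = {}              # address -> data, limited capacity
    slow_store = {addr: f"record-{addr}" for addr in range(1000)}

    def read(addr):
        """Return (data, cost in ns); keep recently used data in the fast store."""
        if addr in fast_store:                   # hit: served at fast speed
            return fast_store[addr], FAST_TIME_NS
        data = slow_store[addr]                  # miss: pay the slow retrieval time
        if len(fast_store) >= FAST_CAPACITY:     # make room by evicting an entry
            fast_store.pop(next(iter(fast_store)))
        fast_store[addr] = data
        return data, SLOW_TIME_NS

    total = 0
    for addr in [1, 2, 1, 2, 1, 2, 900]:         # repeated addresses hit fast memory
        _, cost = read(addr)
        total += cost
    print(f"total retrieval time: {total} ns")

Repeated references to the same few addresses are served at the fast store’s speed; only first references and overflow pay the slow retrieval time, which is why a small amount of fast memory can make a large slow memory appear much faster than it is.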

There are two other important characteristics of various memory systems that affect their use in particular applications. These are volatility and writability.

Volatility refers to the long-term stability of the memory. Many storage systems lose the information they contain when the power source is removed or interrupted. Others keep the information indefinitely until it is changed.

Writability refers to the ability to insert information in the memory. Memory is classified as read/write or read only, depending on whether or not the computer can copy new data back into the storage in about the same time scale as information can be read. Read only memories are written once, and the recording is permanent. In this discussion, memory is roughly categorized as fast, intermediate, and mass storage.


Fast Memory Storage

Fast memory storage constitutes the upper left-hand group in figure 11. It is the most closely connected to the processor, and hence its speed must be consistent with that of the computer hardware. Because of the relatively high cost of fast memory, it is usually the smallest memory in the system.

For many years, core memories based on the magnetic properties of an iron compound were the principal technology used for fast memory storage. However, this technology seems to have reached its limit in cost, speed, and size. Most new memories now are designed around VLSI semiconductor technology.

Intermediate Memory Storage

Intermediate memory storage, which is reasonably fast but inexpensive enough to be used in large quantities, serves as a buffer between the high-speed semiconductor memory and the very slow mass storage devices. In the long term, bubble memory may develop into the technology of choice for intermediate memory storage. Information is stored as tiny magnetic domains that circulate along a printed track in a semiconductor medium. Some bubble memory products are already on the market, and new improved ones are being announced continually.

One of the advantages of bubble memories is that the domains are stable. Therefore, when a bubble memory is used in a terminal, for example, data can be entered and stored in it for later transmission without needing to keep the terminal hooked up continuously to a power supply. The technology of bubble memories is progressing rapidly. One manufacturer has already announced plans to market a million-bit chip that will cost around $2,000 in small quantities.

High-speed magnetic disks are still widely used for intermediate memory, and will continue to be the most important technology for this decade. Steady improvements will be announced throughout the 1980’s. Some experts predict that disks will eventually be used in a more finely stratified memory hierarchy, rather than being replaced by bubble memories. This may depend on whether progress in expanding the capacity of bubble memories continues as rapidly as it has.

Mass Memory Storage

Progress seems to be slower in the area of very large storage devices. No radically new technology has appeared that seems to promise any breakthroughs. Magnetic tape will continue to be used heavily over the next decade.

Many new mass storage technologies—e.g., video disk—are difficult and expensive to write, but very cheap and easy to reproduce and read. Erasing and rewriting are often impossible. Costs per bit of storage for archival applications are low enough to be competitive with paper storage. Incremental improvements over the next decade should strengthen this advantage and stimulate a trend toward permanent document storage in electronic form.

Inexpensive Mass Storage

The rising market for small inexpensive computer systems is producing a demand for small, very cheap, bulk storage technology. The technology is different from that provided for large systems. The size of bulk storage needed is less, and there is more tolerance for slow search times. Currently, personal computers are supported by two types of bulk storage: magnetic tape cassettes and “floppy” disks.

Tape readers are very cheap, as is the tape itself. However, read time is quite slow, even by personal computer standards. Floppy disk hardware is more expensive, although still within the requisite price range. The disks themselves are cheap and easily stored.

Some manufacturers, particularly those marketing computer games, sell programs permanently written on solid-state read only memories (ROM). While relatively expensive as a medium, ROM has the advantage of being difficult to copy, thus discouraging the pirating of software. The ROM approach has not been well accepted by the marketplace, however.

Software

Computer software is also a form of technology. Although it is not a tangible product like a piece of hardware, software shares many of the same characteristics.

Much R&D carried out by computer scientists concerns software problems. It often results in new concepts and basic techniques for organizing data and sequencing instructions to enable a computer to accomplish a particular task.

Very large programs are harder to analyze, design, and produce than are equally large machines that run them. Consequently, the state-of-the-art in software lags behind that of hardware.

Occasionally, a major breakthrough in programing can motivate new machine designs. For example, a wave of new small processors has appeared on the market to support signal processing applications. These processors are based on the “fast Fourier transform” algorithm, an important discovery made a few years ago that allowed the solutions to certain types of mathematical computations to be calculated 10 times faster than previously.* Even greater speeds have been achieved by using special processors to take advantage of this algorithm.

*The Fourier transform itself is a pre-20th century mathematical technique. The advance was a new way to perform the numerous and tedious computations the technique requires.

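For readers who want the flavor of the algorithm, the following is a minimal sketch of the recursive (Cooley-Tukey) form of the fast Fourier transform, written in Python as modern illustrative pseudocode. Splitting the transform into even- and odd-indexed halves is what reduces the work from roughly n-squared steps to n log n, the source of the speedup described above.

    import cmath

    def fft(x):
        """Recursive Cooley-Tukey FFT; the length of x must be a power of 2."""
        n = len(x)
        if n == 1:
            return x
        even = fft(x[0::2])                      # transform of even-indexed samples
        odd = fft(x[1::2])                       # transform of odd-indexed samples
        # Roots of unity ("twiddle factors") combine the two half-transforms.
        t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
        return ([even[k] + t[k] for k in range(n // 2)] +
                [even[k] - t[k] for k in range(n // 2)])

    # Example: transform an 8-point square pulse.
    print(fft([1, 1, 1, 1, 0, 0, 0, 0]))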

Limits

One area of research that will affect software development in the 1980’s is fundamental research in computational complexity. Until recently, fundamental theory in computer science was concerned with computability—whether a computer could calculate an answer to a problem. However, the class of problems that the computer could theoretically solve was enormous. More recently, researchers have been exploring the more practical questions of how easily a problem may be solved, how fast, and how much computer power would be required. The answers have been surprising. Many categories of problems have been found to be almost impossible to solve; that is, the best program possible on the fastest computer available would take many years, even millions of years, to calculate an answer. Such results have already had practical application. For example, they have led to the development of new types of cryptographic codes with interesting new properties. Future research may lead to the ability to calculate theoretically the breakability of a code.

Over the next decade of ever increasing computer power, some types of problems are likely to remain intractable. Complexity theory will help improve the understanding of these limits of computers as they are now designed. Furthermore, it may motivate some radical new concepts in computer design.
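A small worked example suggests why raw machine speed alone cannot rescue such problems. Brute-force search over all orderings of n items takes n! steps; the machine speeds below are hypothetical round numbers in the spirit of the estimates quoted earlier in this chapter.

    import math

    # Hypothetical machine speeds, in operations per second.
    SPEEDS = {"100 million ops/s": 1e8, "1 billion ops/s": 1e9}

    # Brute-force search over all orderings of n items costs about n! steps.
    for n in (10, 15, 20, 25):
        steps = math.factorial(n)
        times = ", ".join(f"{name}: {steps / ops:.1e} s"
                          for name, ops in SPEEDS.items())
        print(f"n={n}: {steps:.2e} steps ({times})")
    # At n = 25, even the faster machine needs about 1.6e16 seconds,
    # roughly 500 million years; a tenfold speed increase barely matters.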

Data Base Systems

There is a trend toward the use of large integrated data base systems that support multiple applications. Researchers have been developing methods for organizing data so that it can be accessed in many ways, not all of which may be predicted when the system is designed. Some of the most general data structures, such as relational data bases, are not as yet efficient for very large data bases, but are appearing in commercial products for use with medium-sized systems. New developments in data handling algorithms, and new hardware designs specifically tailored to the support of data access systems, should provide continually improving capabilities during the 1980’s.

Improved query languages will allow the information user to interact directly with the computer to obtain the needed data. The integration of better report generators* with the data base system will allow the output to be produced in a more directly usable form. These advances will give the system user much more direct access to the data base than is now the case. Currently, the normal practice is to work through an intermediary, both to frame the initial inquiry and to interpret the results.
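The kind of direct interaction such query languages permit can be suggested with a modern sketch. The fragment below uses the sqlite3 relational engine built into present-day Python as a stand-in; the table, the data, and the query are all invented for illustration.

    import sqlite3

    # An invented example table. A relational data base stores rows that
    # can be queried in ways not anticipated when the system was designed.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE employees (name TEXT, office TEXT, salary INTEGER)")
    db.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                   [("Adams", "Washington", 21000),
                    ("Baker", "Los Angeles", 24000),
                    ("Clark", "Washington", 19000)])

    # A declarative query: the user states what is wanted, not how the
    # storage is to be searched -- the role the text assigns to query languages.
    for row in db.execute("SELECT name, salary FROM employees "
                          "WHERE office = 'Washington' ORDER BY salary DESC"):
        print(row)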

The capability to transfer information between data bases over communication networks will also be improved. This capability, already well-developed in some specific systems, will allow more general exchange among systems connected to commercial data networks. Standard protocols and forms will be agreed on. True distributed data bases are systems in which the location of any particular item of information may be anywhere in a national or even international network, and in which access to that data is handled totally by the system. These should start to appear commercially in the 1980’s.

The increasing size of data bases, the multiplicity of their uses, the wider remote access to them, and their integration over telecommunication lines will all present problems in preserving the security and integrity of the information. It will be more challenging to protect the privacy of personal data or the security of economically valuable information on such distributed systems.

*Report generators retrieve information needed by a manager, perform moderate calculations on it, and organize it in a readable and usable form.


Languages

Languages are the means by which users describe to the computer the kinds of operations they want performed by the system. Machine language is the set of instructions that is wired into the computer. Although some machine language programing is still done, it is very difficult and inefficient to use due to the size and complexity of most applications. Modern systems on a medium-sized computer may contain more than a million machine language instructions.

Faced with these difficulties, programmers turn to a higher level language to write most software. As computers have grown larger and more complex, and as the tasks demanded of them have become more complicated, there has been a continual trend toward the use of languages designed to be more representative of the way humans express problem-solving procedures.

In addition to increasing programing efficiency, developers of higher level languages have other goals. Some special languages are tailored to produce efficient code for particular classes of applications. Others have been developed based on the perception that programing a computer is a form of mental problem-solving, and that the language, as the vehicle for that effort, is an intellectual tool that can directly help thought processes. Finally, there has been a trend to develop languages that integrate users more closely with the system, thus lowering the degree of expertise required to program the computer. By programing in these user-oriented languages, persons who are not computer specialists can interact directly with a system rather than through a professional programer intermediary.

The rapid expansion in the use of large data base systems that serve many users over communication networks is driving a major development effort in query languages. They are the means by which a user gains access to a data base, describes the desired information, and specifies the format in which it is to be presented.

A principal driving force toward further language development is economics. Hardware costs are decreasing while labor costs are increasing. Developing languages that will increase the productivity of programmers has become a high priority. Ultimately, some researchers envision automatic programing in which users specify the problem to be solved in a language close to that used in their environment—engineering, law, medicine, and so on. The computer, using this problem statement, would write its own program.

Full development of such automatic programing systems is far in the future, although simple systems for preparing management reports and models derived from data bases are already in use. More short-term effort is being concentrated on developing tools that help the programer use the computer as an active agent assisting in the creation of a system.

Progress is expected to be slow. It will be impeded by the sheer difficulty of matching the essential ambiguity of human thought and language with the absolute rigidity of computer language. It is also economically difficult to implement radical new languages that require substantial retraining of programmers. Finally, there is a reluctance to write programs in languages that are not widely used and known, for the usefulness of a program so written may be unnecessarily restricted.

Software Engineering

For most of the history of computers, programing has been considered an art rather than a form of controlled design. Emphasis was on the ingenuity and elegance of the product, rather than on more mundane measures of utility. Creativity will always have a place in good programing, but its emphasis conflicts with managerial imperatives for economy, control, and predictability. In particular, when the development of a major system requires the coordination of hundreds of programmers writing millions of lines of code, the project must be managed in such a way as to guarantee a usable product, on time, and at a reasonable cost.

The relatively new discipline of software engineering has been attempting the difficult task of developing techniques both for programing and for managing the programing of large software systems. The operating system and support software of a large multiprocessor computer system are extraordinarily complex. However, the economic imperative will force work in this area and assure the quick adoption of new results.

Until the present time, efforts have been geared to developing both techniques for breaking a large proposed system into successively smaller logical subunits that can be assigned to programing teams, and ways in which to manage the work of the programmers and the communications among them. These and related techniques will gradually become commonplace over the next 5 to 10 years as they are learned by the next generation of programing professionals and managers.

Input-Output Technology

The principal means of communication with a computer has traditionally been through a form of typed input and alphanumeric printed output. Although this type of communication will continue to dominate in the near future, it is inadequate for many new applications. Advances in technology for human-computer interaction will change the way in which computers are used by many people.

Graphics

Graphical display of information is becoming more economically viable, and the technology, both hardware and software, is improving in its capability. In the last few years, researchers have developed efficient techniques for removing hidden lines in order to display solid objects, and for combining half-tone, shaded image production with color. Some current research is focused on developing techniques that model various surface reflectivities and textures with respect to different types of light sources.

While work is proceeding on improving the speed of graphical computations to allow the terminal to display very high resolution pictures fast enough to depict motion, such a capability may remain very expensive over the next few years for any but the most simple types of pictures.

Sharp cost breaks are expected for displays with lower image quality. The use of bit-mapped frame buffers, which store slow computer graphics output and play it back at normal speed over a display terminal, will grow as the costs of memories drop.

Some research is being pursued on holographic output of three-dimensional images. However, holographic display is not expected to become widely used in the next decade.

Computer graphics technology is already finding widespread commercial use for creating animated films. It competes favorably with traditional manual techniques. The uses range from pure and commercial art to the production of educational films. Computer languages and graphics software have been developed to allow artists and designers to interact directly with the computer, generally through the display screen, to create their images.

In computer-aided design, the user is interactively coupled to a display screen with a graphical data base and analytical programs stored on a computer. Designers use the computer and the display to develop their designs on a screen. For example, an architect designing a building on the graphics display can have an instantaneous computation of cost, and a civil engineer can do stress calculations while designing a bridge. Computer-aided design has emerged from its infancy. Steady improvements in software and decreasing computing costs will likely make it the methodology of choice for most engineering design in the 1980’s. Even today, integrated circuits and printed circuit boards are preferentially designed by these techniques.

Graphical output of computer data can transmit information that is difficult or even impossible to derive from a printout list of numbers. Whether calculating stresses in an airplane wing, the water flow in a flood plain, or the molecular structure of an enzyme, the numerical output of the computer calculations must be translated into a picture before any sense can be made of the results.

Voice Communication

Voice communication with computers is on the verge of becoming commercially successful. The advances have been both in developing better and cheaper computational techniques for generating and recognizing speech, and in learning more about the ways in which such systems could be used. This new understanding has lowered the estimates of what would constitute a commercially useful level of performance. Capabilities far short of a human level of performance can markedly enhance communication between human and computer. Some experts even expect a voice-driven typewriter to be on the market before the end of the decade.

There are two basic problems in speech synthesis—first, creating a voice tone carrying a phoneme (a fundamental linguistic element of speech), and second, stringing these phonemes together to make a sentence or phrase. Although neither problem has been solved to the ultimate point of producing natural human-sounding sentences, the technology is improving rapidly. It has already reached the point of commercial utility.

Several companies sell chips in the $10 range to synthesize speech sounds. Texas Instruments offers a mass-produced electronic toy, Speak and Spell®, to help children learn to spell.

One important application of speech synthesis is a reader for use by the blind that converts typed text into spoken words. While currently very expensive, these devices should become more economical in the near future.

Speech recognition is a more difficult problem, since the system must deal with an input having great variability. The voice and style of speech vary widely among individuals; the computer recognizer is potentially confronted by a wider array of potential vocabulary to identify; and it must analyze a multiplicity of grammatical structures. Again, entrepreneurs have found that even the very limited capabilities now possible are marketable.

So-called “continuous speech” recognition, understanding natural speech, is still in the research laboratory. While large-scale commercial systems are not expected in the short term, the strong market already developing for limited range speech recognition systems will motivate R&D and encourage the rapid commercialization of research results as they appear.

The best performance to date in a research laboratory environment has been shown by a system that can recognize sentences 91 percent of the time from multiple speakers using a vocabulary of 1,011 words. Major research efforts are proceeding at IBM and at a few university laboratories. Commercial devices with small vocabularies are now being marketed.

Image Recognition

Image recognition presents another form of the problem of recognizing patterns of data. The state-of-the-art is at a similar point. Simple self-contained patterns not in a confusing context can be recognized. Devices on the market read standard typewriter and even handprinted characters. Point-of-sale scanners in stores read and recognize the universal product code on packages drawn across the unit at various speeds and orientations. However, analyzing a more general picture for a variety of elements is a more complicated task, one which, like speech, may depend upon some “understanding” of the context of the pattern.
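As a small concrete aside on the product code mentioned above: part of what makes it reliably machine-readable is built-in redundancy, including a final check digit. The sketch below, in Python used as illustrative pseudocode, computes the standard UPC-A check digit, in which digits in odd positions are weighted 3, digits in even positions are weighted 1, and the final digit brings the total to a multiple of 10.

    def upc_check_digit(digits11):
        """Compute the UPC-A check digit for the first 11 digits of a code."""
        odd = sum(int(d) for d in digits11[0::2])   # positions 1, 3, 5, ...
        even = sum(int(d) for d in digits11[1::2])  # positions 2, 4, 6, ...
        return (10 - (3 * odd + even) % 10) % 10

    # Example: the widely cited code 0-36000-29145-2.
    assert upc_check_digit("03600029145") == 2
    print("check digit:", upc_check_digit("03600029145"))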

Slow but steady advances in pattern recognition technology are occurring. The sharp drop in cost for the requisite technology is increasing the range of potential applications.

Data Communication

The ability to move data quickly over long distances through the use of new digital communication technology has had a significant impact on the design and use of information processing systems. Designers now have the opportunity to build systems with greater power than was previously possible, enhancing their ability to process information and provide it to the user in a timely manner. More importantly, however, telecommunication has brought the user closer to the computer by allowing direct interaction with the processing system. As a result, the use of information processing technology has become an integral part of the day-to-day work of many organizations whose operations have become totally dependent on a computer system.

Only technology that relates directly to the development of computer-based information systems is discussed here. (For a detailed analysis of communication technology, see the OTA assessment report entitled An Assessment of Telecommunication Technology and Public Policy.)

Digital Communication Technology

The steady improvement of telecommunication service over the last quarter century has benefited the design of computer systems by decreasing their cost and improving their reliability. Significant applications using communications date back to the early 1960’s. However, the cost and complexity of putting together a communication-based computer system restricted its use to applications, such as reservation systems, that had an inherent need for remote data entry and display.

Existing communication carriers and new enterprises are beginning to offer new data communication services. These services provide inexpensive, high-speed communication capacity specifically designed for use by computer systems. With these new services available, a host of new communication-based applications will appear over the next decade.

Traditional communication systems have been tailored to carrying voice. The characteristics of voice communication are quite different from those of data communication between computers, or between computers and people. The voice telephone network, through switching, provides a temporary line connecting two users. The charges are based on the length of the line provided and the length of time it is made available.

Data communication tends to come in very high-speed bursts, with long periods of silence between transmissions. To perform this type of communication on a traditional telephone network is inefficient, as the line is unused for most of the time. One approach has been to design a network with multiple path connections. Packets of information, along with a destination address, are sent over any instantaneously available path, and the user is charged for the quantity of information transmitted and for the connection with the network. One payoff in sharing traffic from a larger community to obtain better line usage is lower user costs. Secondary benefits include error-free performance and higher reliability. This type of communication facility is called packet switching, and is available as a commercial service. Thus, a librarian in Washington, D.C., with a terminal can dial a local number to access the Washington entry to a national data communication network. Through that network, the librarian can access a bibliographic data base in Los Angeles, and do so at a low cost.
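The packet idea can be sketched in a few lines. The fragment below is modern illustrative pseudocode in Python; the path names, packet size, and routing rule are invented, and a real network chooses paths far more carefully than this random stand-in does.

    import random
    from dataclasses import dataclass

    @dataclass
    class Packet:
        destination: str   # every packet carries its own address
        sequence: int      # so the message can be reassembled in order
        payload: str

    PATHS = ["path-A", "path-B", "path-C"]   # hypothetical available routes
    PACKET_SIZE = 16                         # invented bytes-per-packet limit

    def send(message, destination):
        """Split a message into addressed packets, each routed over
        whichever path is instantaneously available."""
        chunks = [message[i:i + PACKET_SIZE]
                  for i in range(0, len(message), PACKET_SIZE)]
        for seq, chunk in enumerate(chunks):
            pkt = Packet(destination, seq, chunk)
            route = random.choice(PATHS)     # stand-in for "any available path"
            print(f"{route}: packet {pkt.sequence} -> {pkt.destination}: {pkt.payload!r}")
        # Billing in a packet network follows data volume, not line time.
        print(f"charged for {len(message)} bytes in {len(chunks)} packets")

    send("REQUEST: bibliographic search, subject=privacy", "library-db-LA")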

Digital Communication as Part of the System

Viewed in the context of computer system operations, data communications are no different from any other application. However, they do introduce new capabilities and problems to the design, implementation, and operation of information systems.

Early implementations of data communication programs were designed to take advantage of processor cycles that could not be used productively by other applications. However, the growth in the communication workload, combined with other new tasks that may have been loaded onto the processor, can saturate it and create a need to move the communication management from the central computer to a peripheral processor.

Fully programmable front-end processors support the trend of moving communication processing away from the central computer. In some cases these devices have been specifically designed for communication processing. In other cases, general purpose minicomputers are being used as front ends. Either way, the availability of inexpensive logic and memory components has contributed to the further distribution of the communication function away from the central processor.

Security Capabilities

Computers have handled sensitive data and programs for many years; however, it is only recently that the need to secure them has become a serious concern to system designers and operators. During the social unrest of the 1960’s, concern arose over the physical security of computer systems. They were expensive and visible symbols and, consequently, attractive targets for sabotage. Later, concerns over privacy and an awareness of increasing incidents of financial computer crime motivated managers to take a more sophisticated look at protecting their systems and data.

Classifications of Computer Security

Security experts distinguish among three types of security: physical, procedural, and technical.

Physical security refers to techniques that physically isolate a computer system from access by unauthorized persons. It also includes protection of the facility from external dangers such as earthquake, fire, flood, or power failure.

Procedural security is the set of rules by which a system operator manages the system personnel and the flow of work in the organization. It can include such measures as preemployment screening of staff, work assignments that minimize opportunities to act in inappropriate ways, auditing procedures, and controls on the flow of work through the system.

Technical security refers to the software and hardware controls set up within the system itself. Techniques used to provide security may include cryptographic encoding of data, complicated access and identification procedures, and hardware dedicated to the auditing function.

Some security experts claim that too much attention to technological fixes has distracted system operators from more traditional but effective measures they could be instituting. However, the proliferation of small systems and the trend toward communication-based systems are making technical security more important. The techniques of physical and procedural security are well understood and translate relatively easily from the noncomputer world into that of the system operator. Technical security, a newer area of research, is less well understood, but it relates directly to the problems of system design.

Computer scientists have proved that it is theoretically impossible to achieve perfect security inside the program itself. That is, it cannot be demonstrated that any particular combination of hardware and programing is proof against some new, unexpected type of attack. Improving the security of a computer system involves balancing the costs of protection against the expectation of loss resulting from the threats and vulnerabilities. While it cannot provide a final answer, R&D in the field of computer security can substantially decrease protection costs.

Risk analysis, the process of weighing all these factors in a decision model, is a difficult job. The precise factors are unknown, and it is difficult to determine whether all possible alternatives have been covered. Research can develop techniques for performing risk analyses with greater precision, and the Government has new research and standards activities in this area. While the standards are directed at Federal systems, they will provide useful guidance to the private sector.
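One common form of such a decision model is the annualized loss expectancy calculation sketched below. The threats, frequencies, and dollar figures are hypothetical; in practice it is precisely these inputs that are hard to estimate.

```python
# A minimal sketch of a risk-analysis decision model: annualized loss
# expectancy (ALE). All threats, rates, and dollar figures are invented.

threats = {
    # threat: (expected incidents per year, expected loss per incident)
    "power failure":        (2.0,    5_000),
    "disgruntled insider":  (0.1,  200_000),
    "wiretap of data line": (0.05, 500_000),
}

def ale(frequency, impact):
    """Annualized loss expectancy for one threat."""
    return frequency * impact

for name, (freq, impact) in threats.items():
    print(f"{name:22s} ALE = ${ale(freq, impact):>9,.0f}")

# A safeguard is worth buying when its annual cost is less than the
# reduction in ALE it produces.
total = sum(ale(f, i) for f, i in threats.values())
print(f"{'total':22s} ALE = ${total:>9,.0f}")   # $55,000
```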

Technological instruments for security fall into three categories, according to the intent of the designer: prevention, detection, and auditing. Prevention means keeping unauthorized persons from having access to the system, and keeping authorized persons from using the system wrongly. Detection means catching an unauthorized procedure when it is attempted and preventing its completion. Auditing means the determination of whether unauthorized acts have occurred. A particular security technique is usually directed toward one of these goals.

Specific Techniques of Security

Authentication: The first objective of security is to assure that only authorized personnel can access the system. Identification is the process of establishing a claim of identity to the system, either with a name or an account number. Authentication is the process of verifying the claim.

The simplest and oldest procedure is to use a password or number that is known only to the individual authorized to use the system. The personal identification numbers assigned to bank customers for use on ATMs are examples of such codes.

The security provided by password schemes is limited, although more elaborate versions offering some improvements have been developed. However, the security of any password scheme depends fundamentally on the ability and willingness of the user to keep the code secret.
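One such refinement is for the system to store only a one-way hash of each password rather than the password itself, so that reading the password file does not reveal the codes. The sketch below illustrates that idea in general terms; it does not describe any particular system.

```python
# Password checking against a stored one-way hash. The salt keeps two
# users with the same password from having the same stored value.

import hashlib
import os

def make_entry(password: str):
    """Return the (salt, hash) pair to store for a new user."""
    salt = os.urandom(8)
    digest = hashlib.sha256(salt + password.encode()).digest()
    return salt, digest

def authenticate(password: str, salt: bytes, stored: bytes) -> bool:
    """Verify a claimed password against the stored salt and hash."""
    candidate = hashlib.sha256(salt + password.encode()).digest()
    return candidate == stored

salt, stored = make_entry("rosebud")
assert authenticate("rosebud", salt, stored)
assert not authenticate("rosebud2", salt, stored)
```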

Physical identification techniques, which depend on measuring certain personal physical characteristics, are being developed for use as authenticators in computer system access. To be useful, any such system must be able to discriminate between persons, but at the same time be insensitive to changes in the characteristics of a particular individual over time.

The system operator, when selecting an authenticating technology, has to balance two types of errors: classifying a fraudulent identity as correct (type I) and classifying a proper user as fraudulent (type II). These two types of errors have costs associated with them, and are usually at opposite ends of a tradeoff curve for any specific technology. The type I error can be reduced only at the cost of increasing the type II error, and vice versa.
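The tradeoff can be illustrated by simulating an authenticator that accepts anyone whose measured score clears a threshold. The score distributions below are invented; sweeping the threshold shows the two error rates moving in opposite directions.

```python
# Toy type I / type II tradeoff: genuine and impostor scores drawn from
# two overlapping normal distributions (parameters are invented).

import random

random.seed(1)
genuine  = [random.gauss(70, 10) for _ in range(10_000)]
impostor = [random.gauss(40, 10) for _ in range(10_000)]

for threshold in (40, 50, 55, 60, 70):
    type_i  = sum(s >= threshold for s in impostor) / len(impostor)  # fraud accepted
    type_ii = sum(s <  threshold for s in genuine)  / len(genuine)   # user rejected
    print(f"threshold {threshold}: type I {type_i:6.1%}   type II {type_ii:6.1%}")
```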

Fingerprints: The technology exists for reading and encoding fingerprints with minimum delay, but the devices are expensive (over $50,000). Although the pattern is complex, fingerprints can be encoded in a computer using less than 100 characters by storing only certain key data points. This storage efficiency means that a complete set of fingerprints for every person in the United States could be stored easily in a commercially available bulk memory.
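The arithmetic behind that claim is simple; the figures below (100 characters per print, 10 prints per person, and the approximate 1980 U.S. population) are assumptions for illustration.

```python
# Back-of-the-envelope storage estimate for a national fingerprint file.

chars_per_print = 100          # upper bound cited above
prints_per_person = 10
population = 226_000_000       # roughly the 1980 U.S. census count

total_chars = chars_per_print * prints_per_person * population
print(f"about {total_chars / 1e9:.0f} billion characters")   # ~226 billion
```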

The cost of directly reading fingerprints, however, suggests that it will not become a widely used method of authentication, at least in the near future. Its use will be restricted to applications with very high security requirements, and to applications where fingerprints themselves represent significant data, such as in police work.

Hand Geometry: A new and surprisingly effective form of physical identification is the geometry of the hand. Individual finger lengths vary from one person to another, and this variation is sufficiently distinctive to be the basis for a relatively inexpensive (around $3,000) identification device. The device uses a high-intensity light shining on a pattern of photocells. It is sensitive both to the external geometry of the hand and to the translucence of the flesh near the fingertips; thus, it is quite difficult to deceive with any artificial device.

Voice Recognition: Research on techniques for voice analysis and voice synthesis has resulted in methods for distinguishing individuals by their speech. In these systems, a random list of words is shown to the individual, who then speaks them into a microphone. By having the computer generate a new list each time, the system prevents an impostor from simply replaying a tape recording of an authorized user.
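The protocol can be sketched as follows. The matching step, comparing an utterance against an enrolled voice profile, is the genuinely hard part and is only simulated here; all names and data are invented.

```python
# Challenge-response voice authentication: a fresh random word list for
# each attempt makes a recording of an earlier session useless.

import random

WORD_LIST = ["river", "anchor", "violet", "thirty", "carbon",
             "meadow", "sudden", "piano", "harbor", "lantern"]

def new_challenge(n: int = 4):
    return random.sample(WORD_LIST, n)

def verify(speak, match) -> bool:
    """speak(word) captures an utterance; match(utterance, word) is the
    speech-comparison step, simulated below."""
    return all(match(speak(w), w) for w in new_challenge())

# Simulated demonstration: a genuine speaker utters whatever word is
# asked for; a tape recorder can only replay one previously spoken word.
genuine  = lambda w: ("alice", w)
recorder = lambda w: ("alice", "river")
match    = lambda utterance, w: utterance == ("alice", w)

print(verify(genuine, match))    # True
print(verify(recorder, match))   # False: at most one word can match
```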

Such a system has high interpersonal discrimination, but seems to be weaker on intrapersonal differences. It may reject authorized persons suffering from colds, hoarseness, or even emotional tension.

Voice recognition systems are not yet commercially available, although at least one firm, Texas Instruments, has installed a home-developed system for use in its facilities.

Since information collection is relatively cheap, requiring only a standard microphone, amplification, and signal conversion hardware, the limitations of the technology seem to lie in the computational techniques. As software improves, and as the cost of computer hardware drops, voice recognition could become a popular low-cost authentication technique.

Signature Verification: Passive signature verification uses pattern-recognition techniques to analyze and encode a signature on a check or form. It is a difficult task, both because a signature can vary with an individual's mental and physical state and with the type of writing implement used, and because forgers can be quite skillful. One company has announced a product providing passive signature verification for bulk applications, particularly the processing of checks. It is not clear whether such technology is operationally effective in identifying forgeries, or whether its cost could be reduced sufficiently for use at the point of sale.

Dynamic signature verification systems track the actual path of the pen point as the individual creates a signature. Sensitive instruments measure variables such as the pen's speed and acceleration, the amount of time the point remains on the paper, and the changing pressure on the point. Several organizations, including IBM, are working on dynamic identification; however, no products are as yet commercially available. Some experts judge this to be a promising technology.
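A sketch of the kind of feature extraction such a system performs on the sampled pen trajectory appears below; the sample data and the particular feature set are invented for illustration.

```python
# Extracting dynamic features from pen-point samples.
# Each sample: (time in seconds, x, y, pen pressure; pressure 0 = pen up)

samples = [
    (0.00, 10.0, 20.0, 0.8),
    (0.02, 12.0, 21.0, 0.9),
    (0.04, 15.0, 21.5, 0.7),
    (0.06, 15.0, 21.5, 0.0),   # pen lifted between strokes
    (0.08, 16.0, 19.0, 0.6),
    (0.10, 19.0, 18.0, 0.8),
]

def features(samples):
    speeds, pen_down_time = [], 0.0
    for (t0, x0, y0, p0), (t1, x1, y1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        speeds.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt)
        if p0 > 0:
            pen_down_time += dt
    pressures = [p for *_, p in samples if p > 0]
    return {
        "mean speed": sum(speeds) / len(speeds),
        "peak speed": max(speeds),
        "pen-down time": pen_down_time,
        "mean pressure": sum(pressures) / len(pressures),
    }

# A verifier would compare these features against those of enrolled
# reference signatures, within some tolerance.
print(features(samples))
```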

Much R&D is aimed at finding a more reliable substitute for the currently used magnetic cards and passwords to identify and authenticate individuals. To date only a few products have come on the market, and those have been designed for use in applications with very high security requirements. The cost limitation seems to depend on the characteristics of the sensor, since microprocessor costs have dropped so low. No doubt a growing demand could motivate a flurry of new products in the early 1980's.

The amount of data required to store a computer representation of the pattern for any of these candidate technologies is relatively small, a few hundred characters. Thus any of them, if widely implemented, could become the basis for a quasi-universal identification code used by all private and Government organizations.

Encryption

In the past, cryptography was principally a tool for military and diplomatic communication. Now, however, modern data communication systems increasingly are transmitting private information of high value, and the need to protect these communications from interception and manipulation has prompted an explosion of interest in civilian encryption.

A standard for encryption technology, the Data Encryption Standard (DES), has been established by the National Bureau of Standards for Federal use. It is apparently also being widely adopted in the private sector, since several commercial manufacturers are producing devices based on it. While some experts have questioned the robustness of DES, it seems to have been accepted generally as an inexpensive technology that is at least effective for low- or middle-level security needs.
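DES is built from 16 rounds of a "Feistel" structure, in which essentially the same circuitry both encrypts and decrypts. The toy below illustrates only that structure; its round function and keys are invented and offer no real security.

```python
# A toy Feistel cipher on 32-bit blocks split into 16-bit halves.

def f(half, key):
    # Invented round function; DES uses carefully designed S-boxes here.
    return ((half * 31 + key) ^ (half >> 3)) & 0xFFFF

def feistel_round(left, right, key):
    # Swap halves, mixing the old right half in through the round function.
    return right, left ^ f(right, key)

def encrypt(block, keys):
    left, right = block >> 16, block & 0xFFFF
    for k in keys:
        left, right = feistel_round(left, right, k)
    return (left << 16) | right

def decrypt(block, keys):
    # The same rounds, run with the keys reversed and the halves swapped,
    # undo encryption; this is what makes Feistel designs easy to invert.
    left, right = block & 0xFFFF, block >> 16
    for k in reversed(keys):
        left, right = feistel_round(left, right, k)
    return (right << 16) | left

keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]
c = encrypt(0xDEADBEEF, keys)
assert decrypt(c, keys) == 0xDEADBEEF
```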

Another set of techniques that has received some attention lately has been labeled "public key" encryption. The idea behind these codes arose from basic research in computational complexity, a field of computer science dealing with possible theoretical limits on the ability of even the most powerful computers to compute certain mathematical solutions. A public key code uses one key to encrypt and another to decrypt a message. Knowledge of the encryption key is no help in deriving the decryption key, even though their mathematical relationship is known. Thus, the security of the code does not depend on keeping the encoding key or the mathematical relationship secret. Since one of the major problems in the use of cryptography is control of the key itself, a system in which the decoding key need not be known even to the data sender is promising for many applications.

Public key codes also promise to be useful in electronic message systems, since they can be used for authenticating messages in the absence of signatures. Several applications of this sort have been proposed. However, public key codes are in their infancy, and it is not known with certainty whether unanticipated problems will arise as they are used.
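Both properties can be demonstrated with the RSA construction, one public key scheme of this period. The primes below are absurdly small and chosen for illustration only; real keys require numbers hundreds of digits long. (The modular-inverse form of pow requires Python 3.8 or later.)

```python
# Toy RSA: encryption with the public key, and message authentication
# ("signing") by applying the keys in the opposite order.

p, q = 61, 53
n = p * q                        # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                           # public encryption exponent
d = pow(e, -1, phi)              # private decryption exponent

message = 65
ciphertext = pow(message, e, n)            # anyone can encrypt with (e, n)
assert pow(ciphertext, d, n) == message    # only the holder of d can decrypt

# Signing: only the key holder can produce the signature, but anyone
# can check it with the public key, authenticating the message.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```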

Encryption has uses other than merely securing communications. Used internally in a computer system, it can isolate sets of data from unauthorized users. It can also allow users to enter data but not to read it, or to read data but not modify it. It can even separate the activities of various connected computer processors.

Authorization

Most large-scale information systems are designed to serve several users simultaneously. The data bases in the machine often contain clusters of information that serve multiple needs. It is necessary, then, to control the access of users who are authorized to be on the machine, but may not be authorized to have access to specific parts of the data.

For each user, the system must keep track of which parts of the file can be accessed and manipulated, and what forms of access are held (read the data, enter new data, and so on). The system also must control the granting of permissions. Can one user, who is authorized to see a file, give access permission to another user? There are many situations in which this is a legitimate and even necessary procedure, yet it complicates enormously the problems of access control. Researchers are developing ways to model these assignments mathematically and avoid unexpected loopholes in system access control.
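One simple model is an access matrix mapping each (user, file) pair to a set of rights, with delegation itself treated as a right. The sketch below, with invented names, shows both a permitted and a refused grant.

```python
# A minimal access matrix with a "grant" right for delegation.

rights = {
    ("alice", "payroll"): {"read", "write", "grant"},
    ("bob",   "payroll"): {"read"},
}

def allowed(user, file, action):
    return action in rights.get((user, file), set())

def grant(granter, user, file, action):
    # Delegation is permitted only if the granter holds both the right
    # being passed on and the grant right for that file. Analyzing
    # chains of such grants is where the hard modeling problems arise.
    if allowed(granter, file, action) and allowed(granter, file, "grant"):
        rights.setdefault((user, file), set()).add(action)
    else:
        raise PermissionError(f"{granter} may not grant {action} on {file}")

grant("alice", "carol", "payroll", "read")
assert allowed("carol", "payroll", "read")

try:
    grant("bob", "dave", "payroll", "read")
except PermissionError as err:
    print(err)    # bob may not grant read on payroll
```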

The continued growth in the size of information systems and in the numbers of people allowed to access them will continue to put pressure on system designers by complicating the authorization process.

Logging

Logging is the process of auditing the accesses made to a data base by all users. Systems for keeping a complete or partial record of all access to the data have received more attention since privacy has become an issue. The system operator needs to account for all accesses made to files of personal data. Since the log itself is a file that contains potentially sensitive personal information, the need to protect it may be even greater than that for the original data base. For this reason, some experts suggest the use of a separate small machine to monitor accesses to the data base.

The system operator can then examine the log for unusual patterns of file access or other significant actions of the users that may indicate that unauthorized use is being made of the system. When it is possible to code certain unusual patterns of use, the logging system itself can be programed to watch for those events. It will then either send a warning to the system operator, or call on a special security surveillance program that collects as much detailed information as possible about the transaction.
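A minimal sketch of such a coded pattern appears below: scanning the log for users whose after-hours activity exceeds a threshold. The log entries and the rule itself are invented.

```python
# Scan an access log and flag users who exceed an after-hours threshold.

access_log = [
    # (user, hour of day, record accessed)
    ("ellen", 10, "rec-1041"), ("ellen", 11, "rec-2200"),
    ("frank",  2, "rec-0007"), ("frank",  2, "rec-0008"),
    ("frank",  3, "rec-0009"), ("frank",  3, "rec-0010"),
]

AFTER_HOURS = range(0, 6)   # midnight to 6 a.m.
THRESHOLD = 3

counts = {}
for user, hour, record in access_log:
    if hour in AFTER_HOURS:
        counts[user] = counts.get(user, 0) + 1

for user, n in counts.items():
    if n >= THRESHOLD:
        # A real system would alert the operator here, or trigger a
        # surveillance program that records full transaction detail.
        print(f"WARNING: {user} made {n} after-hours accesses")
```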

Operating Systems

The operating system of a computer, the set of programs that controls its work, is the fundamental piece of software on which all other application programs depend. Consequently, the integrity of the operating system is a necessary prerequisite for any other software security. Although no system can be designed to be perfectly secure, much can be done to construct highly secure systems.

R&D is ongoing in this area, and results will be slowly incorporated into existing systems. However, progress will be hindered by the difficulty of adapting current operating system programs. These contain millions of instructions, and have been modified and expanded over several years by many programmers. The systems are difficult to change, as are the habits of their users.

Some computer installations still use operating systems written nearly 20 years ago. Computer operators fear the disruption and trauma that would result from adopting radically different operating systems, and manufacturers resist compromising the investments, amounting to billions of dollars, that they have made in existing programs. Thus, the most acceptable new techniques over the short term will be those that can be adapted to existing systems. However, it is the very size, complexity, and growth history of these current systems that create their greatest weaknesses: the logical holes and flaws through which a determined outsider can gain access.

Data Base Security

As data bases grow larger and are designed to serve multiple purposes, the likelihood increases that system operators will want to grant selective access. The problem is difficult. In an employee data base, for example, it may be desirable to allow the personnel department access to some records, the finance department to others, and the medical department to still others.

One of the major problems is that of authorization: determining which user can do what. Another, related issue is how to structure the data base so that such authorizations can be enforced. The question of which controls are even possible is, in itself, a complicated one. Research in data structures is developing new techniques which will probably come into use rapidly as new data base systems come on the market.

Encryption is one technique that will be used increasingly in cases where different groups of data and their users can be easily partitioned in terms of access control. It will be less useful when the data are highly integrated, and when it is not known during the design stage where boundaries will eventually be drawn.

Years ago, some hope was placed in the use of so-called "aggregated files," particularly for research with sensitive personal data. These files supposedly eliminate the problems of maintaining personally identifiable data by stripping off identifiers and lumping the records together in statistical clusters. It has been shown, however, that aggregating data statistically does not always assure that a clever inquirer cannot reverse the process and derive substantial personal information. In the same way, merely stripping identifiers off records of personal information may not preserve the anonymity of the data subjects, for a surprisingly small amount of descriptive information can serve to identify an individual uniquely. R&D is being conducted on ways to transform data bases collected for social research purposes so that the individual information is completely obscured, but statistically relevant calculations can still be done.
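A toy example of the re-identification problem: even with names removed, a few descriptive fields can single out a record. The records below are invented.

```python
# Count how many combinations of a few descriptive fields are unique.

from collections import Counter

records = [
    # (zip code, birth year, sex, occupation); names already removed
    ("20005", 1946, "F", "librarian"),
    ("20005", 1946, "F", "attorney"),
    ("20005", 1951, "M", "librarian"),
    ("20007", 1946, "F", "librarian"),
]

combos = Counter((z, y, s) for z, y, s, _ in records)
unique = [c for c, n in combos.items() if n == 1]
print(f"{len(unique)} of {len(combos)} zip/birth-year/sex combinations "
      f"match exactly one record")   # 2 of 3
```

Anyone who already knows a target's zip code, birth year, and sex can then read off the remaining fields of a uniquely matched record, which is exactly the reversal the text describes.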