
...statistical methods, in which Wasserman concentrates his efforts on training methods rather than on procedures for producing output from a previously trained network, as most of the original sources have done. Wasserman gives a detailed presentation of different training methods and the advantages and problems associated with each. This is one of the few places in the book in which general issues appear prominently and an attempt is made to compare the relative advantages and disadvantages of models on general criteria. Wasserman does, in fact, make frequent comparisons between the various models along more technical criteria. I thought perhaps that this chapter should have appeared before the backpropagation chapter and might have been expanded a bit to lay the groundwork for understanding the other chapters.

The Hopfield nets chapter and the bidirectional associative memories (BAM) chapter were not the book's strongest. While Wasserman was effective at avoiding mathematical jargon throughout the book, in these chapters neural network jargon was plentiful enough to have obscured meaning for the uninitiated "professional not specializing in mathematical analysis." He uses a variety of terms and ways of saying things that, while well known to the initiated, were not defined sufficiently to be understandable to the uninitiated reader without more context. One example of this is the discussion of the Hopfield networks, which is cast in terms of the network moving from vertex to vertex. While Wasserman defined this in a literal, informal mathematical way, he did not present any kind of an intuitive explanation of what it means for the network to be at a vertex, and how that relates to the problem being solved. This is the kind of discussion that can easily frustrate those who have not been in frequent contact with this way of explaining things. There are some analogous examples of this in the BAM chapter.
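For readers puzzled by the "vertex to vertex" language, the following is a minimal sketch, not drawn from Wasserman's text, of the standard binary Hopfield formulation: each state vector of +1/-1 units is a corner (vertex) of a hypercube, and each asynchronous update moves the network along one edge to a neighboring vertex while the energy does not increase. The pattern values, function names, and parameters below are illustrative assumptions only.

```python
import numpy as np

# A binary Hopfield state s in {-1, +1}^n is one vertex of the n-dimensional
# hypercube. Asynchronous updates change one component at a time, so the state
# "moves from vertex to vertex" along hypercube edges, and each accepted move
# keeps the energy E(s) = -0.5 * s^T W s from increasing.

def train_hopfield(patterns):
    """Hebbian outer-product weights for +/-1 patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy of state s; never increases under asynchronous updates."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=200, seed=0):
    """Asynchronously update randomly chosen units for a fixed number of steps."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))                # pick one unit
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0   # step to the neighboring vertex (or stay)
    return s

# Two stored patterns and a probe with one bit of the first pattern flipped.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]], dtype=float)
W = train_hopfield(patterns)
probe = np.array([1, -1, 1, -1, -1, -1], dtype=float)
settled = recall(W, probe)
print(energy(W, probe), energy(W, settled))     # energy drops as the state settles
print(settled)                                  # ideally the first stored pattern
```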

The chapters not specifically mentioned above are, in general, solid practical introductions to the models. In an overview, it is difficult to say exactly what approach Wasserman takes in covering the models. In some chapters he stresses training of the models, in other chapters he stresses retrieval, and in still others, he stresses an odd combination. At times, I had the vague impression that he tended to present the greatest amount of detail on aspects of the models that have been the most neglected in the original treatments, and to present the other more commonly stressed aspects of the models in a superficial way. There are obvious examples of this in the ART chapter, as well as in the backpropagation, statistical methods, and perceptron chapters. I found this "filling-in" aspect of the book refreshing. Further, I think it was for this reason that the book never seemed to get boring, even if I had read other presentations of these models. This "filling-in" is likely to be appreciated by those who have read other papers and books in the area of artificial neural networks, but might be frustrating to those who have not attempted other sources.

In summary, I believe Wasserman's book is a useful reference for acquiring the skills needed to translate theoretical presentations of various artificial networks into working models. It lies somewhere in between guides like workbooks that perhaps give too much practical help (i.e., pre-written programs may not always be the best way to acquire the skills needed to simulate artificial neural networks) and theoretical presentations in which everything is left as an exercise for the reader.

As Wasserman points out, taking a guide like this and implementing some of the models is perhaps the best way to acquire a "depth of understanding" of artificial neural networks.

Alice J. O'Toole
School of Human Development
University of Texas at Dallas
Richardson, TX 75083-0688

Neural Net Applications and Products
Richard K. Miller, Terri C. Walker, and Anne M. Ryan, Madison, GA: SEAI Technical Publications, 1990, $395, 347 pp. ISBN 0-89671-107-2.

In Neural Net Applications and Products, the authors draw together brief descriptions of several applications of neural networks and a listing of commercial neural network products and services. The book has an 8½ × 11 in. (21.5 × 28 cm) format and was apparently produced directly from double-spaced, letter-quality manuscript pages. This printing, which is an updated version of a report produced jointly by SEAI Technical Publications and Graeme Press in 1988, contains 17 chapters, two appendices, and a glossary. The $395 price tag places the book well outside the range most individuals are willing to pay and indicates that its intended audience is more the marketing organizations of large corporations than individual researchers.

Chapter 1 is a 12-page overview of neural network technology. Chapter 2 is an introduction to the 226-page applications section, which comprises chapters 3 through 15. Each chapter of this section consists of a brief introduction followed by descriptions of existing applications of neural networks in one subject area. Chapter 16 is a 50-page annotated listing of companies offering neural network products and services. Chapter 17 is mainly a report of the results of a survey of 34 neural network professionals conducted by Future Technology Surveys, Inc., of Madison, GA, but also contains a market prediction derived from a 1988 report from The Schwartz Associates, Mountain View, CA, and a section entitled "Hecht-Nielsen's Views." The first appendix lists the addresses of many academic, industrial, and governmental groups active in neural network technology. The second appendix consists of brief descriptions of applications of the Nestor Learning System, and the book ends with a glossary of terms useful to the neural network researcher.

This book contains much worthwhile information. The authors have performed a valuable service by collecting into one volume a substantial portion of the applications to which neural networks had been applied through about mid-1989. The glossary, the list of companies offering products and services, and the less complete list of addresses of research groups offered in Appendix A may also be quite helpful to some. Less useful are the overview of neural network technology contained in chapter 1 and the predictive and market-oriented material of chapter 17.

The 13 areas represented in the application section are sensor signal processing and data fusion; pattern recognition, image processing, and machine vision; a case study of neural networks in automated inspection; robotics and autonomous sensor-motor control; speech recognition and synthesis and natural language processing; knowledge processing; financial applications; database retrieval; handwriting and character recognition; medical diagnosis, health care, and biomedical uses; manufacturing and process control; defense applications; and a miscellaneous category entitled "other applications." The reports within each chapter vary in length from a few sentences to several pages. Collecting these examples undoubtedly required a gargantuan effort, and the authors are to be congratulated for their energy and dedication.

The case study of chapter 5 is an interesting illustration of the process of adapting neural technology to a difficult industrial task, that of automated product inspection. The material for the chapter apparently was supplied by Global Holonetics Corporation and describes development of an alternative network-based decision unit for their Lightware automated inspection system. The report provides an interesting glimpse into the commercial research and development process. It also unintentionally provides a small lesson in the ways of the marketplace. Sometime after this book went to press, Global Holonetics Corporation was bought out and their product withdrawn from the market.

Despite its usefulness, the application section of the book has two unfortunate characteristics. The first is the confusing internal organization of the chapters. Many sections describe a single application, but these are randomly intermixed with others that deal with descriptions of applications being pursued by a particular company or with a description of the way a particular commercial software package or brand of neurocomputer is being applied to problems in the relevant area. Similarly, some chapters contain sections entitled "Current Research," while many do not. The usefulness of the application chapters would have been considerably enhanced if the authors had adhered to a single organizational principle for every chapter.

A second irritation is that no direct literature citations are given for the applications. A small number of references are listed at the end of each chapter, but it is never stated whether these are suggestions for further reading, direct sources of the information contained in the chapter, or a representative sampling of articles on the subject. The book contains no introduction or foreword in which this and similar background information could have been furnished to the reader. Fortunately, the names and affiliations of the researchers associated with each application are usually given, so that the reader can, by consulting the address list at the end of the book, contact the researchers directly to request reprints or other information. Overall, though, the application section is useful and presents information heretofore available only by searching a number of journals and conference proceedings.

The second major section of the book, and one equally useful for American and Canadian researchers, is an annotated listing of 57 companies offering neural network products and services. The title of this chapter is "Assessment of Commercial Neural Net Products," which implies that only companies with tangible products are included and that some evaluation of these products is offered. In reality, however, no assessment of products is attempted, and several companies that provide only consultant services are included. The chapter simply presents a listing of addresses, telephone numbers, contact persons, and a brief description of the product or service supplied by each company. Once again, the information seems to be current through about mid-1989.

As might be expected, there are some inaccuracies caused by personnel relocations, company failures, price changes, and discontinued product lines which occurred after the book went to press. A comparison of this list with entries on Neurodisk 1, put out about the same time by Derek Stubbs' newsletter Neurocomputers, revealed only a few omissions that could not be attributed to the fact that Stubbs' list was current through late 1989. Most of the omissions of American and Canadian companies were consulting firms with no software or hardware products. Notably missing from the list, however, were Japanese and European Economic Community enterprises that offered products before mid-1989. Neuristique, a French company, and NEC Corporation were there, but Fujitsu's security robot was not mentioned, nor were any UK, German, Italian, or other Japanese companies. Anyone desiring to remain informed about products and services currently available worldwide would do better to subscribe to one or more of the several newsletters available in the United States and elsewhere, some of which are quite reasonably priced.

The authors have produced a useful cross section of current network applications, and they furnish us with a valuable, if geographically limited, list of companies offering neural network products and services. It is all the more disappointing, therefore, that the book suffers from two distortions. First, the material chosen for special presentation seems to have been selected more because it was easily available than because it was representative. Thus, applications of HNC's ANZA are presented in some detail, while Science Applications International Corporation's Sigma neurocomputer, used by about an equal number of companies and institutions, is hardly mentioned. Similarly, applications of the Nestor products, besides being frequently, and justifiably, mentioned throughout the text, are described in some detail in a 31-page appendix. This effectively incorporates a catalog of Nestor products and services in the book while excluding such material from other companies. This lack of balance in illustrative material might mislead the casual reader, but a person seriously seeking a neurocomputer or network consulting services would quickly find other sources listed in chapter 16. However, the second imbalance might not be so easily perceived.

Throughout the book, but most noticeably in chapter 1, the authors repeatedly use material obtained from the same small group of researchers. These researchers are respected and productive, but unfortunately all share the same view of the elephant. The book would be much more valuable if the authors had been more eclectic in the viewpoints they presented. Nowhere is this bias more evident than in their repeated use of a particular taxonomy of neural network paradigms originally constructed, I believe, by Patrick Simpson. The list suffers from two main problems which reflect themselves into the fabric of the book. First, it does not give equal emphasis to equally distinct paradigms. This is best illustrated with its treatment of the bidirectional associative memory (BAM), which Simpson has studied extensively. It inflates the apparent importance of this moderately useful family of networks by according each variation of the basic design a separate entry. If this were done for other designs, the functionally distinct variations of backpropagation alone would lengthen the list considerably, and designs like the adaline and madaline would receive separate entries.

The second failing of the taxonomy is that it omits several important network designs that were well known when the list was assembled. The neocognitron is not mentioned, nor, for instance, are the functional link network of Pao, von der Malsburg's dynamic link architecture, or Hecht-Nielsen's spatiotemporal network. Surprisingly, no higher-order networks are listed, including Specht's padaline (polynomial adaline) and probabilistic networks. In addition, Carpenter and Grossberg's ART 2 and ART 3, both missing from the list, were well known long before the apparent cutoff date of the material in this book.

In summary, while the book contains several flaws, not the least of which is its price, it is probably the best available source of information on actual neural network applications and should also be useful to those interested in contacting companies offering neural network products and services.

Charles Butler
Physical Sciences Inc.
635 Slaters Lane, Suite G101
Alexandria, VA 22314