
  • Gerald Musgrave Editor

Computer-Aided Design of digital electronic circuits and systems

    North-Holland for the Commission of the European Communities

  • COMPUTER-AIDED DESIGN of digital electronic circuits and systems

  • organized by

    The Commission of the European Communities

    Directorate-General for Internal Market and Industrial Affairs

NORTH-HOLLAND PUBLISHING COMPANY - AMSTERDAM NEW YORK OXFORD

  • COMPUTER-AIDED DESIGN of digital electronic circuits and systems

Proceedings of a Symposium, Brussels, November 1978

edited by Gerald MUSGRAVE
Brunel University, Uxbridge, Middlesex, U.K.

    1979

NORTH-HOLLAND PUBLISHING COMPANY - AMSTERDAM NEW YORK OXFORD

  • © ECSC, EEC, EAEC, Brussels and Luxembourg, 1979

    All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner.

    ISBN: 0444 85374

    Published by NORTH-HOLLAND PUBLISHING COMPANY AMSTERDAM NEW YORK OXFORD

    Sole distributors for the U.S.A. and Canada ELSEVIER NORTH-HOLLAND INC. 52 VANDERBILT AVENUE NEW YORK, N.Y. 10017

for The Commission of the European Communities, Directorate-General for Scientific and Technical Information and Information Management, Luxembourg

    EUR 6379

    LEGAL NOTICE Neither the Commission of the European Communities nor any person acting on behalf of the Commission is responsible for the use which might be made of the following information.

    PRINTED IN THE NETHERLANDS

  • FOREWORD

    "What we have to learn to do we learn by doing" Aristotle

With the rapid change in technology producing ever increasing complexity in digital systems, it is essential to utilise the products of that technology in order to cope with the evolution. Computer aided design of digital electronic circuits and systems is essential to the ongoing development of any electronics and associated data processing industry. The European Communities recognised its importance in July 1974 when they initiated a programme of studies in the D.P. field. One study, the CAD Electronics Study, commenced in June 1977 as a feasibility project with the following objectives:

a. Assessment of the current state-of-the-art of CAD of logic design, its cost benefits, user requirements, problem areas and the impact of technology evolution.

b. Time projection of designers' opportunities and requirements within an extrapolated electronics and computer evolution in the 1979-82 period.

c. Investigation of the opportunity in terms of strategic scientific, industrial and economic benefit.

d. Recommendations for further Community work, if appropriate, with detailed justification.

To match these objectives a two phase project structure was used. First a world-wide survey of CAD techniques applied to digital electronics was undertaken, calling for information from users, non-users and suppliers, and encompassing the product ranges of computers, communications, military systems etc. The second phase was an analysis of this data with respect to the implications for CAD development in Europe and the technology impact over the next quinquennium. One of the important conclusions of this work was the appalling ignorance of CAD techniques even by those who were purporting to be using the same. Hence the organisation of a three-day symposium in November 1978 where a state-of-the-art set of lectures was given, followed by important papers from leading authorities on the problem areas. In these presentations a balance was retained between software suppliers' and users' views. There was also the opportunity to present the structure, results and general conclusions of the EEC CAD Electronics Study and to give delegates the opportunity to discuss the subject matter.

This book is a record of the lectures, papers and discussions at the symposium and covers the subject of CAD of electronic circuits and systems, from the conceptual specification through synthesis, simulation, testing and implementation, from printed circuit boards (PCBs) to very large scale integrated (VLSI) chips. The management aspects of future trends and economic viabilities are covered, which


affords the reader a wide spectrum of information. This volume is clearly the work of many willing and cooperative authors, whom I wish to acknowledge. It has only been made possible by the foresight of the European Commission and the dedication and forbearance of Mr. Bir and his staff of the Joint D.P. Project Bureau of the Commission.

    GERALD MUSGRAVE BRUNEL UNIVERSITY

  • CONTENTS

    INTRODUCTORY SESSION

OPENING ADDRESS 3
E. Davignon

KEYNOTE ADDRESS 7
K. Teer

TECHNICAL SESSION I

INTEGRATED COMPUTER-AIDED DESIGN - AN INDUSTRIAL VIEW 13
R.W. McGuffin

PRODUCT SPECIFICATION AND SYNTHESIS 25
D. Lewin

SIMULATION OF DIGITAL SYSTEMS: WHERE WE ARE AND WHERE WE MAY BE 41
S.A. Szygenda

TECHNICAL SESSION II

NEW CONCEPTS IN AUTOMATED TESTING OF DIGITAL CIRCUITS 57
M.A. Breuer

LSI DEVICE CAD VERSUS PCB DIGITAL SYSTEM CAD: ARE REQUIREMENTS CONVERGING? 81
H. De Man

CURRENT TRENDS IN THE DESIGN OF DIGITAL CIRCUITS 91
H.M. Lipp

CAD IN THE JAPANESE ELECTRONICS INDUSTRY 103
K. Kani, A. Yamada, M. Teramoto

TECHNICAL SESSION III

ASPECTS OF A LARGE, INTEGRATED CAD SYSTEM 123
F. Hembrough, R. Pabich

LARGE SCALE CAD USER EXPERIENCE 133
F. Klaschka

COMPUTER AIDED DESIGN OF DIGITAL COMPUTER SYSTEMS 139
L.C. Abel

TECHNICAL SESSION IV

VERIFICATION OF LSI DIGITAL CIRCUIT DESIGN 149
J.-C. Rault, J.-P. Avenier, J. Michard, J. Mutel

COMPUTER AIDED DESIGN: THE PROBLEM OF THE 80'S MICROPROCESSOR 169
B. Lattin

USER EXPERIENCE IN SIMULATION AND TESTING 173
C. Gaskin

DEVELOPMENT OF A DIGITAL DESIGN AND TESTING SYSTEM 183
P.E. Roberts, K.T. Wolski

AN APPROACH TO A TEST GENERATION SYSTEM FOR LSI 187
H.E. Jones, R.F. Schauer

  • TECHNICAL SESSION V

AN ENGINEERING COMPONENTS DATA BASE 207
M. Tomljanovich, R. Colangelo

CUSTOM LSI DESIGN ECONOMICS 217
J.G.M. Klomp

AUTOMATIC GATE ALLOCATION PLACEMENT AND ROUTING 229
S.C. Hoffman

INTEGRATED CAD FOR LSI 237
K. Loosemore

E.E.C. PROJECT SESSION

EUROPEAN COMMUNITIES STUDY ON CAD OF DIGITAL CIRCUITS AND SYSTEMS 245
Introduction: A. De Mari 247
Organisational Aspects: W. Quillin 255
Technical Perspective: G. Musgrave 257
Survey in USA and Canada: A. Carter 303

TECHNICAL FORUM

TECHNICAL FORUM I 313
Chairman: Jakob Vlietstra

TECHNICAL FORUM II 317
Chairman: Jakob Vlietstra

FINAL SESSION

EUROPEAN ECONOMIC COMMUNITY PERSPECTIVE 321
Chairman: S. Bir

    INDEX OF AUTHORS 325

  • INTRODUCTORY SESSION

    Chairman: C. GARRIC, European Communities

  • G. Musgrave, editor, COMPUTER-AIDED DESIGN of digital electronic circuits and systems, North-Holland Publishing Company © ECSC, EEC, EAEC, Brussels & Luxembourg, 1979

OPENING ADDRESS

E. DAVIGNON
EUROPEAN COMMUNITIES

It is a pleasure for me to welcome this gathering, which includes many of the world's most distinguished specialists in that key tool of advanced technology, computer-aided design. We look forward to hearing contributions from leaders in the field not only from Europe, but from the United States, Japan, and even the Soviet Union. Your numbers and quality augur well for the conference. I would like to start it off by placing it in the political and economic context of the Commission's objectives for industrial policy.

Not only Europe, but the developed world as a whole is in the throes of fundamental industrial change, due not only to the ending of a long period of sustained economic growth, but to deep shifts in its industrial structure. As the developing nations of the world acquire competence and capability in many of the older industries, from textiles and shipbuilding to steel and cars, a process which we must welcome as offering them the chance to live and even thrive, the traditional industrial regions such as Europe must look increasingly to the newer technologies and industries as the main source of future economic growth, employment and social development. Europe has to become a high technology workshop for the world.

Of these new technologies, by far the most important is the complex of electronic industries associated with the processing and communicating of information. The parts of this complex still often go by separate names - the computer industry, telecommunications, electronic components. But I do not need to tell this assembly that they increasingly are one, as the French have recognised in their new word "télématique". This is both the nervous system and the key base technology for a modern industrial or indeed post-industrial society. The facts speak for themselves: in mid-recession the market for computing in Europe is still growing at something approaching 20% per year in fixed money terms, while each year the value for that money in terms of computing power is multiplied several times. Despite the many jobs it displaces, we already expect the number of people employed in the direct use or manufacture of computing power in Europe to double, from some 1 million in 1975 to 2 million by the mid 1980s. Badly handled, the information revolution can indeed lead to a new crisis of unemployment. Imaginatively handled, it can lead to a vast new range of employment opportunities. It is not too much to say that the competitiveness of the large majority of European industry and services will depend on the speed and competence with which it applies the new electronic technology to its products and processes and to the services it offers during the next ten years.

For these reasons, the Community has recognised that both the application of data-processing throughout the economy and the industry itself deserve vigorous public support at Community level, both to help create a receptive homogeneous market and to match the immense public resources which are put behind the industry in other advanced regions of the world, such as the US and Japan. The first political recognition of this need was the Resolution of the Council of Ministers of the Community of July 1976, which called for a Community policy for data-processing. That Resolution stressed in particular the need to promote collaboration in data-processing applications, the need for users to be brought together so that the power of computing could be more effectively applied.
Since then, the Council has adopted a number of priority studies, exploring the needs


and feasibility of action in certain specific fields of user applications. Among these were two on Computer Aided Design: a CAD study in the building and construction field and the other in Digital Circuit Design, the subject of discussion today. The Council is now approaching a more critical political test, a decision on a four-year programme for informatics which would provide more systematic and greater support for a wider spectrum of user applications. CAD will be one important element in this programme. It will be the framework in which practical proposals emerging from the work of this conference can be implemented.

Why does the Commission attach importance to CAD as a tool of economic development? Major sectors of industry, particularly those using advanced technology, such as aerospace, electronics and the automotive industry, are already forging ahead and making massive investments on their own account in this field. These investments can acquire larger importance for the Community when the technology is purposefully transferred to other sectors of industry. For example, the large investments made by the aerospace industry to develop three-dimensional systems enable the techniques, when proven, to be carried over and used in the shoe manufacturing, plastics, glass, mould and die industries. It is now recognised moreover that, in the future, CAD will become an integrated part of the production process, combining the design processes with automated manufacture (as is already the case with an aircraft wing). Modern CAD is therefore a tool which European industry has to have. And when it has it, it will have to take account of the massive social implications of its introduction. These wider aspects of CAD are being studied systematically by the Commission in preparation for the four-year programme.

CAD is clearly basic to electronics, our topic of discussion today. The designer must wrestle with the challenge of ever-growing complexity, which only computer aids and tools can enable him to master. These tools need to be available to a vast range of medium-sized and small firms, in addition to the great. The study sponsored by the Community, which you will be discussing over the next few days, is designed to identify the state-of-the-art computer aids potentially available and to suggest what the Community might do to improve them and make them more accessible. In this, as in so many other fields of computing, ready access to data, effective standards and education in the use of new techniques will be essential, as a complement to innovating technology. We look forward to receiving your advice and hearing your views on what needs to be done.

Ladies and Gentlemen, in every industrial period certain industries play a key part in the development of society. Today, the key industry is the complex of industries covering the processing and communication of information and using electronic technology. A strong capability in these related industries is essential to Europe's future because:

the character of our society will depend on our skill in using these technologies;

most industries and many services will become dependent on these technologies;

and the remarkable growth rate of the market for these industries will continue to represent an increasing element of European and world production and wealth.

This vast complex of technologies, with its unprecedented challenge to human skill and endeavour, requires resources and investments which no single European nation


could justifiably or possibly undertake on its own. It is my belief that the European Community should and can make a greater contribution to the development of this world technology than it has done so far, so that the potential economic and social benefits can be harnessed to benefit mankind. In this effort, the leading role will always fall to industry, to those who develop and apply the new techniques. Moreover, national Governments can and will continue to play an essential supporting role. They have, in particular, an immense educational responsibility in this new age, for we cannot accept the paradox of a Europe with many millions of unemployed, whose economic development is held up by an acute shortage of critical software and engineering skills in the most advanced fields, and whose citizens have only the barest understanding of the potential implications for them of the new technology.

There are, however, at least three vital tasks which only the Community can fulfil and which are wider than the modest programmes for data-processing which I described earlier. One is to ensure that the powerful broadband communications infrastructure needed in the electronic age is developed on a European scale. The second is to support the development of the key electronic technologies of the future which will permit Europe to become more than the follower which it has been in the past. And the third is to develop the activities in the fields of standardisation and procurement which alone can generate a true European market. In social terms, moreover, Europe has a vocation to ensure that in a European information society these formidable tools are in the hands of the citizen, in his workplace, school or home, and not solely in the hands of centralised power, whether management, Government or anyone else. I hope that spirit will inform your discussions too.

The Community must also be open to mutually beneficial co-operation with organisations outside; when there is so much work to be done, we cannot afford to reinvent the wheel. It is for this reason that the Commission chose to share with you the results of this study. I hope you will both contribute to and benefit from your participation in this Symposium, to which I wish all success.

  • G. Musgrave, editor, COMPUTER-AIDED DESIGN of digital electronic circuits and systems, North-Holland Publishing Company © ECSC, EEC, EAEC, Brussels & Luxembourg, 1979

    KEYNOTE ADDRESS

    K. Teer Philips Research Laboratories

    Eindhoven, The Netherlands

    The following are the basic notes which Dr. Teer used to outline his main points.

1. The electronics industry is a relatively young and dynamic industry with potentially large growth figures, due to a very wide area of application and a high rate of innovation. Growth figures over the last decade materialized as substantially higher than those of industry as a whole or the Gross National Product. Notwithstanding this, the electronics industry is also subject to the industrial saturation phenomena of recent years. Electronic components in particular present an investment and mass-production picture that could easily lead to overproduction.

2. There is a standard partitioning of the electronic field into telecommunication (telephony, telegraphy), radio (radio, television, radar, navigation), data processing (computers, data transmission) and instrumentation (measurement, control, registration); the regular market view follows similar lines. One of the striking trends, however, is that these divisions tend to merge in many ways. In particular, data processing penetrates almost every field. For the future it might be much more relevant to order the classification in terms of social categories: traffic systems, education systems, health care systems, distribution systems, office systems, production systems, home systems etc.

3. Few will dispute that the "push" in the electronics field in the past and for the future is dominated by:

semiconductor technology;
binary processing;
satellite technology.

The first two are especially related to the issue of this symposium. It is easy to present amazing figures about the progress in semiconductor technology (in terms of bits and gates per square mm or per chip) and about the penetration of binary processing (in terms of traditional computer use as well as new applications). These facts are widely known and are assumed to be common knowledge at this symposium.

4. It is relevant to notice that, apart from the pure electronic technology, optical technology is now emerging with special power in the transmission and recording of information. This certainly should not be seen as competing with, but as complementary to, the pure electronic hardware. It is beyond any doubt that "micro-optics" will give an enormous extra momentum to the electronic field.

5. With the tools of microelectronics and micro-optics available there is a remarkable situation growing where central issues are on the move. Very schematically we can say that the question is no longer 'how to make it' but 'how to use


it', and the question is no longer 'how to reduce production cost', but 'how to reduce design cost'.

6. How to use it? Most present day stories about microprocessors in newspapers and magazines start with a hard fact, namely the number of transistors per square mm, but then jump into vagueness and threat. The reader is left unsure about what the message actually is, but with an uneasy feeling that things might go extremely wrong, in particular concerning privacy, employment and human dignity. A cool, careful analysis is seldom available, which has much to do with our inability to foresee the use of modern electronics. A first step here is to order things in various levels so that a distinction is made between:

better function of existing products;
new products;
new functions;
new organisation;
new social categories.

It is true that our government bureaux, our offices, our banks, our education, our health care, our homes will all change - but how? To know better we should transfer an experimental attitude, well trained in achieving new technologies, to the domain of using new technology.

7. Notwithstanding the uncertainties about applications, a few new subsystems can already be identified now without too much science fiction:

audio and visual facilities in the home for information acquisition, giving do-it-yourself education and active entertainment;

the electronic file with powerful data and document retrieval, as a comfort for almost all environments in a broad range of sizes;

the intelligent manipulator, which can be instructed by craftsmen on the working floor for flexible automation;

the electronic inspector and recognizer (of pictures, sounds and other inputs) to improve failure diagnosis of objects and human beings;

the intelligent controller, optimizing the function of non-electronic equipment towards minimum energy, minimum pollution, maximum efficiency or maximum security;

the speech addressable equipment, leading to hands-free use, a lower user threshold and often faster reaction of the equipment;

the picture generator, as a tool to explain (in instruction), to analyse (during design), to amuse (in entertainment) and to express oneself (in free time creativity).

8. How to reduce cost in design? In fact the question is somewhat broader, namely how to simplify the design process so that speed, cost and clarity all benefit. This is the focal point of this symposium and will be discussed by a number of speakers much more able than the author of this contribution. Indeed it is of the utmost importance that the physical parameters of the devices, the equivalent network, the logic concept, the layout and the photomasks can be achieved with the aid of automatic means. However, this is not sufficient:

the computerized process should be standardized;
the standard should be easily accessible;


the action should extend to higher levels than modules: the multichip domain, the level where computers are a basic building block.

9. As we gather at this symposium as a European community, it is good to realize what the position of European industry is in its socio-economic context. European industry is confronted with:

increased competition;
fragmentation of the market into national states;
growing intervention of social forces in industrial activities;
saturation phenomena in some product ranges;
indistinct and unbalanced relations between automation, productivity, employment, the need for work, dislike of work, the need for leisure, the need for education and the demand for education.

To attack these difficulties it is necessary to respond with an enthusiastic and original approach. In that approach new forms of cooperation are a necessity: cooperation of industries in the same line of business, cooperation of complementary industries, cooperation of industry and government, cooperation of governments. Regrettably this may sound rhetorical, pathetic and illusion-like. But the validity of this observation cannot be denied.

    Next to these 'musts' for the European industry as a whole there is an additional point for the electronic industry. That is the challenge to cooperate with categories of users in much closer coupling than the customer-supplier relation in order to explore the possible answers to the question: "how to use?".

  • TECHNICAL SESSION I

Chairman: J. BOREL, Centre Nucléaire, France

  • G. Musgrave, editor, COMPUTER-AIDED DESIGN of digital electronic circuits and systems, North-Holland Publishing Company © ECSC, EEC, EAEC, Brussels & Luxembourg, 1979

    INTEGRATED COMPUTER-AIDED DESIGN - AN INDUSTRIAL VIEW

    R W McGUFFIN INTERNATIONAL COMPUTERS LIMITED

    MANCHESTER ENGLAND

The underlying reasons for the growth of CAD systems are examined. The various aspects of an integrated total technology CAD system are presented and discussed. Also, an orthogonal view is taken of CAD system design and, from that, some pointers to the future are examined.



1. INTRODUCTION

A general industrial overview presupposes that there is a common understanding in industry of what Computer Aided Design is, why it is used and what its benefits are. Nothing could be further from the truth. It is fair to say that the industry which has done most to understand the nature of, develop and exploit CAD is the computer industry. Superficially, it could be argued that this happened because there was in it a surfeit of cheap computer power and people who could understand (program) them. In reality, these factors, combined with an emerging, fiercely competitive industry, trying desperately to reduce exceedingly long timescales and high costs, caused it to become a leader in development and exploitation. This view is confirmed by the nature of the product, which ranges from integrated circuit chips through printed circuit boards and operating system software to mechanical frames and piece parts. It is not claimed that one CAD system will handle this diversity of design disciplines; however, a vast amount of experience has been gained on where CAD may be used cost-effectively.

1.1 Scope and Definition of Terms

The phrase 'computer-aided design' has, from a purist point of view, become devalued since it now embraces activities which relate to the design process but are not necessarily in the design loop. Engineering design is a process of decision making in order to produce information to enable correct manufacture. Thus, CAD embraces the use of computer systems to improve decision taking, communication and information flow. For the purposes of this paper, it is worthwhile subdividing CAD into two categories. These are not definitions, but merely a convenience to aid the understanding of an integrated approach to the subject.

    CAD - This is defined as the interaction of a designer with a computer in order to aid design decision making. The interaction need not be real time, in fact there are many good reasons why the interaction should be via batch job turn-round. The essence is that the computer is processing information supplied by the designer and yielding results that enable the best design decision to be made. Examples of this are as follows: (a) Simulation - here, a designer is performing experiments on a model of the system he is designing. The simulated results of these experiments will cause the designer to modify the parameters under his control and hence obtain an adequate design compromise. (b) Component Placement - a designer may wish to minimise the total length of copper track on a printed circuit board. The position or placement of the components is the prime parameter but this in turn is modified by track density profiles, technological rules etc. However, the designer can interact with trial placements in order to optimise around these parameters.

    DA - Design Automation - this is defined as automatic design translation or pushbutton design. The design algorithm is embedded in a program rather than in the mind of the designer. However, this is only achieved by sets of rules, codes of practice and compromises which make the design translation amenable to automation. The consequence of this is a restriction on design freedom, however, it does balance ease of design with ease of production. Given a product structured for automation, then DA acts as an amplifier.
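The component placement example above (minimising total copper track length by iterating over trial placements) can be made concrete with a toy sketch. The Python fragment below is purely illustrative: the four-component netlist, the board grid and the Manhattan wire-length cost are invented for this example, and the greedy pairwise-swap improver stands in for the far richer interaction between designer and program described in the text.

```python
# Illustrative sketch only: a naive placement improver for the kind of
# "minimise total track length" experiment described above. The netlist,
# grid positions and cost model are invented for this example.
from itertools import combinations

# Hypothetical netlist: each net connects two components.
NETS = [("U1", "U2"), ("U2", "U3"), ("U1", "U4"), ("U3", "U4"), ("U2", "U4")]

# Initial placement: component name -> (row, column) on a board grid.
placement = {"U1": (0, 0), "U2": (0, 3), "U3": (2, 0), "U4": (2, 3)}

def wire_length(plc):
    """Total Manhattan length of all nets - a crude stand-in for copper track."""
    return sum(abs(plc[a][0] - plc[b][0]) + abs(plc[a][1] - plc[b][1])
               for a, b in NETS)

def improve(plc):
    """Greedily swap pairs of component positions while any swap helps."""
    improved = True
    while improved:
        improved = False
        for a, b in combinations(list(plc), 2):
            trial = dict(plc)
            trial[a], trial[b] = trial[b], trial[a]
            if wire_length(trial) < wire_length(plc):
                plc, improved = trial, True
    return plc

if __name__ == "__main__":
    print("initial length:", wire_length(placement))
    best = improve(placement)
    print("improved length:", wire_length(best), best)
```

In practice the cost function would also have to respect track density profiles and technological rules, as noted above, and the designer would interact with successive trial placements rather than accept the first local optimum.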


Instinctively, the world has linked CAD with hardware; indeed, this is the principal area of application. However software, especially computer operating systems, networks etc., are in many ways more deserving candidates for the attention of CAD. How many hardware projects consume 200 to 600 man years? When considering design management, in this paper, this problem will be examined more thoroughly.

2. HISTORICAL DEVELOPMENT OF CAD

The milestones in the development of CAD have been well documented elsewhere and there is little point in reviewing them here; but it is instructive to examine the underlying reasons why there has been an acceleration in the growth of CAD systems. CAD hardware (computer power, graphics etc.) and software have grown with the size of the problem to be solved. Although some leap-frogging has taken place, in general both have kept pace. In the late 1950s and early 60s, in the computer world, machines were relatively simple to understand (and hence, design!); consequently, the degree of assistance required in their design and manufacture was minimal. By the mid 60s to early 70s, with the advent of TTL small scale integrated circuits and the multi-layer printed circuit board, the complexity of the product increased by an order of magnitude. Hitherto, CAD had been the sole preserve of small isolated teams of programmers and dedicated hardware. This period saw the growth of CAD/DA 'systems'. Here, attempts were made to rationalise the product design requirements with the CAD tools available. Further, as the complexity increased, the production problems mushroomed and there was an increasing demand to provide numerically controlled machines with output from the design data files.

Up to this point, CAD was developed 'in house' by large manufacturers with large problems. Smaller concerns had started to see the advantages of CAD but did not have the necessary expertise/computer power to develop their own systems. Also, computer graphics under the guise of CAD systems had been sold by over-energetic salesmen and were proving to be 'white elephants' - fun to play with but not of much relevance to design problems. It was in this atmosphere that the 'turnkey' CAD system started to evolve. Hardware and software engineering experts started to combine to form small companies. They tackled a limited range of problems (predominantly integrated circuit design), tailored the hardware and software to the problem, and made a lot of money. This, in many ways, was the salvation of the small company since, for a modest outlay, it could enjoy the advantages of CAD without the birth pains.

However, to quote from Ecclesiastes, 'He that increaseth knowledge, increaseth sorrow'. Manufacturers which are in the 'system' business must seek total systems solutions and, with VLSI accelerating upon us, erstwhile semiconductor manufacturers are rapidly becoming acquainted with the problems they bring. System problems are many-faceted. One of crucial importance is design integrity. It is not sufficient to be able to store/manipulate/delineate integrated circuit patterns onto a mask. Design integrity demands that the system concept be faithfully translated, perhaps through many levels of design decision, into the designed product. It is in this world that the integrated CAD system finds its living.


3. THE INTEGRATED CAD SYSTEM

In this section, I am going to draw upon the ICL experience with its CAD systems. I believe it can be justly described as an 'integrated system' although, as with most industrial 'in house' products, it has evolved during its life. This evolution has, for the most part, been controlled but, as will become clear, rolling evolution is a direct result of an unclear view of the future at any point in time. The current ICL Design Automation system is called DA4, ie: it is the fourth generation of a CAD system. As the title implies, the primary concern is with design automation (translation) since, when it was conceived (1974), it was considered that this provided the most cost-effective solution to ICL's design problems. The overall structure is shown in figure 1. Since the service was provided on the ICL 1900 range of computers under George 3, the operating system was used to control the housekeeping of the file store.

The primary design task in ICL is to design logic which will be physically realised to make computers to make money, and DA4 was tailored to this task. The overall CAD task is to balance ease of design with ease of production. Standardisation leads to dramatic savings in both design and production problems. Because of the company structure and the willingness of computer designers, technologists, test-gear designers etc. to co-operate with the DA team, the DA4 system (database and tools) became the unifying influence in design and production. As may be seen from figure 1, DA4 provides a 'total technology' outlook:

* High level system design language - here the computer is considered at the

    architectural level in terms of structure and behaviour. Simulation may be performed to confirm that the machine will obey, for example, the basic order code. Further, the design may be expanded, in an orderly top down fashion, and pattern comparison performed to ensure safe design decisions.

    * Compressed logic data capture and automatic expansion to the level of detail required for implementation. When the system level description has reached a level low enough to be translated into detailed logic diagrams, engineers sketch the designs on gridded paper. These rough diagrams are coded by technicians and entered into the design database. This task is tedious and error-prone. Techniques such as multistrings (highways, buses etc) and macrosymbols reduce the drawing and data entry problems and hence save time, reduce errors and show better logical flow.

    * Microprogram assembly - there is a continuing debate on whether micro-programs are true software or hardware conveniences. From the DA4 view point, where the output from the microprogram assembler is often being burnt into Proms and the flow diagrams going to the field engineer etc., they constitute a vital part of the total technology and as such must be supported. Further, they provide a useful source of test patterns for simulation.

    * Logic simulation - an interactive tool used by many computer design projects. As distinct from high level simulation, this is concerned with complex logic elements, nominal and worst case delays, timing race and hazard analysis etc. The model library contains around 600 descriptions of the elements currently used in ICL computers. The tool itself has been optimised for interactive usage.

* Logic data file - the logic content of the computer is stored as 'pages' of logic. Conceptually, the whole computer can be thought of as an enormously large logic diagram. This diagram is cut into manageable portions (say 1000 gates), each called a page.


[Fig 1. ICL's INTEGRATED CAD SYSTEM - block diagram linking high level system design, detailed logic capture, the logic design database, microprogram assemblers and PROMs, libraries and tables, drawings and documentation, simulation, group checks, the assembly file, the mapping of logical into physical, placement & tracking (PCBs, back planes, chips, cables), artwork, and production control and tests.]


The page is also a convenient drawing unit. ICL, like many major computer manufacturers, is diagram-based, ie: it is the basic medium for communication. The logic page contains all the information necessary for the design and field engineer alike, for example logic cross-references and physical placement. Two basic forms of output are used: line printer - fast and inexpensive; dot matrix printer - high reproducible quality.

* Assembly Extract - as previously described, the data file contains 'pages of logic'. This basically logical description of the machine is used to construct the building blocks - integrated circuit chips, printed circuit boards, multi-layer back planes (platters) and cables. The process of mapping logical into physical is called 'Assembly Extract' and is performed with the aid of an engineer-generated 'flyfile' which describes which pages, or parts of pages, are to be mapped into a physical assembly. This process creates an assembly file upon which a variety of different tools act.

* Production Output - the variety of output is large and the data expansion up to three orders of magnitude:
* Automatic tracking of printed circuit boards. Typical boards contain up to 150 dual in-line integrated circuits.
* Automatic tracking of integrated circuits, typically 300 - 400 ECL gates on an uncommitted logic array.
* Automatic technology-rule-obeying placement of components.
* Fully validated manual placement.
* Photographic artwork.
* Drill tapes for a wide variety of machines on many sites.
* Production control documentation.
* Assembly drawings, silk screens for component insertion.
* Control documentation for manual modification of boards.
* Version control.

* Testing - this is an entire subject in its own right but, in brief, from the assembly file the following may be performed (a small fault-simulation sketch follows this list):
* Automatic functional test pattern generation. To save computer power (money) some rules are applied to logic design. The benefit of these rules is that many thousands of board and IC chip types may have test patterns generated automatically. Typically, boards containing 30 1K RAMs, 1500 logic gates and 10 256-bit Proms will require 12 million bits of test data.
* Verification of manually produced test tapes. Using fault simulation, the quality of the tapes may be assessed and the diagnostic resolution determined.
* Base Board Test - the tracks on printed circuit boards, before component insertion, may be checked for unwanted open and short circuits by means of computer controlled probes. The information to


control this equipment is generated from the assembly file.
* Probe Test - functional tests may be applied to each component, or group of components, on a board by probes (reading and writing) at preselected points. This is achieved by selectively powering up the components to be tested. The allocation of probe points and the generation of tests are performed automatically.
* Group Checks - as previously described, the logic pages which describe the computer are partitioned into chips and boards. However, it makes sense to perform checks, physical and logical, on the group of boards which comprises a logical group. These checks include physical loading rule and timing path checks. The latter, in many respects, is much more economic than simulation.
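To make the fault-simulation step just described more concrete, here is a deliberately tiny sketch that grades a hypothetical test-pattern set against single stuck-at faults on a two-gate circuit. The netlist, net names and patterns are invented for illustration only; a production fault simulator works on thousands of gates and also models timing.

```python
# Illustrative sketch only: grading a test-pattern set by single stuck-at
# fault simulation. The two-gate circuit and the patterns are invented.

# Tiny combinational circuit: d = a AND b, y = d OR c.
GATES = [("d", "and", ("a", "b")), ("y", "or", ("d", "c"))]
INPUTS = ("a", "b", "c")
OUTPUT = "y"

def simulate(pattern, fault=None):
    """Evaluate the circuit; `fault` is (net, stuck_value) or None."""
    nets = dict(pattern)
    def value(net):
        return fault[1] if fault and net == fault[0] else nets[net]
    for out, op, ins in GATES:
        vals = [value(n) for n in ins]
        nets[out] = int(all(vals)) if op == "and" else int(any(vals))
    return value(OUTPUT)

def fault_coverage(patterns):
    """Fraction of single stuck-at faults detected by the given patterns."""
    faults = [(net, v) for net in INPUTS + ("d", "y") for v in (0, 1)]
    detected = set()
    for pat in patterns:
        good = simulate(pat)
        for f in faults:
            if simulate(pat, fault=f) != good:
                detected.add(f)
    return len(detected) / len(faults)

if __name__ == "__main__":
    test_tape = [dict(zip(INPUTS, bits))
                 for bits in [(1, 1, 0), (0, 1, 0), (0, 0, 1)]]
    print(f"single stuck-at coverage: {fault_coverage(test_tape):.0%}")
```

For this invented test tape the coverage comes out below 100% (the fault "b stuck at 1" is never exercised), which is exactly the kind of shortfall that grading a manually produced tape is meant to reveal.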

This is only a cross-section of the ICL design automation scheme but it does give a flavour of the 'total technology' outlook.

4. SYSTEM CAD

In this section, I will try to take a more orthogonal view of CAD systems - capitalisation, company organisation etc. - and from it try to indicate some of the shortcomings of today's approaches and where the future lies.

4.1 Design

Not enough is known about the nature of design. Alternatively, if we fully understood the design decision making process, then we would be able to provide methodologies and tools that would provide a safe (error free) design evolution. Given that this is not the case, the task of the CAD system designer is to provide a framework in which design may take place in a controlled manner, and tools to assist this process. This framework will have many attributes but the most important is probably the man-machine interface. Whether this interface is a natural design language or graphical is of second order importance. The prime requirement is that the design engineer should be in harmony with his tools and that they should provide fast response on the implications of design decisions. Communication, or the lack of it, is a continuing problem. The framework (design database) should be capable of creating lines of communication between related design activities such that duplication, incompatibilities etc. can be avoided.

4.2 Conflicting Pulls

The overriding benefit of CAD is cost reduction. This may be translated into:
* Labour reduction, use of less skilled labour.
* Timescale reductions.
* Error reductions/design integrity.
* Effective use of manpower - ease of design/manufacture.
* Managerial control.
* Timeliness.
This list, although incomplete, does indicate the conflicting pulls in CAD.


At the one end, we have management requiring the tools to control the design of complex products such that they feel that they are in command. At the other end, we have production demanding a product which they can make at a price marketing will accept. In the middle is the design engineer, struggling against impossible timescales to produce an error-free design. In no way can all these requirements be satisfied without compromise. Unfortunately, in many respects, the area of compromise is the freedom of the design engineer. For example, an engineer may design an integrated circuit to all the conventional constraints - minimum silicon area, correct power consumption etc. - but if it is non-testable his work has been wasted. These conflicting pulls upon the CAD system have not been satisfactorily met to date. As the complexity of the product increases, the freedom of the designer will be continually eroded unless a total systems view of the CAD system is taken.

4.3 Capitalisation

In this section I will discuss the computer hardware required to support CAD systems. Obviously, the size of computer and variety of peripherals is dependent upon the types of application, the number of users and the volume of job throughput. However, at the outset, the most important point is that under-capitalised CAD is a recipe for failure. Users of CAD systems expect to derive tangible benefits from their 'conversion' to CAD. If the service provided in terms of reliability, resilience and response is poor, they will very quickly become disillusioned and will justifiably claim that using a computer, despite the benefits, is actually slowing them down. To generalise, the use of CAD must provide at least three quantifiable benefits to the user. For example, an automatic printed circuit board tracking program provides:
* Labour reduction.
* Reduced timescale compared to the manual method.
* Reproducible results.
Unless benefits such as these are identifiable, then one should rethink whether CAD is appropriate to the solution of the problem.

Large mainframe computers have been the traditional workhorses of CAD. However, over the last five years, minicomputers have made inroads into the mainframe business. I do not want to get into the mini -v- mainframe arguments; however, at ICL I believe we have achieved the partition of computing activities between mini and mainframe which suits our activities. The configuration is shown in figure 2. The rationale underlying this arrangement is:
(a) A considerable proportion of the CAD service work is of a data processing nature - logic group error reports, drill tape production etc. - and is best suited to the background batch type of job.
(b) Some jobs - simulation, automatic tracking etc. - require fast response but demand considerable computer power.
(c) With some four hundred users of the service, a considerable amount of file management is required.


[Fig 2. MINICOMPUTER CONFIGURATION - block diagram: a minicomputer running utilities and interactive work, with a link to the mainframe and its design database, local disk store, magnetic tape, digitiser, graph plotter, dot matrix printer, tablets and terminals.]


(d) Other activities are of an interactive graphic nature - interactive logic diagram modification, LSI circuit design - and these require 'instant response' which can best be supplied by a minicomputer. However, they are 'front ends' to the CAD system and thus cannot be considered in isolation from the mainframe/design database.

Project activities (a) to (c) are provided on the mainframe; for (d), various graphic work stations have been provided on different geographic sites within the company. In general these are used for LSI circuit design, interactive logic design and overall drawings for technical publications. Typical peripherals are:
* Graph plotters and dot matrix printers.
* Editing tablets and digitisers.
* Interactive storage screen terminals.
* Magnetic tape back-up.
In ICL we have found that, at present, this is a useful configuration of hardware and partition of design activities. However, as more powerful minis are being produced, the partition must be constantly reviewed.

4.4 Company Organisation

It is impossible to generalise on this subject. However, the position and status of the CAD system generators within the company is of vital importance. To take an extreme example, if they are located in the research department on a site remote from their customers, their impact on the company's products will be less than significant. In ICL the teams which provide hardware and software (operating systems) CAD are under the same manager. The combination, called a segment, enjoys equal status with:
* Technology - printed circuit board technologies, component evaluation, semiconductor device fabrication.
* Test Engineering - design of in-house test equipment, evaluation of OEM testers.
* Project Teams - computer design projects.
* Design Services - drawing office, technical authors etc.
* Operating systems segment.
and has a favourable relationship with the manufacturing division. This 'equal status' relationship ensures that, in the areas where CAD/DA is cost-effective, the magnitude and timeliness of support meet the product requirements.

4.5 Software CAD

Every programming manager responsible for large or medium scale projects must have felt that the problems associated with software production were inherent in the very nature of the software design process. Many of these problems, however, sprang from the way in which software design was viewed more as an art form than a science. By ignoring the techniques which are standard practice in hardware engineering, software planning and costing exercises tended to remain a rather hit-and-miss affair.


Then there was the seemingly inevitable gulf between design and implementation. The traditional approach, with an analyst specifying the design in a natural language like English for subsequent translation into machine-executable form by a different group of implementors, led to many misunderstandings and inefficiencies. This was aggravated by the imprecisions caused by using a natural language in the design, whereas there is no reason why the language used for expressing high level design concepts should be any less exact than the language chosen to express that design to the machine. To control projects involving a large volume of code and many people there was also a need to have automatic facilities for progress monitoring and control as part of the philosophy underlying the design structure; but whether 20, 50 or 200 people are involved in a project, it has to be recognised that there is a considerable price to be paid to ensure effective design co-ordination and communication. An automated design technology must also take into account the problems of test and validation and of release, enhancement and maintenance of final products. This is particularly true for computer manufacturers where, for example, pieces of code the size of George 3 or OS360 cost a lot of money to design and code, more to build and debug, and a great deal more to enhance and maintain.

Over six years ago, on the threshold of embarking on a major programme of software development, ICL decided to overcome these traditional software production problems by developing an in-house technology of software design. This technology took a unified view of the formalisation of the design process, the use of a formal design definition language, the use of computer-aided design and design automation techniques, code documentation and standards, and project control procedures. This approach to software design and control (Cades) has been thoroughly aired and documented elsewhere. However, the lessons learned are equally applicable to hardware projects, particularly structural preservation. In large projects there is a requirement to be able to identify and preserve the overall structure - when design decisions are delegated, often the net result is that the nature of the product changes since the effect of the decision is not reflected upwards. This is structural decay. Hardware CAD excels in the range of tools available to create, manipulate and produce design. The management of design has been less emphasised in the past but, in the next generation of CAD systems, this cannot be ignored and the software experience should be utilised.

5. ADVANCING THE STATE OF THE ART

In reviewing the history of CAD, it was observed that the computer aids kept pace with the complexity of the problems. In hardware design, the product complexity has remained within tolerable proportions - ie: within the intellectual scope of man. However, with VLSI we can see a complexity barrier approaching - the early symptoms of which are a creeping paralysis in design. The initial design time for integrated circuit chips is lengthening at a predictable rate; however, the time and the number of interactions required to 'get it right' are escalating. Also, the testing of these devices is starting to become unmanageable in that the goal of 100% testing is steadily retreating. The more cynical among the system manufacturers could claim that the semiconductor companies are starting to experience the problems they have lived with for years.
This is partially true; however, using conventional SSI and MSI packaged devices on printed circuit boards does not present all the problems of integrated circuit design - the principal difference being that PCBs are a 'modifiable' technology, ie: design or


production errors may be fixed with the aid of a soldering iron. This is not so with integrated circuits, and the onus is on the designer to minimise the number of design interactions and to produce a testable product. Of course, hardware design problems are only a facet of the total complexity barrier - it is not sufficient for turnkey CAD system vendors to claim that they have the tools for VLSI; the roots of the problems lie much deeper. 'Systems' are a combination of hardware and software - they cannot be considered independently. So, in conclusion, the key points for the future:

* Total design capture/coming together of disciplines - the next generation

    of CAD systems must support both hardware and software design and production. There should be as natural a relationship between the two in the CAD systems as there exists in the product. Facets of this are the design language and the simulation of the interactions of hardware and software at various levels of design.

    * Design and manufacture - a keystone in successful industry is the effective use of design manpower. This may be translated into the effective use of design automation to free designers, not from their responsibilities for production, but from detailed concern.

* Testing - it is difficult to determine how the test patterns for large integrated circuits will be determined; however, three points are clear:

(1) design for testability. There is little point in designing

    something which cannot be tested with contemporary methods and equipment and in a time period which reflects the complexity.

(2) fault model. The conventional model (nodes stuck high/stuck low), it could be argued, is not accurate enough to detect/diagnose all possible faults. On the other hand, with networks comprising many millions of nodes, the use of this model may be uneconomic.

    (3) top down test pattern generation. The concept here is of the abstraction of test patterns from the functional description of the system (hardware or software) used for design.

    The convergence of design disciplines will remain a problem through to the 1980s but, as CAD systems evolve which will give us a key to the solution of this problem, we can look forward to VLSI with genuine confidence.

  • G. Musgrave, editor, COMPUTER-AIDED DESIGN of digital electronic circuits and systems, North-Holland Publishing Company © ECSC, EEC, EAEC, Brussels & Luxembourg, 1979

PRODUCT SPECIFICATION AND SYNTHESIS

Douglas Lewin

Department of Electrical Engineering and Electronics, Brunel University

    Uxbridge, Middlesex

The specification and evaluation of computer systems at the initial user requirement level is one of the most important and critical aspects of digital systems design. A formal specification of the system not only ensures that the user requirements are correctly translated into an acceptable design but also provides the essential basis for contractual and design documentation. A critical survey of existing methods of system specification, including such techniques as directed graphs, FSM theory, hardware description languages, simulation languages etc., is presented, followed by a brief review of the current state of synthesis methods. In so doing it is concluded that no suitable specification and design system is available at the present time, and possible reasons are given why this situation exists.

1. INTRODUCTION

Current digital and computer systems have now reached such a high degree of sophistication that conventional design methods are rapidly becoming inadequate. The major problem is the sheer complexity of the systems which are now feasible using LSI and VLSI subsystem modules such as micro-processors, micro-computers, ROM's, RAM's, PLA's, etc. In order to control and manage this complexity (in both software and hardware realisations) it has become necessary to enlist the aid of computers, and computer aided design techniques are now becoming accepted as essential design tools. Unfortunately CAD, though successfully used at the logistics and manufacturing levels, has not as yet realised its full potential when applied to system specification and the conceptual design stages. This is evidenced by the singular lack of success in attempting to develop realistic specification and evaluation languages, synthesis techniques and design methods for secure and reliable systems. At the present time, as shown by a recent EEC feasibility study (1), there is no viable specification and design scheme available for digital systems, and industry - just managing to cope with current technology - could well be faced with a major dilemma in the near future. The objective of this paper is to review the current "state of the art" in product specification and synthesis. In so doing, the basic principles and similarities of the techniques which have emerged so far will be described, followed by an attempt to define the fundamental problem areas and future requirements.

2. PRODUCT SPECIFICATION

The most important property of any CAD scheme is the ability to accurately specify the system under consideration using a suitable representation. It is essential that the specification language should be able to provide an unambiguous and concise description of the system and be capable of serving as a means of communication between users, designers and implementers. Note also that a formal specification of the system not only ensures that the user requirements are correctly translated into a viable design but also provides the essential



basis for contractual and design documentation. In order to handle complex digital structures a specification language must be able to describe the system at several levels, that is, on a hierarchical basis. At the top level is the behavioral (information flow) description, which treats the system as an interconnection of functional modules specified by their required input/output characteristics. The next level down is the functional (data flow) description; this partitions the system into subsystem components and details the logical algorithms (micro-programs) to be performed by the components, with their corresponding highway transfers. At this level it should be possible to represent the algorithms in a variety of ways, for example in terms of Boolean equations, state tables, timing diagrams, flow-charts etc. Finally, at the lowest level, is the structural (implementation) representation, which describes in detail the actual gates, bistables, LSI and MSI chips, software data representation etc. used to physically realise the subsystem functions.
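As a language-neutral illustration of the functional (data flow) level just described - and not an example of any of the hardware description languages surveyed below - the following Python sketch models a hypothetical register configuration and a micro-order sequence for a simple LOAD instruction. The register names, memory contents and timing comments are invented for the example.

```python
# Illustrative sketch only: a functional-level (register transfer) view of a
# hypothetical LOAD instruction, written in plain Python rather than in any
# of the hardware description languages surveyed in this paper.

class Machine:
    """Register configuration (the data structure)."""
    def __init__(self, memory):
        self.M = list(memory)    # main store
        self.PC = 0              # program counter
        self.MAR = 0             # memory address register
        self.MBR = 0             # memory buffer register
        self.ACC = 0             # accumulator

    def load(self, address):
        """Micro-order sequence (the control structure) for LOAD."""
        self.MAR = address           # T1: MAR <- address field of instruction
        self.MBR = self.M[self.MAR]  # T2: MBR <- M[MAR]
        self.ACC = self.MBR          # T3: ACC <- MBR
        self.PC = self.PC + 1        # T3: PC <- PC + 1

if __name__ == "__main__":
    m = Machine(memory=[7, 42, 3])
    m.load(1)
    print("ACC =", m.ACC, "PC =", m.PC)   # ACC = 42 PC = 1
```

A true register transfer language expresses the same transfers declaratively, so that one description can serve documentation, simulation and the generation of Boolean design equations, as discussed in section 2.1 below.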

    An ideal specification language should have the following characteristics:

    a) capable of representing logical processes independently of any eventual system realisation;

    b) facility to formally represent and evaluate the information flow in large-variable systems at the behavioural level and also to analyse data flow at the functional level;

    c) ability to handle concurrent processes and to provide insight into alternative partitions of the system;

    d) act as a means of communication between users, designers and implementers;

    e) able to proceed directly from system description to physical realisation using either software or hardware processes.

    Numerous methods have been described in the literature for the description and design of digital systems; these techniques may be generally classified into three basic approaches, which are as follows:

    i) Functional descriptive programming languages, such as hardware description languages (including register transfer languages), simulation languages and some general-purpose high-level languages such as APL.

    ii) Finite State Machine (FSM) techniques, such as state-tables, regular expressions, flow charts, including the algorithmic state machine (ASM) approach, etc.

    iii) Graph-theoretic methods, employing transition graphs, Petri nets, occurrence graphs, etc.

    The principal methods will now be considered in more detail in the following sections.

    2.1 Register Transfer Languages

    The intuitive design procedures used in digital and computer systems engineering are normally centred around a predefined register configuration. The execution of a required system function (for example, a machine-code instruction) is then interpreted in terms of micro-order sequences (called a control or micro-program) which govern the necessary transfers and data-processing operations between registers. Register transfer languages (RTLs) are based on this heuristic design procedure and allow the declaration of register configurations (the data structure) and the specification of the required data flow operations (the control structure). Thus the declarative section of the language in essence forms a linguistic description of the block diagram of a machine, with RTL operational procedures being used to specify the control programs. Note that the RTL procedures can be used for documentation and simulation purposes; it is also possible to generate Boolean design equations directly from the RTL descriptions. A typical register transfer language description (Chu's CDL) for the LOAD instruction of a computer is shown in Table 1.


    The first register transfer language was proposed by Reed (4) and was non-procedural in nature, with a small vocabulary directly related to hardware elements; it was used essentially as an algorithmic language for defining microprograms. Owing to the non-procedural character of the language it was necessary to prefix each statement with a conditional label (either a clock pulse or flag value) detailing the conditions for executing the operations defined by the RTL statements; thus the notation could be used to represent both synchronous and asynchronous systems. The Reed language, however, was very primitive, having no facilities for block structures or adequate means of handling branching operations such as test and jump instructions. Schorr (5) extended the Reed language by including timing pulses as an integral part of the conditional statements, together with a form of GOTO statement. For example, it is possible in the language to write statements of the form:

    |t1S3'| : D ← A ; → t5        |t1S3| : A ← A + 1 ; → t2

    where, if S3' = 1 and t1 = 1, the first operation is performed and the next statement to be executed occurs at t5, that is, a jump to t5 takes place; if S3 = 1 the alternative operation takes place. Note that A and D are registers and S3 a flag bistable or register stage. Schorr's language not only provided a more practical means of documenting microprograms but also had the distinct advantage of being fully implemented using a syntax-directed compiler based on ALGOL 60. Moreover, the language had facilities for performing logic synthesis and analysis, with microprogram statements being directly translated into the Boolean input equations for the bistable registers.

    Reed's language was also used as the model for the LDT (logic design translator) language developed by Gorman and Anderson (6) and Proctor (7). LDT was a formally defined procedural language and included high-level ALGOL-type operators such as IF, THEN, ELSE, GOTO, etc. More important, however, was the introduction of subroutine facilities which allowed system modules, such as counters, adders, etc., to be declared as high-level blocks, thus enabling a hierarchical description to be employed. The main function of LDT was the derivation of the bistable equations, suitably optimised, directly from the RTL description. LDT also enabled a timing analysis to be performed, using a sequence chart approach (8), which allowed the individual register transfer operations to be displayed against time.

    Another ALGOL-based language (though non-procedural) was described by Chu and called CDL (9). This language had the advantage of being able to describe special operators (such as count up/down), predetermined sequences, branching and conditional transfers as well as the basic RTL operations. Unfortunately CDL had the major disadvantages of functioning only in a synchronous mode, of having no facilities for block structures and of being unable to describe independent concurrent operations. CDL was used primarily for the specification and simulation of digital systems and is still widely used in teaching.

    Though not originally conceived as a register transfer language, APL (10) has been extensively used for algorithm definition and the description of computer architectures; in particular the language has found acclaim in the teaching of digital systems (11). APL has also been used by IBM as the basis of the ALERT (automatic logic design generation) system (12), with modifications to allow the expression of control and timing functions and the representation of block structures and parallel processes. ALERT was basically a conventional RTL system with provision for translating the microprogram description into a minimised set of logic design equations for the registers and control logic. ALERT was implemented on the IBM 7094 machine and used to reproduce the design of an IBM 1800 computer; though the resulting design was logically correct it was found to be highly redundant in terms of hardware.


    ISP (instruction-set-processor) was initially developed to describe primitives at the programming level of design in the PMS and ISP descriptive system due to Bell and Newell (13). ISP is similar in characteristics to other register transfer languages but with facilities for handling block structures, concurrency and the simple sequencing of processes. ISP has, however, been implemented and used to describe and simulate computer architectures (14); in particular it has successfully been used to perform comparison studies of computers for military use (15), and is being seriously considered as a standard hardware description language by U.S. government agencies.

    Though some of the languages described above have the ability to describe subsystem blocks, none of them has facilities for representing a partitioned system consisting of interconnected autonomous modules. In the digital design language (DDL) described by Duley and Dietmeyer (16) a system is viewed as a collection of several subsystems or automata, each possessing "private" facilities and having access to "public" facilities (common busses) which are used for intercommunication between automata. In DDL a system is specified using a block-structured description, where the outermost block defines the whole system in terms of subsystem blocks (automata), global variables, input-output requirements, etc., and the inner blocks specify the automata in terms of their state and I/O behaviour. The description itself is in Reed-like statements and contains the usual register transfers and operators, including special operators and declarative statements for the system-level description - Table 2 shows examples of the more usual operators and declarations. DDL is a non-procedural language and uses the concept of finite state machines to control operations - for example, by storing the state of the system in registers which can be tested and modified using special operators. As well as being able to describe digital systems, the DDL specification can also be translated into Boolean and next-state equations to describe a hardware realisation.
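
    To make the register transfer idea concrete, the following sketch (present-day Python, purely illustrative; it is not CDL, DDL or any of the languages cited above, and the register names simply echo the Schorr-style statements shown earlier) declares a small register configuration and executes one conditional control step:

        # A minimal illustrative sketch of a register transfer description:
        # a declared register configuration plus conditional micro-operations.

        regs = {"A": 41, "D": 0, "S3": 1}     # two 8-bit registers and a flag (assumed)

        def transfer(dest, value):
            # A register transfer dest <- value, performed in one control time-slot.
            regs[dest] = value & 0xFF

        def control_step(t):
            # Micro-orders executed in time-slot t depend on the flag S3;
            # the returned value is the next time-slot (the 'jump').
            if t == 1 and regs["S3"] == 0:
                transfer("D", regs["A"])      # |t1 S3'| : D <- A ; -> t5
                return 5
            if t == 1 and regs["S3"] == 1:
                transfer("A", regs["A"] + 1)  # |t1 S3 | : A <- A + 1 ; -> t2
                return 2
            return t + 1                      # otherwise proceed in sequence

        next_slot = control_step(1)
        print(next_slot, regs)                # 2 {'A': 42, 'D': 0, 'S3': 1}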

    Other system design languages have been described in the literature. One such language is CASSANDRE, proposed by Mermet and Lustman (17), which was based on ALGOL and uses the block structures of that language to achieve system partitioning. A CASSANDRE description consists of defined units and their interconnections; each unit may itself comprise a network of units. The language has been implemented on an IBM 360/67 machine but used only for logic-level simulation and microprogram evaluation. A similar language is the CRISMAS system (18), which also uses a hierarchical block-structured definition language; no implementation of this language has as yet been reported. The CASD (computer-aided system design) language (19) encompassed high-level system descriptions, simulation at both systems and logic levels and automatic translation to detailed hardware. CASD was based on PL/I and used its block-structuring facilities to develop the hierarchical specification. CASD, basically a feasibility study, was never implemented. The LALD (language for automatic logic design) system (20) allows a multi-level system description in terms of interconnected subsystem components. The control and data structures must be specified separately, and the control structure can be implemented using either hardware or software. LALD compilers have been reported for the CDC 6400 (using SNOBOL) and in PL/I for the IBM model 91.

    Though it would appear that considerable effort has been expended on the development of register transfer and hardware description languages, very few have been adopted for use in a real engineering situation, and a viable cost-effective system still remains to be developed. Moreover, many of the systems described above have been outdated by the rapid progress in microelectronics. Problem-orientated programming languages suffer from the inherent disadvantage that they have no formal mathematical structure. Consequently, system behaviour must be interpreted indirectly from program performance whilst operating on certain specified data types. Hardware description languages usually describe a digital system in terms of simulated components and their interconnections. In order to evaluate logic networks modelled in this way it is necessary to perform a physical step-by-step examination of all the relevant input-output conditions. It will be obvious that this is a time-consuming process and that large amounts of storage would be required to represent the circuit model.


    In addition, since the system is described in terms of a topological model rather than by formally specified system functions, the description is of limited value for general communication purposes. Moreover, since register transfer languages are constrained to operate on well-defined data types, they are normally restricted to hardware representation. The use of formal methods, such as FSM and graph theory, for system description would appear to have considerable potential - these techniques are described in the following sections.

    2.2 Finite State Machine Techniques

    Finite state machine theory, using for example state-table representation, though theoretically capable of describing any digital system, is not viable in practice owing to the considerable practical difficulties involved in expressing large-variable problems and the inordinate amount of computation required to manipulate the resulting structures. This is undoubtedly true, particularly if both control and data structures are represented in the same state-table. However, large systems must inevitably be partitioned by the designer into subsystem components in order to comprehend their complexity, and if the concept of separately defining data and control structures is used, state-tables can still be a useful aid in design. This is borne out by the algorithmic state machine (ASM) approach to design (21), which uses a flow-chart to specify the control logic for a system, the implementation of which draws heavily on FSM theory. The method was successfully used by Hewlett-Packard for the design of calculators etc., but currently no computer implementation is available.

    A formal approach to system description based on FSM theory was originally described by Kleene (22), who showed that any finite-state, deterministic, synchronous automaton can be described by a regular expression, and that, inversely, every regular expression can be realised as a finite state machine (23). Thus regular expressions constitute a formal language which can be used to characterise the external (input-output) behaviour of sequential circuits (combinational circuits being treated as a special case). Later work by Brzozowski (24), using the derivative of a regular expression, described an easy-to-use and systematic method of transforming a regular expression to a state-table. Regular expressions are used to describe the required set of input sequences (in terms of algebraic operations on sequences of 0's and 1's) to a FSM in order to generate an output. Thus the behavioural description of a FSM can be reduced to an algebraic formula.

    Though regular expressions would appear to have many of the characteristics required by a specification language, for example a formal structure capable of analysis, direct implementation, etc., there are considerable disadvantages in practice. Contrary to what has been written, the method is not easy to use and design engineers find the formalism very difficult to apply, encountering considerable difficulties in converting from a verbal description to the algebraic formulation. Another basic disadvantage is that the language is really only suitable for FSMs with a single output terminal. Consequently, with multiple-output circuits it is necessary to derive separate regular expressions for each output terminal. It will also be obvious that the method automatically specifies both the control and data structures and hence would certainly lead to computational difficulties with large-variable circuits.
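
    The derivative idea itself can be illustrated with a short sketch (present-day Python, with an assumed tuple encoding of regular expressions; it shows only the acceptance test, not Brzozowski's full construction of the state-table). Each input symbol maps the current expression to its derivative, and a sequence is accepted when the final expression is nullable:

        # A minimal sketch of Brzozowski's derivative idea over the alphabet {0, 1}.
        # Expressions are nested tuples: ("empty",), ("eps",), ("sym", a),
        # ("alt", r, s), ("cat", r, s), ("star", r)  -- an assumed encoding.

        def nullable(r):
            # True if the expression r accepts the empty string.
            kind = r[0]
            if kind == "eps" or kind == "star":
                return True
            if kind == "empty" or kind == "sym":
                return False
            if kind == "alt":
                return nullable(r[1]) or nullable(r[2])
            return nullable(r[1]) and nullable(r[2])       # "cat"

        def deriv(r, a):
            # The derivative of r with respect to the input symbol a.
            kind = r[0]
            if kind in ("empty", "eps"):
                return ("empty",)
            if kind == "sym":
                return ("eps",) if r[1] == a else ("empty",)
            if kind == "alt":
                return ("alt", deriv(r[1], a), deriv(r[2], a))
            if kind == "cat":
                left = ("cat", deriv(r[1], a), r[2])
                return ("alt", left, deriv(r[2], a)) if nullable(r[1]) else left
            return ("cat", deriv(r[1], a), r)              # "star"

        def accepts(r, sequence):
            # One derivative per input symbol; accept if the result is nullable.
            for a in sequence:
                r = deriv(r, a)
            return nullable(r)

        # (0|1)*11 : all input sequences ending in 1, 1.
        r = ("cat", ("cat", ("star", ("alt", ("sym", "0"), ("sym", "1"))),
                     ("sym", "1")), ("sym", "1"))
        print(accepts(r, "0101011"), accepts(r, "0110"))   # True False

    Each distinct derivative obtained in this way corresponds to a state of the equivalent machine, which is how the method yields a state-table.
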
    Finite state machine methods, as well as having practical drawbacks, also suffer from a more fundamental disadvantage. In general the FSM accepts a serial input (or inputs) and progresses from state to state, producing an output sequence (or sequences) in the process. Owing to its finite memory limitation (that is, the number of internal states) the FSM is best suited to describing systems where the amount of memory required to record past events (that is, the effect of earlier inputs) is both small and finite - for example, serial systems (such as pattern detectors) where the computation can proceed as a step-by-step operation on the input and the amount of information required to be 'remembered' is very small. However, some processes, such as serial multiplication, require all the input data to be available before the computation can proceed.


    Moreover, large amounts of information could need to be stored during the course of the operation (for example, the accumulation of partial sums in the case of multiplication). Thus it follows that the FSM has the inherent disadvantage that it is impossible to specify a machine which requires the manipulation of arbitrarily large pairs of numbers. Note also that the FSM lacks the ability to refer back to earlier inputs unless the entire input sequence is initially stored; this implies that the input sequence of interest must be of known finite length. These limitations can of course be overcome by using an infinite machine model, such as the Turing machine (25), where the available memory is theoretically unlimited.

    2.3 Directed Graph Methods

    One mathematical tool which is finding increasing application in computer systems design and analysis is graph theory (26), and many of the more successful specification methods are couched in graph-theoretic terms. A directed graph is a mathematical model of a system showing the relationships that exist between members of its constituent set. The elements of the set are normally called vertices or nodes, with the relationships between them being indicated by arcs or edges. An example of a directed graph is shown in Figure 1a, where the set of nodes is given by N = {n1, n2, n3, n4, n5} and the set of edges by E = {e1, e2, e3, e4, e5, e6}. Graphs may be classified into various types depending on their properties. For example, a net, shown in Figure 1b, is a directed graph consisting of a finite non-empty set of nodes and a finite set of edges; note that a net may have parallel edges, that is, two nodes connected by two different edges both acting in the same direction. Again, a net which does not contain parallel edges but which has values assigned to its edges is called a network, as shown in Figure 1c.
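
    As noted in the next paragraph, such graphs can be held as matrices for computer processing. The following sketch (present-day Python; the five-node edge list is assumed for illustration rather than read from Figure 1a) builds the connection matrix of a directed graph and answers a simple reachability question:

        # A minimal sketch: a directed graph held as a connection (adjacency) matrix.
        # The edge list is assumed for illustration; it is not Figure 1a.

        nodes = ["n1", "n2", "n3", "n4", "n5"]
        edges = [("n1", "n2"), ("n1", "n3"), ("n2", "n3"),
                 ("n2", "n4"), ("n3", "n4"), ("n4", "n5")]

        index = {n: i for i, n in enumerate(nodes)}
        adj = [[0] * len(nodes) for _ in nodes]
        for src, dst in edges:
            adj[index[src]][index[dst]] = 1      # 1 marks a directed edge src -> dst

        def reachable(start, goal):
            # Is there a directed path from start to goal? (simple depth-first search)
            seen, frontier = set(), [index[start]]
            while frontier:
                i = frontier.pop()
                if i == index[goal]:
                    return True
                if i not in seen:
                    seen.add(i)
                    frontier.extend(j for j, bit in enumerate(adj[i]) if bit)
            return False

        print(reachable("n1", "n5"), reachable("n5", "n1"))   # True False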

    Directed graphs have been used, for instance, to represent information flow in control and data structures, parallel computation schemata, diagnostic procedures in logic systems, etc. The major advantage of using graph theory, apart from the obvious visual convenience, is that formal methods exist for the manipulation of graph structures, which can be represented by matrices for computer processing.

    The transition graph is a simple example of a directed graph used to represent automata. It consists of a set of labelled vertices connected by directed arcs, and in every graph there is at least one starting vertex and at least one terminal vertex. Each directed arc is labelled with symbols from the input alphabet of the machine (I = {0,1} in the case of a binary system). A sequence of directed arcs through the graph is referred to as a path and describes the input sequence consisting of the symbols assigned to the arcs in the path. An input sequence is said to be accepted by the graph if a path exists between a starting and a terminal vertex. Transition graphs have the advantage over state diagrams (which are a special case) in that it is only necessary to define the input sequences of direct interest, alternative input transitions being omitted. Thus the transition graph is non-deterministic in the sense that, unlike a state diagram, it is incompletely specified. The transition graph also provides a convenient shorthand for representing deterministic machines, since it is always possible to convert a transition graph into an equivalent state diagram (27). However, in general it is difficult to derive a transition graph which faithfully represents a required machine specification.

    Another directed graph approach which has found considerable application in the description and analysis of digital systems is the Petri net (28)(29). The Petri net is an abstract, formal graph model of information flow in a system consisting of two types of node, places drawn as circles and transitions drawn as bars, connected by directed arcs. Each arc connects a place to a transition or vice versa; in the former case the place is called an input place and in the latter an output place of the transition. The places correspond to system conditions which must be satisfied in order for a transition to occur.


    Figure 2 shows a typical Petri net. In addition to representing the static conditions of a system, the dynamic behaviour may be visualised by moving markers (called tokens) from place to place round the net. It is usual to represent the presence of tokens by a black dot inside the place circle; a Petri net with tokens is called a marked net. A Petri net marking is a particular assignment of tokens to places in the net and defines a state of the system; for example, in Figure 2a the marking defines the state in which the conditions corresponding to the marked places hold and no others. Progress through the net from one marking to another, corresponding to state changes, is determined by the firing of transitions according to the following rules:

    a) a transition is enabled if all of its input places hold a token;

    b) any enabled transition may be fired;

    c) a transition is fired by transferring tokens from its input places to its output places; thus firing means that instantaneously the transition's inputs are emptied and all of its outputs are filled.

    (Note that transitions cannot fire simultaneously; thus only one transition can occur at a time.) This is illustrated in Figure 2, where 2a shows the original marked net and 2b the state of the net after firing transition a; note that the Petri net is able to depict concurrent operations. After two further firings the net would arrive at the marking shown in Figure 2c; here the net is said to be in conflict, since firing either of the transitions d or e would cause the other transition to be disabled. In general a conflict will arise when two transitions share at least one input place; Petri net models are normally constrained to be conflict-free. Another limitation imposed on the model is that a place must not contain more than one token at the same time: this condition leads to a safe Petri net. A live Petri net is defined as one in which it is possible to fire any transition of the net by some firing sequence, irrespective of the marking that has been reached; note that a live net would still remain live after firing. The Petri net model described above may be extended into a generalised theory by allowing multiple arcs between transitions and places, thereby allowing a place to contribute more than one token to the firing of a transition.
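
    The firing rules can be demonstrated with a short sketch (present-day Python; the net is a simple fork/join structure and all place and transition names are invented for the example):

        # A minimal sketch of a marked Petri net and the firing rules given above.
        # The net (a simple fork/join) and all names are invented for the example.

        transitions = {
            "a": {"in": ["P1"], "out": ["P2", "P3"]},   # firing a starts two branches
            "b": {"in": ["P2"], "out": ["P4"]},
            "c": {"in": ["P3"], "out": ["P5"]},
            "d": {"in": ["P4", "P5"], "out": ["P1"]},   # d joins the branches again
        }
        marking = {"P1": 1, "P2": 0, "P3": 0, "P4": 0, "P5": 0}   # safe: 0 or 1 token

        def enabled(t, m):
            # Rule (a): enabled if all of the transition's input places hold a token.
            return all(m[p] == 1 for p in transitions[t]["in"])

        def fire(t, m):
            # Rules (b) and (c): only an enabled transition may fire; firing empties
            # the input places and fills the output places.
            assert enabled(t, m)
            m = dict(m)
            for p in transitions[t]["in"]:
                m[p] = 0
            for p in transitions[t]["out"]:
                m[p] = 1
            return m

        m = fire("a", marking)
        print([t for t in transitions if enabled(t, m)])   # ['b', 'c'] -- concurrency
        m = fire("c", fire("b", m))
        print(enabled("d", m))                             # True: the join may now fire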


    Petri nets also lend themselves to hierarchical structures, since an entire net may be replaced by a single place or transition at a higher level. The major advantage of the directed graph approach is that it is amenable to mathematical analysis, and many authors (32)(33) have described algorithmic methods for the analysis of such models. In the main the techniques apply to the control graph only - known as an uninterpreted analysis - and no allowance is made for operations performed in conjunction with the data structure. Though Petri nets have many of the properties required for the specification and design of digital systems, to date there has been only one example of their use in a CAD system, and that on an experimental basis. Project LOGOS (34)(35), conceived at Case Western Reserve University, was based on Petri net principles and had the objective of providing a graphical design aid which would enable complex parallel systems to be defined (on a hierarchical basis), evaluated at any level and then finally implemented in either hardware or software. The LOGOS system employed two directed graphs, one for data flow (the data-graph, DG) and one for control flow (the control-graph, CG), to define a process (called an activity). Though it was found possible to realise the control operators in the CG, the problem of transforming the DG components was never fully resolved. In addition, the computational problems encountered in attempting to perform an interpreted analysis (involving both CG and DG structures) of an activity were found to be extremely difficult. Though the LOGOS system was the most ambitious attempt to date to develop an integrated CAD system, it nevertheless still requires considerable further development before it can become a viable design tool.

    3. SYNTHESIS OF DIGITAL SYSTEMS (36)

    An essential prerequisite of any synthesis package is a suitable specification and evaluation language; otherwise the problem is reduced to one of minimisation and implementation of design equations. As we have seen, many of the specification techniques described above incorporate some method of hardware realisation - for example RTLs, LOGOS, ASM charts, etc. - but these in the main rely heavily on conventional switching theory. Purpose-designed synthesis systems such as CALD (37) and MINI (38) employ a tabular or cubic notation to input Boolean design equations and then use heuristic techniques to obtain a near-minimal solution, the resulting circuits being realised in terms of basic gates and bistable elements. Though these procedures have some application in the design of MSI subsystem components, for example in reducing the required surface area of the chip, their usefulness in system design is strictly limited. The CALD and MINI systems, together with numerous other examples of synthesis techniques (39), all rely heavily on classical switching theory. Unfortunately semiconductor technology has progressed to such an extent that the use of minimisation methods and implementation in terms of NOR/NAND logic is no longer relevant. Current exceptions to this are the use of PLAs, which utilise multiple-output sum-of-products terms, and the implementation of FSMs using ROMs (40), where minimising the states will reduce the number of words required in the memory. Notwithstanding, theory has been (and is still being) outpaced by technology, and a major and severe problem now exists owing to the lack of a suitable design theory at the subsystems level - for example, algorithmic techniques for the realisation of systems using ROMs, PLAs, etc. The situation is becoming even more critical now that programmable electronics such as microprocessors and microcomputers are being used as subsystem components. The specific question of logic circuit synthesis has become subsumed by the general problem of computer systems engineering, including the vital topic of specification and evaluation. Moreover, it is essential that a "top-down" approach to design be adopted to allow the system to be partitioned into viable and compatible hardware and software processes. Thus at the systems level it is no longer possible to divorce hardware and software techniques, and it is essential that any synthesis procedure should take into account the design of software, as an alternative to hardware, in system realisation.
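
    The ROM-based realisation of an FSM mentioned above can be sketched as follows (present-day Python; the two-state sequence detector is an assumed example chosen purely for illustration). Each ROM word packs the next state and output for one (present state, input) address, so that reducing the number of states directly reduces the number of words needed in the memory:

        # A minimal sketch of realising a FSM with a ROM: one word per
        # (present state, input) address holding the next state and the output.
        # Example: a 2-state detector that outputs 1 when the last two inputs were 1,1.

        STATES, INPUTS = 2, 2
        rom = [None] * (STATES * INPUTS)

        def word(next_state, output):
            return (next_state << 1) | output   # pack both fields into one ROM word

        rom[0 * INPUTS + 0] = word(0, 0)   # state 0, input 0 -> stay, output 0
        rom[0 * INPUTS + 1] = word(1, 0)   # state 0, input 1 -> remember the 1
        rom[1 * INPUTS + 0] = word(0, 0)   # state 1, input 0 -> forget it
        rom[1 * INPUTS + 1] = word(1, 1)   # state 1, input 1 -> two 1's seen: output 1

        def run(bits):
            state, outputs = 0, []
            for b in bits:
                w = rom[state * INPUTS + b]    # address the ROM with (state, input)
                state, out = w >> 1, w & 1     # unpack next state and output
                outputs.append(out)
            return outputs

        print(run([0, 1, 1, 1, 0, 1, 1]))      # [0, 0, 1, 1, 0, 0, 1]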


    Specific hardware design techniques are still required at the LSI component level, though even here conventional techniques are of little use and they will certainly not be able to cope with future VLSI circuits. It will be apparent that the whole question of synthesis is wide open and that many difficult problems remain to be solved - and must be solved - if progress is to be maintained. It is vital that the dichotomy that now exists between hardware and software engineering is eliminated, since the synthesis problem can only be solved by adopting a general systems approach.

    4. DISCUSSION

    There are many difficult problems to be solved before a viable specification and design language for digital systems engineering can be developed. Register transfer languages are adequate for the design of register-structured systems, but they are specifically hardware-orientated, and since formal methods are not possible, evaluation must be performed using simulation techniques. Another disadvantage is that the languages tend to generate very simple constructs: the languages provide only simple elements, and users perpetuate the situation by designing at a low level. A further problem occurs in the generation and use of library routines for components used to represent complex MSI and LSI circuits and other data structures. There are two distinct cases in which subsystem blocks are required:

    a) to represent a component or sub-routine which will be used by the system many times over, but not actually implemented each time; for example, an arithmetic unit or any complex data-processing structure;

    b) the insertion of a standard hardware component (analogous to a software macro) such as a multiplexer unit, which needs to be implemented as such in various places in the system.

    The major difficulty lies in isolating identical functions and, if necessary, merging them together. It is this fact which accounts for much of the redundancy encountered in RTL implementation schemes. The problem is also relevant when considering the implementation of Petri net schemata, as for example in LOGOS, and generally for all systems which separate the control and data functions. It has already been suggested that the FSM model has severe limitations when used to specify complex systems. These limitations can of course be overcome by using infinite or unbounded models such as the Turing machine or Petri net. Using this type of model the designer is unconstrained in his thinking, allowing general logical processes to be specified without reference to a particular implementation. Unfortunately the transformation from a conceptually unbounded model to a practical realisation can, and does, present serious difficulties.

    Another fundamental problem is encountered in the analysis of large systems. It would appear inevitable that, if a detailed analysis of a logic algorithm is required (say in seeking the answer to a specific question), there is no choice but to examine all possible alternatives in an iterative manner. In general, particularly if an unbounded model is adopted, it is necessary, in order to determine the system operation, to constrain the analysis to a restricted set of input and state conditions. This means, for example, that only particular paths through a Petri net are allowed, and the technique results in a loss of information and affects the accuracy of the model. If exact information about a system is sought, the Petri net must be examined (ideally in an interpreted mode) for all possible firing sequences. In general a detailed analysis of the modelled system proves to be prohibitive in computer time.

    The specification techniques described above have included both special-purpose languages and graphical methods. It will be obvious that, from the users' (and designers') point of view, the use of formal graph theory could present an intellectual problem. Though graphical techniques have a visual advantage, it would appear that a language approach based on formal methods would be preferable.


    5. CONCLUDING COMMENTS

    It would appear that at the present time there is no ideal specification, evaluation and synthesis scheme available for digital systems design. Directed graph techniques appear to hold the most promise as the basis for specification and analysis, but there are nevertheless many fundamental problems remaining to be solved before a viable CAD system can be evolved. The urgent need for developing a design automation scheme is indisputable, but is there any hope of finding a solution to the basic problems? In the short term much might be accomplished by jettisoning the philosophy of attempting to develop a general design language which can serve all purposes - from user-level specification, through evaluation, down to system realisation - and concentrating instead on a specific language and methodology for each function, with well-defined transformations from one to the other. Unfortunately the complex systems which will be required in the not-too-distant future demand a fundamental reappraisal of the available theory. The need for a system-level theory capable of handling the interconnection of LSI modules working in a concurrent mode has already been stressed. Sutherland and Mead (41) go further in suggesting that a new theoretical basis for computer science is required, based on spatial distribution and communication paths rather than the theoretical analysis of sequential algorithms. There is also another, perhaps even more fundamental, aspect to the problem. Computer science today is concerned primarily with the derivation of efficient (and correct) algorithms for sequential machines based on a deterministic binary model. Thus the scientist searches for a solution which enables system behaviour to be explicitly described by s