Chapter 2

Literature Survey – Total Quality Management

2.1.0 Introduction:

This chapter deals with the literature survey on Total Quality Management. As has already been outlined in detail, the main objective of this research project is to investigate and analyze various aspects related to TQM and TEM in five selected industrial sectors. In view of the above, an in-depth and updated literature survey has been carried out in this chapter, containing a state-of-the-art review of Total Quality Management (TQM) and Total Energy Management (TEM), which includes the following important aspects, focused appropriately:

2.2.0 Concepts and Features of TQM:

The historical evolution of Total Quality Management has taken place in four stages. These can be categorized as follows:

2.2.1 Quality Inspection:

Quality has been evident in human activities for as long as we can remember. However, the first stage of this development can be seen in the 1910s, when the Ford Motor Company's Model 'T' car rolled off the production line. The company started to employ teams of inspectors to compare or test the product against the product standard. This was applied at all stages covering the production process, delivery, etc. The purpose of the inspection was that poor quality product found by the inspectors would be separated from acceptable quality product and then scrapped, reworked or sold as lower quality. This stage involves salvage, sorting, corrective action and identifying sources of non-conformance.

2.2.2 Quality Control:

With further industrial advancement came the second stage of TQM development, in which quality was controlled through supervised skills, written specifications, measurement and standardization. During the Second World War, manufacturing systems became complex and quality began to be verified by inspectors rather than by the workers themselves. Statistical quality control by inspection – the post-production effort to separate the good product from the bad product – was then developed. The development of control charts and acceptance sampling methods by Shewhart and Dodge-Romig during the period 1924-1931 helped this era to progress further from the previous inspection era. At this stage Shewhart introduced the idea that quality control can help to distinguish and separate two types of process variation: firstly, the variation resulting from random causes, and secondly, the variation resulting from assignable or special causes. He also suggested that a process can be made to function predictably by separating out the variation due to special causes. Further, he designed a control chart for monitoring such processes and lowering the incidence of non-conformance. This stage involves quality manuals, performance data, self-inspection, product testing, quality planning, the use of statistics and paperwork control.

2.2.3 Quality Assurance:

The third stage of this development, i.e. quality assurance, contains all the previous stages in order to provide sufficient confidence that a product or service will satisfy customers' needs. Other activities, such as comprehensive quality manuals, use of the cost of quality, development of process control and auditing of quality systems, were also developed in order to progress from quality control to the quality assurance era of Total Quality Management. At this stage there was also an emphasis on a change from detection activities towards the prevention of bad quality.
This stage involves third-party approvals, systems audits, quality planning, quality manuals, quality costs, process control, failure mode and effect analysis, and non-production operations.

2.2.4 Total Quality Management:

The fourth level, i.e. Total Quality Management, involves the understanding and implementation of quality management principles and concepts in every aspect of business activities. Total Quality Management demands that the principles of quality management be applied at every level, at every stage and in every department of the organization. The Total Quality Management philosophy must also be enriched by the application of sophisticated quality management techniques. The process of quality management also extends beyond the inner organization in order to develop close collaboration with suppliers. It involves a focused vision, continuous improvement, internal customers, performance measures, prevention, company-wide application, interdepartmental barriers and management leadership.

2.2.5 Key Principles of TQM:

The following key principles, which are common to all its manifestations, underpin the TQM concept:


• TQM starts at top management - Top management should demonstrate understanding, commitment and be involved in the total quality improvement process from day one in order to improve quality in all areas of the institution.

• TQM requires total employee involvement – People at all levels are the essence of an institution and their full involvement enables their abilities to be used to the benefit of the institution. The involvement of every individual in an institution is necessary for successful TQM implementation. Institutions need imagination, ideas, input, commitment and energy from everyone to reach for world-class quality that will make a country competitive in today’s market.

• TQM focuses on the customer – The goal of satisfying customers (internal or external) is fundamental to TQM and is expressed by the institution's attempt to design and deliver products and services that fulfil customer needs. Institutions depend on their customers and therefore should understand current and future customer needs, meet customer requirements and strive to exceed customer expectations.

• TQM needs strategic planning – Strategic planning is necessary to align and integrate all the efforts of the institution with the TQM concept. The link between TQM and strategic planning should provide an integrated management system for an institution.

• TQM focuses on the systems approach to management – Identifying, understanding and managing interrelated processes as a system should contribute to the institution's effectiveness and efficiency in achieving its objectives.

• TQM requires ongoing education and training of employees – Training should start with educating top managers in TQM and its principles, in the need for quality improvement, and in the tools of improvement. Training should provide employees with the education required to effectively participate in quality improvements.

• TQM focuses on teamwork – Institutions should understand that employees need to participate in vertical, horizontal and cross-functional teams to be most effective. Teams should be used, through collaboration and participation, to provide an opportunity for employees to work together in their pursuit of total quality in ways that they have not worked together before.

• TQM focuses on continuous improvement – Continuous improvement should be a permanent objective of the institution. Continuous improvement means a commitment to constant examination of technical and administrative processes in search of better methods. Underlying this principle are the concept of institutions as systems of interlinked processes and the belief that, by improving these processes, institutions can continue to meet the increasingly stringent expectations of their customers.

• TQM respects employees and their knowledge – Subordinates' input on improvements should be taken into account, especially where they have the appropriate experience and are specialists in their field. Employees should be actively involved in the improvement process.

• TQM focuses on process improvement – The institution should be reconfigured as a set of horizontal processes that begin with the supplier and end with the customer. All processes in an institution should be identified in order to establish ownership of them, and processes should be kept as simple as possible.

• TQM requires a statistical way of thinking and the use of statistical methods – The results of tests and measurements, and the conditions under which the measurements were made, should be recorded meticulously. Electronic systems that are available should be used, but computer software packages can also be developed relatively cheaply for in-time statistical purposes.

• TQM focuses on prevention rather than detection – Problems should be anticipated to prevent them from occurring. Frequent meetings should be held to discuss foreseeable problems.

• TQM requires mutually beneficial supplier relationships – Suppliers should be treated in a way to ensure a win-win situation for all parties involved. An institution and its suppliers are interdependent, and a mutually beneficial relationship enhances the ability of both to create value.

• TQM focuses on performance measures that are consistent with the goals of the institution – Feasible measures should be established to reward performance and thereby promote positive attitudes. In order to monitor how the institution is performing, management must analyse its performance on a continuous basis.

• TQM focuses on product and service quality design – Quality should be built into the programme as soon as possible, preferably from day one, and should be spread over the total sphere of the programme. Therefore, the advice of experts should form part of the project right from the start.


• TQM focuses on substantial culture change – All changes in the environment should be taken note of and the necessary adaptations should be made promptly. This means that certain alterations will have to be made frequently to meet new circumstances.

• TQM focuses on the factual approach to decision-making – Effective decisions should be based on the analysis of data and information. Facts are necessary to manage the institution at all levels, giving the correct information to people so that decisions are based upon facts rather than 'gut feeling', which is essential for achieving continuous improvement.

• TQM requires self-assessment efforts as a control mechanism to determine results – The institution's performance should be evaluated against internationally recognized standards.

• TQM focuses on fast response – Increasingly rapid response times and ever-shorter cycles for new or improved product and service introduction are a necessity for customer satisfaction today. The time performance of work processes should be among the key process measurements. Improvements in response time often drive simultaneous improvements in the institution, quality and productivity.

• TQM provides standardisation – Institutions should develop and adhere to the best known ways to perform a given task.

• TQM focuses on partnership development – Institutions should seek to build internal and external partnerships to better accomplish their overall goals. Internal partnerships might include those that promote cooperation between labour and management. External partnerships might be with customers, suppliers and educational institutions, for a variety of purposes, including education and training. A partnership might permit the blending of the institutions' strengths and capabilities, thereby enhancing the accomplishment of each partner's mission.

2.3.0 Tools Used in Quality Management:

Conventional tools and techniques have been reviewed first, followed by an updated survey of advanced TQM tools and techniques.

2.3.1 The Seven Conventional Quality Tools:

From the multitude of techniques which support the problem-solving process, it is the seven statistical tools, also referred to as QC (Quality Control) tools, that have become the most firmly established (Bunney and Dale7). Some of the individual methods have been used for decades or have been borrowed from other disciplines. They form a methodological toolkit which can be used to structure and visualize complex issues and which thus supports all phases of the problem-solving process (PDCA). They are particularly suitable when all of the data needed to solve the problem are available and have to be analyzed.

i) Cause and effect diagram: The cause and effect diagram, also known as the Ishikawa or fishbone diagram, is a tool which can be used to analyze facts with a view to identifying the cause of a defined effect. The problem, or the effect, is entered in the head of the fish; the bones represent the main influencing variables. The individual causes are entered inside the bones. The principal influencing variables frequently correspond to the 7M checklist (Man, Machine, Material, Method, Marginal conditions, Management, and Measurement).

ii) Histogram: Histograms help to interpret the reasons for scatter by displaying the distribution of data values. The data values are divided into classes in accordance with statistical rules. These form the abscissa of the diagram. The number of data values per class is shown on the y-axis. The average value and type of scatter are shown by the distribution curve.

iii) Correlation diagram: Scatter or correlation diagrams describe graphically whether there is a correlation between two variables (problem and influencing variable). Plotting the factors in relation to one another in an x-y diagram yields information as to the nature of the correlation between the factors. This is achieved by plotting a sufficient number of pairs of values, formed by altering the problem variable and determining the associated influencing variable, as measured points in the diagram. The nature of the correlation (strong or weak, positive or negative) is shown by the distribution of the points. This permits conclusions to be drawn as to potential causes.

iv) Flow chart: The term flow chart is a general term used to describe the common forms of presentation such as bar, line, pie and spider charts. Depending on the purpose of the analysis, one or other form of visualization is suitable for demonstrating correlations or flows.

v) Control chart: Data collection is the starting point for improvement activities. Control or SPC charts are used to take samples at regular intervals and enter the measured values or the statistical parameters (e.g. average value, scatter or spread) into the SPC chart. It may become necessary to intervene in the process, depending on the control limits specified and the characteristic progress of the data values.

vi) Pareto analysis: People are frequently faced by a number of problems or causes of faults that cannot be processed simultaneously. It makes sense to deal with the greatest, most important or most cost-intensive problem first. Pareto analysis (also known as ABC analysis or Lorenz distribution) visualizes the rank order of the influencing variables of relevance to one particular issue. These are listed in order of the level of influence they exert, and their numerical significance and cumulative percentage are shown accordingly. In the course of investigating a quality issue, it often emerges that only a few of the many causes identified are actually very important, while many of the remainder are very insignificant (a small computational sketch is given at the end of this subsection).

vii) Frequency distribution: The frequency of the occurrence of individual types of faults, and the frequency with which data values occur at certain intervals in the range, can be presented in the form of frequency distributions. Fault clusters at individual points can thus be recognized and the causes investigated.
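As a small illustration of the Pareto analysis described in (vi) above, the following Python sketch ranks a set of defect causes and computes their cumulative percentages; the categories and counts are hypothetical and are used only to show the calculation, not taken from the surveyed literature.

# Pareto (ABC) analysis sketch: rank defect causes and compute cumulative percentages.
# The defect categories and counts below are hypothetical, for illustration only.
defect_counts = {
    "scratches": 120,
    "misalignment": 75,
    "missing screws": 30,
    "wrong colour": 12,
    "packaging damage": 8,
}

total = sum(defect_counts.values())
cumulative = 0.0

print(f"{'Cause':<18}{'Count':>7}{'% of total':>12}{'Cumulative %':>14}")
for cause, count in sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True):
    share = 100.0 * count / total
    cumulative += share
    print(f"{cause:<18}{count:>7}{share:>11.1f}%{cumulative:>13.1f}%")

Sorting the causes by frequency and accumulating their shares is exactly what a Pareto chart visualizes: the few causes that push the cumulative percentage past roughly 80% are the ones to tackle first.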
2.3.2 The Seven New Management Tools:

Where complex problems and incomplete data collections are involved, the analysis and solution-identification tools described above are insufficiently effective. In reality, many facts are described only by fuzzy data and by information which has been passed on only verbally. This verbal information must be put into a form conducive to decision-making by means of new, suitable tools. The seven new management tools have been defined in order to supplement the tools previously described (Bunney and Dale7). Each of these seven new management tools is very effective in itself; however, there is additional benefit to be gained from combining them. The combined effect of the seven new management tools is described in the following.

i) Affinity diagram: Affinity diagrams are used to collect and classify ideas. In a brainstorming session, the individual ideas are noted on cards and then classified according to the subject to which they relate. This focuses attention and concentration on individual aspects within the problem-solving process.

ii) Relationship diagram: These diagrams start with a central problem or a central idea and show the influencing factors or causes and their relationships. When the diagram is being developed, the possible causes are arranged as cards around the issue or problem. In the second step, cause and effect relationships between the cards are shown, thus revealing possible principal causes. The relationship diagram is suitable for the visualization of complex thought processes.

iii) Tree diagram: In an extension of functional analysis, the tree diagram describes relationships between targets and the measures needed to achieve them. Starting from the target under examination, possible solutions are shown as branches to the right. Each solution identified is then investigated in order to determine whether it represents an activity which can be executed directly. When this is not the case, the appropriate branch must be branched out more widely.

iv) Matrix diagram: A matrix diagram shows relationships and interactions between two factors. These diagrams are usually used to link two lists. A typical application for this mode of representation is the House of Quality used in QFD.

v) Portfolio (matrix data analysis): Matrix data analysis helps to reveal the hidden structure of a jumbled mass of information. The information recorded in a matrix diagram can be investigated in greater detail on the basis of defined criteria or dimensions in a portfolio.

vi) Problem decision plan: The problem or process decision plan is used to recognize potential problems at the planning phase and to develop preventive measures. Starting from the target specified, the points which are important to the success of the operation are discussed and investigated with a view to identifying possible problems and weighting them. Countermeasures must be developed for the items which have been prioritized.

vii) Network diagram: Network or arrow diagrams are suitable for presenting the individual operations in a project and their dependencies. The sequence is shown by operations to be conducted either sequentially or in parallel to one another.


2.3.3 Critical Factors Affecting TQM Implementation:

An attempt has been made to identify critical factors affecting TQM implementation. TQM tools and techniques can be grouped into six major categories as determined by their primary area of implementation focus (Fazel & Salegna17).

i) Customer-based strategies: Customer-based strategies should be the focal point of every TQM programme, around which all other strategies are formulated. Customer satisfaction is only likely to be achieved and maintained when the customer plays an active role in the organization's process of quality improvement. Major techniques used to accomplish this are customer needs analysis, customer surveys and quality function deployment.

ii) Management-based strategies: Management-based strategies are also extremely important for the successful implementation of TQM. TQM initiatives are not likely to succeed without strong leadership and support from top management. The goals and the benefits of implementing TQM must be clearly communicated by top management to the workforce. The alignment of the reward structure with the goals of the organization is also vital to the organization's success in achieving these goals.

iii) Employee-based strategies: Employee-based strategies provide a means of increasing the participatory role of workers. Strategies such as empowerment, teamwork and cross-training may result in employees having increased decision-making authority, greater job responsibilities, and increased motivation and sense of pride in their work. Quality programmes may also benefit from employee suggestions resulting from other group activities, including quality teams, quality circles, the nominal group technique and brainstorming.

iv) Supplier-based strategies: Supplier-based strategies provide a means of increasing an organization's likelihood of having suppliers who are reliable and willing to work towards the organization's goal of providing a quality product. Given the trend towards companies reducing the number of suppliers and cultivating long-term relationships with the remaining ones, these strategies are particularly important today.

v) Process-based strategies: Process-based strategies focus on improving processes by reducing waste, defect rates and cycle time, and by providing feedback on the performance of the process. Benchmarking, SPC and JIT are some of the most popular techniques employed by companies to achieve these goals.

vi) Product-based strategies: Product-based strategies are directly focused on the quality of the product, its physical characteristics and its manufacturability.

2.3.4 Quality Control via Defect Prevention, Design Modification and Process Modification:

Quality control via defect prevention, design modification and process modification has also been analyzed. The importance of designing for manufacturing is underlined by the fact that about 70% of the manufacturing costs of a product (cost of materials, processing, and assembly) are determined by design decisions, with production decisions (such as process planning or machine tool selection) responsible for only 20% (Suma53). This can be done by following the guidelines below:

i) Reduce the total number of parts: The reduction of the number of parts in a product is probably the best opportunity for reducing manufacturing costs. Fewer parts imply fewer purchases and less inventory, handling, processing time, development time, equipment, engineering time, assembly difficulty, service inspection, testing, etc.
In general, it reduces the level of intensity of all activities related to the product during its entire life. A part that does not need to have relative motion with respect to other parts, does not have to be made of a different material, and whose elimination would not make the assembly or service of other parts extremely difficult or impossible, is an excellent target for elimination. Some approaches to part-count reduction are based on the use of one-piece structures and the selection of manufacturing processes such as injection moulding, extrusion, precision casting, and powder metallurgy, among others.

ii) Develop a modular design: The use of modules in product design simplifies manufacturing activities such as inspection, testing, assembly, purchasing, redesign, maintenance, service, and so on. One reason is that modules add versatility to product updates in the redesign process, help run tests before the final assembly is put together, and allow the use of standard components to minimize product variations. However, the connections between modules can be a limiting factor when applying this rule.


iii) Use of standard components: Standard components are less expensive than custom-made items. The high availability of these components reduces product lead times. Also, their reliability factors are well ascertained. Furthermore, the use of standard components transfers the production pressure to the supplier, relieving in part the manufacturer's concern about meeting production schedules.

iv) Design parts to be multi-functional: Multi-functional parts reduce the total number of parts in a design, thus obtaining the benefits given in rule (i). Some examples are a part that acts both as an electric conductor and as a structural member, or as a heat-dissipating element and as a structural member. Also, there can be elements that, besides their principal function, have guiding, aligning or self-fixturing features to facilitate assembly, and/or reflective surfaces to facilitate inspection, etc.

v) Design parts for multi-use: In a manufacturing firm, different products can share parts that have been designed for multi-use. These parts can have the same or different functions when used in different products. In order to do this, it is necessary to identify the parts that are suitable for multi-use. For example, all the parts used in the firm (purchased or made) can be sorted into two groups, the first containing all the parts that are used commonly in all products. Then, part families are created by defining categories of similar parts in each group. The goal is to minimize the number of categories, the variations within the categories, and the number of design features within each variation. The result is a set of standard part families from which multi-use parts are created. After organizing all the parts into part families, the manufacturing processes are standardized for each part family. The production of a specific part belonging to a given part family then follows the manufacturing routing that has been set up for its family, skipping the operations that are not required for it. Furthermore, in design changes to existing products, and especially in new product designs, the standard multi-use components should be used.

vi) Design for ease of fabrication: Select the optimum combination of material and fabrication process to minimize the overall manufacturing cost. In general, final operations such as painting, polishing, finish machining, etc. should be avoided. Excessive tolerances, surface-finish requirements, etc. are commonly found problems that result in higher than necessary production costs.

vii) Avoid separate fasteners: The use of fasteners increases the cost of manufacturing a part due to the handling and feeding operations that have to be performed. Besides the high cost of the equipment required for them, these operations are not 100% successful, so they contribute to reducing the overall manufacturing efficiency. In general, fasteners should be avoided and replaced, for example, by using tabs or snap fits. If fasteners have to be used, then some guidelines should be followed for selecting them. Minimize the number, size and variation used; also, utilize standard components whenever possible. Avoid screws that are too long or too short, separate washers, tapped holes, and round heads and flatheads (not good for vacuum pickup). Self-tapping and chamfered screws are preferred because they improve placement success. Screws with vertical sides on their heads should be selected for vacuum pickup.

viii) Minimize assembly directions: All parts should be assembled from one direction.
If possible, the best way to add parts is from above, in a vertical direction, parallel to the gravitational direction (downward). In this way, the effects of gravity help the assembly process, rather than having to be compensated for when other directions are chosen.

ix) Maximize compliance: Errors can occur during insertion operations due to variations in part dimensions or in the accuracy of the positioning device used. This faulty behaviour can cause damage to the part and/or to the equipment. For this reason, it is necessary to include compliance in the part design and in the assembly process. Examples of part built-in compliance features include tapers or chamfers and moderate radius sizes to facilitate insertion, and non-functional external elements to help detect hidden features. For the assembly process, the selection of a rigid base part, tactile sensing capabilities, and vision systems are examples of compliance. A simple solution is to use high-quality parts with designed-in compliance, a rigid base part, and selective compliance in the assembly tool.

x) Minimize handling: Handling consists of positioning, orienting, and fixing a part or component. To facilitate orientation, symmetrical parts should be used whenever possible. If this is not possible, then the asymmetry must be exaggerated to avoid failures. Use external guiding features to help the orientation of a part. The subsequent operations should be designed so that the orientation of the part is maintained. Also, magazines, tube feeders, part strips, and so on, should be used to keep this orientation between operations. Avoid using flexible parts – use slave circuit boards instead. If cables have to be used, then include a dummy connector to plug the cable into (for robotic assembly) so that it can be located easily.


When designing the product, try to minimize the flow of material, waste, parts, and so on, in the manufacturing operation; also, take packaging into account, and select appropriate and safe packaging for the product.

2.3.5 Statistical Process Control:

Statistical process control techniques try to establish the pattern of process variability and use it to predict process quality and to apply the necessary process corrections to keep the process under control. Statistical process control (SPC) is an optimization philosophy centred on using a variety of statistical tools to enable continuous process improvement. Closely linked to the total quality management (TQM) philosophy, SPC helps firms to improve profitability by improving process and product quality. Although SPC is enabled by statistical analysis, the management philosophy that underlies SPC is much broader than a set of statistics. To improve a process systematically, managers must first identify key processes and key variables of interest. Every organization has hundreds, if not thousands, of processes and variables that can affect product and service outcomes, and one challenge is to focus on the processes and variables that are of key concern. SPC tools can be useful in identifying areas that need attention, but managerial insight is needed to use the SPC tools strategically. Managers can directly influence organizational performance using SPC practices. Their choice of key processes and performance variables creates a feed-forward signalling device to the organization about key performance indicators. This causes attention to be paid to these processes and variables. Feedback is then received through the SPC information, enabling evaluation of the data and an opportunity for corrective actions to be taken. Thus, SPC is not merely a set of statistical tools, but a management philosophy that helps organizations to improve performance through feed-forward and feedback loops. The SPC tools below can be used in process evaluation and improvement (Srinivasu50).

i) SPC Tools: Seven tools used in SPC are discussed below.

a) Flowcharts: Flowcharts depict the progress of work through a series of defined steps. They can be used to communicate a process to employees who are being trained for the work, and management can use them to evaluate process flows, constraints, and gaps.

b) Pareto Charts: Pareto charts are graphical demonstrations of occurrences, with the most frequently occurring event to the left and less frequent occurrences to the right. They rest on the Pareto principle, which says that about 80% of outcomes are typically created by about 20% of causes. By constructing a Pareto chart, managers can quickly see which problems are most prevalent in their organizations.

c) Ishikawa Cause-and-Effect or Fishbone Diagrams: These diagrams depict an array of potential causes of quality problems. The problem (the head of the fish) is displayed on the right, and the bones of the fish, representing the potential causes of the problem, are drawn to the left. Potential causes are often categorized as materials, equipment, people, environment, and management. Other categories may be included as appropriate. Useful in brainstorming the causes of problems (including potential problems) from multiple perspectives, these diagrams should include all possible reasons for a problem. When completed, further analysis is done to identify the root cause.

d) Run Charts: Run charts are graphical plots of a variable over time. These charts can be made for a single variable, but they are useful in detecting trends or relationships between variables when two are included on the same run chart.

e) Control Charts: Control charts combine expanded run chart information with statistical control data to help identify process variation over a period of time that is not likely due to random chance. Time can be defined as a production run, a series of batches, a day's activities, or any relevant time period that captures the process being evaluated. Useful in manufacturing, administrative, and service functions, control charts provide rapid feedback on key variables of interest. Control charts are used to show when a process is in, or out of, statistical control. Statistical control does not imply zero variation; some degree of variation is normal and it is unrealistic to expect zero variation. However, the control chart is able to demonstrate data patterns that indicate that a process is out of control, and it is useful as a tool for making continuous improvement by reducing variability. The most commonly employed control charts are the mean chart and the range chart, often referred to as X-bar and R-charts. The mean chart (X-bar chart) shows the variation in a process by plotting the actual mean values of a set of sample data. The range chart (R-chart) is similar to the mean chart in having upper and lower (three-sigma) control limits, but the data plotted for each sample are now the range between the largest and the smallest value in the sample. By plotting the range of values, variation within each sample is more apparent.


f) Process Capability Analysis: Process capability analysis is a technique that is used to determine the ability of a process to meet product or service specifications. It is a useful tool for evaluating variation within a process and whether improvements can be made to process control. Although a process may be within the control limits determined by control chart data, capability analysis takes things a step further by evaluating the amount of variation in process outcomes (the product or service) compared to the capability of the process.

g) Taguchi Loss Function: The Taguchi loss function is based on the assumption that all variation has a cost, even when the variation does not violate the data patterns defined by control charts. This concept is most useful where deviations from expectation are expected to be costly. Taguchi posited that all deviations from target values ultimately result in customer dissatisfaction. The Taguchi loss function enables organizations to calculate the financial consequence of process variability, making it useful in reaching design decisions.

ii) Procedure for SPC: The following gives an idea of the procedure to be followed for SPC.

a) Identify the problem:
• Flowcharts identify and communicate information about the flow of a process, including constraints and gaps.
• Pareto analysis identifies the issues that are causing most of the problems.

b) Identify the reasons for the problem:
• Use Ishikawa cause-and-effect diagrams to brainstorm the causes of a problem from a multidimensional perspective.

c) Analyze the data:
• Run charts show the variability in data over time and the potential relationships between multiple variables.
• Control charts identify process variation using a set of statistical tools, enabling the identification of out-of-control variation.
• Process capability analysis is used to show the amount of variation in an in-control process, and can be useful in improving a process.
• The Taguchi loss function assigns an economic value to variation, helping to make trade-off decisions in process and product design.
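As a concrete illustration of the control-chart, process-capability and Taguchi-loss calculations described above, the following Python sketch estimates X-bar and R control limits from subgroup data using the standard control-chart constants for a subgroup size of five (A2 = 0.577, D3 = 0, D4 = 2.114), and then computes Cp, Cpk and the expected Taguchi loss. All measurements, the specification limits and the loss coefficient k are hypothetical values chosen only to show the arithmetic.

import statistics

# Hypothetical subgroup measurements (five readings per subgroup).
samples = [
    [10.02, 9.98, 10.05, 9.97, 10.01],
    [10.03, 9.99, 10.00, 10.04, 9.96],
    [10.01, 10.02, 9.98, 10.00, 10.03],
    [9.99, 10.01, 10.02, 9.97, 10.00],
]
A2, D3, D4 = 0.577, 0.0, 2.114      # control-chart constants for subgroup size n = 5

x_bars = [statistics.mean(s) for s in samples]   # subgroup means
ranges = [max(s) - min(s) for s in samples]      # subgroup ranges
x_bar_bar = statistics.mean(x_bars)              # centre line of the X-bar chart
r_bar = statistics.mean(ranges)                  # centre line of the R chart

# Three-sigma control limits estimated from the mean range.
ucl_x, lcl_x = x_bar_bar + A2 * r_bar, x_bar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar
print(f"X-bar chart: CL={x_bar_bar:.3f} UCL={ucl_x:.3f} LCL={lcl_x:.3f}")
print(f"R chart:     CL={r_bar:.3f} UCL={ucl_r:.3f} LCL={lcl_r:.3f}")

# Process capability against hypothetical specification limits.
LSL, USL, target = 9.90, 10.10, 10.00
all_values = [x for s in samples for x in s]
mu = statistics.mean(all_values)
sigma = statistics.stdev(all_values)
cp = (USL - LSL) / (6 * sigma)               # potential capability
cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # actual capability (penalizes off-centring)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")

# Taguchi loss: every deviation from target carries a cost, even inside the limits.
k = 50.0                                     # assumed loss coefficient
expected_loss = k * (sigma ** 2 + (mu - target) ** 2)
print(f"Expected Taguchi loss per unit = {expected_loss:.4f}")

Here Cp compares the specification width with the natural six-sigma spread of the process, Cpk additionally penalizes an off-centre mean, and the expected loss k(sigma squared plus the squared offset from target) attaches a cost to variation even when every reading lies within the specification limits.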
2.3.6 Impact of Quality Management Practices:

The impact of quality management practices such as CIM, TPM, Just in Time, 5S tools, quality circles, zero defects and ISO implementation on the performance of small and medium enterprises (SMEs) has also been reviewed.

i) OR Models and Methods in the CIM Context: TQM is a strategic competitive philosophy with significant implications for CIM design and management. The basis of TQM is customer satisfaction, so it is important that all entities within the organization synchronize their operations to achieve this objective. CIM, a major tool for achieving competitive advantage, should be synergized with the TQM philosophy to attain this ultimate goal, and OR models and methods are directly applicable in CIM (Puranik and Ghosh38). Of course, the first application concerns order management. For a long time, when demand was greater than supply, it was essential to forecast the demand in order to release the production of parts before receiving the customer order. Such a forecast was possible thanks to forecasting methods. However, in today's situation the forecast is not so important: although it is still necessary to know the quantity of basic parts which will be useful in the next production period, the specific components of the product are made only when the customer order is real. In product design, some travelling salesman problems have to be solved. Indeed, when several holes are made with the same tool on a face of a product, the path of the tool can be minimized by solving a TSP (an illustrative sketch follows at the end of this subsection). It would be interesting to study the problems linked to product design more deeply, because there are probably many other cases which could be solved with OR methods. Concerning planning, it is necessary to distinguish two cases: if the part is a prototype or the part is unique, the most efficient method is to use a PERT (Program Evaluation and Review Technique) model; otherwise, the planning of the production of several parts in different quantities can be handled with the familiar Gantt charts. As explained before, some new concepts have appeared in the last twenty years, like JIT (Just-in-Time) and OPT (Optimized Production Technology), which give less importance to the planning module. There exist some interesting models combining planning and scheduling problems which could be applied in the mechanical industry. Scheduling problems have held the attention of a huge number of researchers over the last forty years, since Johnson proposed optimal two- and three-stage production schedules. Many heuristics have been developed to solve flow shop, job shop and open shop problems. Unfortunately, none of these heuristics is a general model: they deal with special cases, each with specific constraints. Even though shop floor control is a really complex problem, it seems possible to introduce some algorithms or heuristic methods in this context. In contrast, the links between OR models and supply logistics and inventory management are easy to show, as these logistics aspects are two of the basic elements in a firm. Finally, distribution logistics is becoming a new topic of research, as different constraints occur in this case. The aim here is not to solve a "simple" vehicle routing problem, but to take into account the constraints given by the customers (like time windows for delivery) or those generated by the territory (the geography of the country).
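As an illustration of the drilling-path idea mentioned above, the sketch below treats the holes on a face as the cities of a travelling salesman problem and orders them with a simple nearest-neighbour heuristic; the hole coordinates are hypothetical, and the heuristic gives a short (not necessarily minimal) tool path.

import math

# Illustrative sketch: order drilling positions with a nearest-neighbour TSP heuristic.
# Hole coordinates are hypothetical.
holes = [(0.0, 0.0), (4.0, 1.0), (1.0, 3.0), (5.0, 4.0), (2.0, 1.0), (3.0, 5.0)]

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(points, start=0):
    unvisited = set(range(len(points)))
    tour = [start]
    unvisited.remove(start)
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: distance(points[last], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbour_tour(holes)
length = sum(distance(holes[tour[i]], holes[tour[i + 1]]) for i in range(len(tour) - 1))
print("Drilling order:", tour)
print(f"Tool travel (open path): {length:.2f}")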


ii) TPM: TPM is designed to maximize equipment effectiveness (improving overall efficiency) by establishing a comprehensive productive-maintenance system covering the entire life of the equipment, spanning all equipment-related fields (planning, use, maintenance, etc.) and, with the participation of all employees from top management down to shop-floor workers, promoting productive maintenance through motivation management and voluntary small-group activities. TPM provides a comprehensive company-wide approach to maintenance management, which can be divided into long-term and short-term elements (Narender and Gupta34). In the long term, efforts focus on new equipment design and the elimination of sources of lost equipment time, and typically require the involvement of many areas of the organization. Here, the focus is on the short-term maintenance efforts that are normally found at the plant level of the organization. In the short term, TPM activities include an autonomous maintenance program for the production department and a planned maintenance program for the maintenance department. The basic purpose of TPM is to increase plant and machine uptime. It is implemented in three phases: 1) autonomous maintenance, 2) preventive maintenance and 3) predictive maintenance. Maintenance engineers often complain about machines being tampered with by operators for various reasons. The purpose of autonomous maintenance is to instill a sense of machine ownership among the operators. The operator fills in a general check sheet at the start and end of the shift to ensure that the machine is received in OK condition. As per the checklist, the operator carries out basic machine cleaning and inspects the oil level, needle number, abnormal vibrations, SPI, skip stitches, etc. This ensures that the output is right first time, every time. Preventive maintenance is planned periodic maintenance derived from manuals and past experience. Machine history cards are maintained to identify the root causes of recurring breakdowns. Based on the analysis of the history card, the preventive maintenance checklist is modified to suit the in-house working conditions (temperature/humidity, etc.). Key performance indicators for maintenance can be mean time to repair (MTTR), mean time between breakdowns (MTBB) and plant up time. The utility department should include humidifiers, trolleys, stackers, machine lifters, etc. in the preventive maintenance schedule. Trolleys being dragged on non-functioning or noisy wheels are a common sight in many factories. This can be avoided by including them in the preventive maintenance schedule. Preventive maintenance, when part of a world-class manufacturing strategy that incorporates JIT and TQM, should lead to improved manufacturing performance (MP). The importance of the relationship between JIT and TPM is clear. JIT's emphasis on waste reduction creates an environment where inventory is reduced, production processes are interdependent, and the plant operation is susceptible to breakdowns of any process. TPM provides dependable equipment, reduces the number of production disturbances, and increases plant capacity by providing effective equipment maintenance. TPM practices thus indirectly influence MP by supporting JIT practices. The relationship between TPM and TQM is also important. TQM aims to reduce variation in the product and eliminate defects. A strong maintenance program is needed to provide reliable equipment maintenance and reduce equipment process variation. Quality practices focusing solely on quality improvement might not be a sufficient means for a plant to attain and sustain its competitive position. It is likely that the use of TPM to improve equipment performance and increase the skills of workers could be an additional factor in supporting TQM and explaining competitive advantage. Therefore, it can be argued that TPM indirectly improves MP by supporting TQM efforts.
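The maintenance KPIs mentioned above (MTTR, MTBB and plant up time) reduce to simple arithmetic on a breakdown log, as in the short Python sketch below; the operating hours and downtime figures are hypothetical.

# Minimal sketch of maintenance KPIs: MTTR, MTBB and plant up time.
# The breakdown log below is hypothetical: repair time (hours) for each breakdown.
planned_operating_hours = 720.0              # e.g. one month of three-shift running
breakdown_downtimes = [1.5, 0.5, 3.0, 2.0]   # repair time per breakdown, in hours

number_of_breakdowns = len(breakdown_downtimes)
total_downtime = sum(breakdown_downtimes)
uptime = planned_operating_hours - total_downtime

mttr = total_downtime / number_of_breakdowns     # mean time to repair
mtbb = uptime / number_of_breakdowns             # mean time between breakdowns
availability = uptime / planned_operating_hours  # plant up time as a fraction

print(f"MTTR = {mttr:.2f} h, MTBB = {mtbb:.2f} h, uptime = {availability:.1%}")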
iii) Just-in-Time (JIT) Systems: The implementation of JIT as a strategic management technology has also been discussed. As it can be considered part of the TQM effort, JIT manufacturing represents an approach to improving the effectiveness and efficiency of an organisation's total operations function by aiming at waste reduction, improvement of product quality and better customer service. The JIT production system is a reasonable way of making products because it completely eliminates unnecessary production elements in order to reduce costs (Kootanaee et al.24). The basic and simple idea in such a production system is to produce the kind of units needed, at the time needed, in the quantities needed.


Even though the system's most important goal is cost reduction, three sub-goals must first be achieved. These are:

• quantity control, which enables the system to adjust to daily and monthly demand fluctuations in quantity and variety;

• quality assurance, which ensures that each process will supply only defect-free units to subsequent processes; and

• respect for the worker, which is the basis for increasing the human resource productivity necessary for attaining the system's cost objectives.

A special feature of the JIT production system is that the primary goal cannot be achieved without realizing these subgoals, and vice versa. A continuous flow of production adapting to demand changes in quantity and variety is obtained by producing the needed units in the needed quantities at the needed time, by never allowing defective units from a preceding process to flow into and disrupt a subsequent process, by varying the number of workers based on demand changes, and by capitalizing on worker suggestions. To implement these basic concepts, the JIT system includes the following subsystems and methods:

• the kanban system to maintain JIT production (a sizing sketch follows this list);
• a production-smoothing approach to allow adaptation to demand changes;
• shortened setup times that reduce production lead times;
• standardized operations to attain line balancing;
• machine layouts that promote multi-skilled workers and the flexible work-force concept;
• small-group improvement activities and a worker suggestion system to reduce the work force and increase morale, respectively;
• a visual control system to achieve autonomous defect control; and
• a "functional management" system that promotes company-wide quality control.
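For the kanban system listed first above, one commonly quoted textbook relation sizes the kanban loop as the demand rate multiplied by the replenishment lead time, times one plus a safety factor, divided by the container size. This relation is not taken from the surveyed text, and the figures in the sketch below are hypothetical.

import math

# Hedged illustration of sizing a kanban loop using the common textbook relation
#   number_of_kanbans = demand_rate * lead_time * (1 + safety_factor) / container_size
# All figures are hypothetical.
demand_per_hour = 60      # parts consumed per hour
lead_time_hours = 2.5     # replenishment lead time (production + transport)
safety_factor = 0.10      # buffer against demand and lead-time variation
container_size = 20       # parts per container / kanban

kanbans = math.ceil(demand_per_hour * lead_time_hours * (1 + safety_factor) / container_size)
print(f"Kanban cards required: {kanbans}")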

iv) ISO for TQM: To attain ISO certification, it is critical to integrate the effort with TQM. Implementing ISO 9000 alone does not contribute much to quality improvement, while a combination of ISO 9000 and TQM contributes the most (Prabhu et al.36). It has been recognized that the emphasis on quality has led organizations to adopt TQM. Moreover, organizations and customers have demanded external recognition of quality, which has in turn provided the momentum for the International Organization for Standardization (ISO) development of the ISO 9000 series. ISO 9000 represents a trend in quality management which cannot be ignored in today's business environment. In fact, those companies wishing to remain competitive and improve their quality systems are recommended to use ISO 9000 as a foundation for a much broader system of TQM. This is based on the fact that ISO 9000 is an important part of TQM, and the implementation of both approaches together will lead to organizational success and competitive advantage. It is clear that both approaches tend to complement each other. ISO 9000 can be implemented first to create stability and consistency in the organisation's work; the implementation of TQM can then enhance employee motivation and operational efficiency, and achieve overall organizational success and performance. On the other hand, it appears that some managers have misunderstood the role of ISO 9000 certification. One possible explanation for this misunderstanding is that managers fail to distinguish between conformance and performance specifications. These organizations seek certification because of pressure from their customers and government policy. The quality of their products and services may improve in the short term; however, they are unlikely to improve and sustain their organizational performance. In these cases, ISO 9000 certification is a hollow achievement in the long run. Companies that implement ISO 9000 and TQM at the same time and in an integrated manner might expect to have advantages in product quality, delivery, productivity and customer satisfaction. ISO 9000 certification is only the beginning of a continuous improvement process rather than the end, and can be a useful stepping stone for TQM. ISO 9000 can be an excellent start to TQM if it is interpreted in a way that encourages the company

v) 5S: 5S is a system for reducing waste and optimizing productivity by maintaining an orderly workplace and using visual cues to achieve more consistent operational results (Michalska and Szewieczek31). The term refers to five steps – sort, set in order, shine, standardize, and sustain – that are also sometimes known as the five pillars of a visual workplace. 5S programs are usually implemented by small teams working together to get materials closer to operations, right at workers' fingertips, organized and labelled to facilitate operations with the smallest amount of wasted time and materials. The 5S system is a good starting point for all improvement efforts aiming to drive out waste from the manufacturing process, and ultimately to improve a company's bottom line by improving products and services and lowering costs.


Many companies are seeking to make operations more efficient, and the concept is especially attractive to older manufacturing facilities looking to improve the bottom line by reducing their costs. 5S means:

• Seiri (sorting, organization of the workplace, elimination of unnecessary materials). Refers to the practice of sorting through all the tools, materials, etc., in the work area and keeping only essential items. Everything else is stored or discarded. This leads to fewer hazards and less clutter to interfere with productive work.

• Seiton (set in order, a place for everything). Focuses on the need to keep the workplace in order. Tools, equipment, and materials must be systematically arranged for the easiest and most efficient access. There must be a place for everything, and everything must be in its place.

• Seiso (shine, cleaning, removing of wastes, dust etc.). Indicates the need to keep the workplace clean as well as neat. Cleaning in Japanese companies is a daily activity. At the end of each shift, the work area is cleaned up and everything is restored to its place.

• Seiketsu (standardize, constant place for things, constant rules of organization, storage and keeping cleanness). Allows for control and consistency. Basic housekeeping standards apply everywhere in the facility. Everyone knows exactly what his or her responsibilities are. Housekeeping duties are part of regular work routines.

• Shitsuke (sustain, automatic realization of above-mentioned rules). Refers to maintaining standards and keeping the facility in safe and efficient order day after day, year after year.

The 5S methodology is a simple and universal approach that works in companies all over the world. It is essentially a support for other manufacturing improvements such as just-in-time (JIT) production, cellular manufacturing, total quality management (TQM) or six sigma initiatives, and it is also a great contributor to making the workplace a better place to spend time.

vi) Zero-defects program: In addition to the concept underlying the zero-defects program and its approaches to changing ways of thinking, measures must also be implemented and changes must be introduced to the way employees act. These measures can be grouped into the four main elements of a zero-defects program:

- To create conditions for fault-free work.

- To introduce fault-avoidance techniques.

- To systematically eliminate existing faults.

- To investigate particularly good results. The most basic step in a zero-defects program is to create the conditions necessary for fault-free work. In relation to the employees, a zero-defects program can be successful only when the staff want to, are able to and are authorized to implement the program. Wanting implies awareness and motivation. Ability presupposes the employee has specialist, methodological and social skills. The third element, authorization, demonstrates clearly that the employees must be given adequate permission as well as flexibility to act and make decision if they are to be in a position to implement the zero-defects program. In addition to the employee-related conditions, there are some organizational conditions which must be met. These include, for example, the fact that it is necessary to create the working conditions required for avoiding errors, that the relevant information must be made available or that appropriate plant, machines and installations must be in place to permit fault-free work to be undertaken. Techniques of avoiding faults are the second element of a zero-defects program. The QM techniques from the early phases of product life cycle, such as Design Reviews, Fault Tree Analysis and FMEA, play an important role, as do the QM techniques used in manufacturing, such as Design of Experiments, Capability tests and the management of testing equipment in the broader sense. Many of these techniques are covered in Part B of this book. They are therefore not described in greater detail at this point. The third element of zero-defects programs is the systematic elimination of existing errors. The view that this means getting rid of faults which have occurred in order to be able to offer the customer fault-free products and services is quickly dispelled. It is the elimination of the cause of the fault which is of real significance and which is vital to the medium and long-term success. The aim is to ensure that faults are not repeated, and that both time and money are not wasted over and over. To achieve this aim, it is necessary to analyze the reasons for the fault. A number of techniques and tools, such as the seven statistical tools, are available for this fault diagnosis operation as well as for fault avoidance. In many companies, it has proved helpful to use specially trained employees with high levels of specialist, methodological and social skills for this task, especially where complex cause-and-effect relationships are concerned.

The fourth element of the zero-defects program, the investigation of particularly good results, is also an integral part of modern benchmarking concepts. This involves analysis and documentation of the factors which are responsible for the achievement of excellent results, such as particular working conditions, processes, techniques or staff training. The objective of this element of zero-defects production is to transfer best practice to other areas or processes and to introduce it as standard practice throughout the organization. Whereas the analysis of best practice in zero-defects programs is limited to the company concerned, it is frequently compared with the best practice of other companies in benchmarking projects. vii) Six Sigma: Like the zero-defects program, this approach focuses on faults or deviations. However, in contrast to the zero-defects program, the Six Sigma approach defines a value which states how many faults can be regarded as acceptable. This value is so low that it is very close to the zero-defects target. The processes in the company form the reference quantities and starting point for this value and for all improvements in the Six Sigma approach. The approach is not restricted to manufacturing processes but encompasses all business processes. But what does Six Sigma mean and when does a process reach the Six Sigma level aspired to? The number of acceptable faults or deviations occurring in a process is determined by calculating the standard deviation of the results of the process; since this standard deviation and the associated methods, such as Statistical Process Control, apply only to stable processes, a corresponding fault rate is calculated in the Six Sigma approach. The fault rate equivalent to a Six Sigma level is two faults per thousand million (one billion) possible faults. This corresponds to 0.002 ppm (parts per million) or 0.0000002 percent. The value of 3.4 ppm frequently quoted in the literature on Six Sigma results from the assumption that the average value of the process results can deviate by 1.5 σ from the nominal value. Assuming this is indeed the case, a Six Sigma level is reached by a process when fewer than 3.4 faults per 1 million possible faults actually occur. The aim of the Six Sigma program is to achieve the highest possible Sigma level, in other words the lowest fault rate, for all processes under examination (Hammer and Goding21). This target is clear and readily understandable. However, when the number of steps needed in order to determine the Sigma level or the fault rate in a process is investigated, it is instantly apparent that a Six Sigma program is also associated with a high level of organization as well as considerable costs, both financial and in personnel. 2.4.0 Overview of TQM for Planning and Demand Synchronization: Based on the various TQM principles outlined in this chapter, the following two cases have been considered with relevant detailing: 2.4.1 Hoshin Process of Long-Range Strategic Planning: In today’s business environment, the industry must, in effect, keep “both hands on the wheel” to move forward successfully. Hoshin Kanri provides a planning structure for bringing critical processes up to the desired level of performance, exceeding the customer’s expectations and staying competitive through a long-range strategic plan. Globalization is posing several challenges to the manufacturing sector. Design and operation of manufacturing systems are of great importance. In this context, companies need to synchronize their manufacturing activities.
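As a numerical check of the Six Sigma fault rates quoted above, the following sketch (an illustration only, assuming normally distributed process results and using SciPy’s standard normal distribution) reproduces both the roughly two-faults-per-billion figure for a centred process and the commonly quoted 3.4 ppm figure for a process whose mean is allowed to drift by 1.5 σ:

```python
from scipy.stats import norm

# Centred process: both specification limits sit 6 standard deviations from the mean,
# so the fault probability is the area in the two tails beyond +/- 6 sigma.
p_centred = 2 * norm.sf(6.0)
print(f"centred 6-sigma process  : {p_centred:.2e}"
      f"  (~{p_centred * 1e9:.1f} faults per billion, {p_centred * 1e6:.4f} ppm)")

# Shifted process: the long-term mean is assumed to drift 1.5 sigma towards one limit,
# leaving only 4.5 sigma of margin on that side (the far tail is negligible).
p_shifted = norm.sf(6.0 - 1.5)
print(f"1.5-sigma shifted process: {p_shifted:.2e}"
      f"  (~{p_shifted * 1e6:.1f} ppm)")
```

The two printed results correspond to the 0.002 ppm and 3.4 ppm values discussed in the Six Sigma passage above.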
Hoshin Kanri is an organizing framework for strategy management (Puranik40). It is concerned with four primary tasks and the cycle is an annual one. First, it focuses an organization’s attention on corporate direction by setting, annually, a vital few strategic priorities; secondly, it aligns these with local plans and programmes; thirdly, it integrates them with daily management; and finally, it provides for a structured review of their progress. Thus it is Focus – Alignment – Integration – Review. In the Hoshin process, strategic planning is systematized: the format of the plans is unified via standards. The standardization provides a structured approach for developing and producing the organization's strategic plan. The structure and standards also enable an efficient linkage of the strategic plan through the organization. This ultimately leads to an organization-wide understanding of not just the plan but also the planning process. This also holds true for the methodology used to review and track the plan's progress. This built-in standardization enables the organization to evaluate decisions made by the organization's leaders and to gauge the effectiveness of selected strategies. Because the review process emphasizes not only results but also how decisions are reached, the organization can identify successful decision-making methods and practices. The review methodology is essentially a built-in benchmarking process for the organization's decision making. Hoshin is not an all-encompassing management system; rather, Hoshin is about change. Hoshin therefore does not have a role to play in day-to-day management and control; this is done via ongoing process standardization (albeit with some incremental improvement) and the maintenance of a company-wide management system. The link between Hoshin and Daily Management can be described as a move from Plan, Do, Check, Act (PDCA) to Standardize, Do, Check, Act (SDCA). PDCA is about improvement. Once that improvement has been achieved (i.e. the PDCA cycle

has generated the required performance for the process), then the new level of performance should be maintained via the SDCA cycle. The process in question will stay in the SDCA cycle until such time as a further improvement in its performance is required. Issues addressed within the Hoshin plan one year move back into the daily management process the next and may stay there for several years until next in need of focused attention, as expressed in the following diagram. Hoshin is a valuable addition to the arena of Strategic Change Management. As such, it encompasses many ‘traditional’ approaches to strategic analysis and the formulation of the organization’s strategic response to that analysis. Hoshin brings value to this management arena in the degree to which it supports implementation as well as formulation, and it is particularly powerful in creating alignment to ensure that the whole organization is working towards the same strategic goals. Hoshin’s link with the arena of TQM is strong in the sense that it makes use of well-established TQM principles but applies these in a more strategic context than many TQM users have previously been able to achieve. In particular, the principles of employee involvement, cause-and-effect analysis, establishing measures and setting targets, and the concept of deploying the needs of the customer through techniques such as QFD (matrix deployment) and Kano analysis all link Hoshin to TQM and increase the power of the Hoshin approach. i) Common model for implementing Hoshin: The following section is based on a benchmarking analysis of several applications of Hoshin Planning. The analysis has helped distil out distinctive features and common generic practices, which have formed the basis for identifying the critical factors of implementation in so far as Hoshin Planning is concerned. These factors are therefore important rules that need to be adhered to so that the benefits promised by Hoshin can be achieved. a) Starting with a Vision of the future: Most cases describe some form of Visioning as the start point for implementation and virtually all the companies reviewed took this approach. This is consistent with the long-term approach to business reported by observers of Japanese business practice, where plans stretching in excess of twenty years are not uncommon. Having this clarity of direction is essential to the successful implementation of Hoshin, since the annual Vital Few goals are selected on the basis of moving the company in the overall direction indicated in the Vision. Using the remaining elements of Hoshin without this Vision in place is likely to reduce the strategic impact of the approach, downgrading it to an annual implementation tool rather than a long-term change tool. b) Using Hoshin to drive change, not as a general management tool: This critical aspect of Hoshin is promoted most strongly by Hewlett Packard but is also supported by many of the smaller companies. Hewlett Packard’s approach of developing a ‘Business Fundamentals’ table has found its way into several other companies and this reflects their international standing as a company to study when seeking best practice. Others appear to run their Hoshin and general management systems close together and this reflects their realistic view that, given finite human resources to work on non-client issues, they must be sure that the business remains broadly under control while change projects get special attention.
c) Using Hoshin for annual strategic planning alongside business planning: Most companies have some form of annual business planning process and Hoshin does not replace this in any of the companies reviewed. It is seen to be practical by most of the companies, however, to run the two systems as parallel annual cycles. In some cases, the Vision contains clear direction for the growth of the company in terms of revenue and this drives an annual process of agreeing revenue targets for their five operational profit centers. This is not, however, considered to be part of the Hoshin process. Hoshin is brought to bear once the annual targets have been agreed and is used to identify the changes that must be made above and beyond what the company did last year so that the improved performance levels are reached. Whatever its position, Hoshin is clearly seen as a cyclical process, with all of the companies following a recognizable Think, Plan, Implement and Review pattern which can be likened to the Shewhart/Deming PDCA cycle. The difference between Hoshin and less strategic Quality Management approaches is in the extent to which this annual planning activity is guided by the long-term Vision for the business. A key test as to whether a goal should be selected as one of the Vital Few is the strength of its support for the strategic intent of the business. d) Identification of a limited number of Vital Few goals: This element of Hoshin comes out strongly in all of the research companies, particularly the language of the ‘Vital Few’ (or Critical Success Factors in the case of Exxon). It is commonly reported among the companies that there is a tendency to pick too many Vital Few goals, particularly in the first year, and many have had to reduce the number selected. e) Definition and prioritisation of the Vital Few via Catchball:

This practice shows more variation between the cases considered, although it is recognized by all as a critical requirement if the full benefit of Hoshin is to be felt. Xerox has a sophisticated and well-developed system for catchball. f) Use of Cause and Effect thinking in establishing implementation programmes: The concept of Cause and Effect is one of the elements of Hoshin that link it closely with TQM and it is apparent that the users appreciate this as a strong driver of meaningful change. All of the cases analyzed make use of TQM-style quality tools and techniques when building their Hoshin plans, including Affinity diagrams, Fishbone diagrams, ‘5-Whys’, Relationship charts and force-field diagrams. All of the cases also place strong emphasis on ‘Management by Fact’, and the process of setting goals and picking key projects is fed with extensive real data on external market conditions and internal performance. g) Method and scope of deployment: There are clear differences between some of the cases studied in terms of their approach to implementation deep into their organization structure. Perhaps unsurprisingly, there does appear to be a link between the size of the company and the extent to which multi-level catchball and deployment are practiced. Xerox, for instance, reports extensive efforts in deployment, brought to life through the ‘Blue Book’ which contains objectives for all four levels of the business, from corporate, through unit and department, to team and individual. Smaller businesses appear to take a less formal, hierarchical approach to deployment. h) Team-based implementation: Another stated similarity between TQM and Hoshin is in the extent of teamwork. Teams are clearly an important element of the real business of implementation in all of the cases covered. The use of teams is recognized also as an important factor in the softer ‘change management’ elements of Hoshin. Successful implementation is not just a matter of clear objectives and adequate resources but is perhaps most dependent on the commitment of the people responsible for making the changes. Involvement in these teams, both at the levels of goal setting and implementation planning, encourages commitment and has brought many reported benefits to the companies using this aspect of Hoshin. i) Monitoring progress through measurement and review: This is one of the strongest features of the Hoshin approach among all the cases concerned. Each has developed its own system for tracking and reviewing progress but all nevertheless attach a high priority to it. For many, the Deployment Matrix is used as the core basis for review and the ‘Red, Amber, Green’ or ‘Traffic Light’ system is in common use across the companies. This system can be applied directly to the Vital Few and their supporting projects since all these elements are assigned measures and targets when they are initially developed. Many of the cases have taken this approach further by developing Visual Management techniques allowing everyone in the company to see clearly where progress is and is not being made. 2.4.2 Achieving Co-Ordination between Production Rate and Demand Rate in Manufacturing System: In this era of globalization there are several challenges facing the manufacturing sector. Complexity in taking decisions due to the immense availability of information, randomness in the system which affects performance, and heterogeneity in the events occurring all make modeling for performance prediction difficult.
Market globalization has forced companies to become more effective and efficient in their operations all over the world. Information and communication technology advances are changing the nature of the marketplace. The virtual market has become a reality. Excess inventories, long lead times and uncertain delivery dates are caused by randomness and lack of co-ordination. There are only two possible solutions: reduce the randomness (due to machine failures, engineering changes, customer orders and so on) and the reasons for the lack of co-ordination (costly set-up changes, large batch machines and others), or respond to them in a way that limits their disruptive effects. Both responses are valid, but they can be, in practice, polar opposites. In this context, companies need to synchronize their manufacturing activities with the variable product demand (Puranik and Ghosh39). Several business functions and activities are involved in coordinating production rate with demand rate. i) Preliminary considerations for allocating and integrating resources to ensure an effective co-ordination: Preliminary considerations about customer demand and about manufacturing, logistics and distribution systems help to explain how companies can allocate and integrate resources to ensure effective co-ordination. a) Customer Demand: Customers will shift to and do business with the manufacturer that provides the highest value. Customers are showing increasing interest in response times. Response times can basically be related to two time variables: the lead time to supply a product to the customer for a specific order and the time to develop a product from concept to

final market delivery. Any time beyond the time necessary to add value to the product represents excess costs and waste to both the supplier and the customer. A customer will pay a premium to the supplier who can supply its product faster and more reliably than the competition. As a consequence of the change from a “seller’s market” to a “buyer’s market” and of accelerated technological developments, enterprises are forced to reorganize their business processes to be able to react quickly and cost-effectively to fast-changing market demands. b) Manufacturing Systems: In industrial practice, a manufacturing system accomplishes one or more of three things: it converts inputs into outputs, it moves material from one location to another, and it transfers information. The challenge for the manufacturing organization is to integrate and synchronize all these tasks and align them with the company’s chosen goals, as well as to do that better than most or all of its competitors. Coordinating departments and integrating operations can require changes in the location and sequencing of equipment, reductions in setup times, faster and more accurate information about production problems, and improvements in process quality. The complexity and uncertainty in manufacturing have led over the years to an array of tools, techniques and ways of structuring manufacturing systems: job shop, project shop, cellular system, flow line, and continuous system. Each of these approaches also corresponds to a different situation in the co-ordination of production rate with demand rate (for instance, a flow line is suitable for a stable demand rate and standard products, and a job shop for an unstable demand rate and customized products). In the actual manufacturing world, these standard system structures often occur in combinations, or with slight changes. The choice of the structure depends on the design of the parts to be manufactured, the lot sizes of the parts, and the market factors (i.e., the required responsiveness to market changes). c) Logistics: A company’s performance can be strongly affected by logistics and distribution processes. The co-ordination of production rate with demand rate is strictly related to the effectiveness of logistics and distribution processes. To reduce stocks and response times and to increase efficiency, in the large consumer market the logistics function can play the critical role of coordinating the distribution and production planning activities. The integration of all logistics processes, from the acquisition of raw materials to the distribution of end-customer products, makes up a logistic chain consisting of multiple actors. Developments in telecommunication and information technology (IT) have created many opportunities to increase the integration and the performance of the total logistic chain, providing each participant (from material supplier to end-customer) with benefits. d) Computer-Integrated Manufacturing: A CIM system includes components such as computer-aided design, computer-aided process planning, database technology and management, expert systems, information flow, computer-aided production control, automated inspection methods, process and adaptive control, and robots. All of these elements work together using a common database. However, as the degree of automation grows, the production control function becomes the key to the success of the manufacturing facility. The co-ordination of production rate with demand rate has to be assured also by an effective order-handling system.
Computer-based integration requires the redesign of the organization before flexible technology is implemented. In fact, flexible technology alone does not address the causes of manufacturing problems. A correct approach should begin by examining and eventually redesigning the interdependencies that exist among functional units within the overall manufacturing organization. Then, investments in flexible technologies should be considered as the last resort. e) Lean Production: From the point of view of co-ordination between production and demand rates, lean production systems are customer driven. All activities are team-based, coordinated, and evaluated by the flow of work through the team or the plant, rather than by each department meeting its plan targets in isolation. The whole system involves fewer actors (including suppliers), all integrated with each other. The system is based on stable production volumes, but with a great deal of flexibility. Due to global competition, faster product development, and increasingly flexible manufacturing systems, a large variety of products are competing in markets. Despite the benefits to consumers, this phenomenon is making it more difficult for manufacturers and retailers to predict which of their goods will sell and to synchronize production rate with demand rate. A manufacturer might hope to be fast enough to produce in direct response to demand, virtually eliminating the need for a forecast. But in many industries, sales of volatile products tend to occur in a concentrated season, which means that a manufacturer would need an unjustifiably large capacity to be able to make goods in response to actual demand. Using quick response or JIT also may not be feasible if a company is dependent on an unresponsive supplier for key components. ii) Techniques to ensure the co-ordination between the production and demand flows: a) Demand Management:

Demand management concerns forecasting, order entry, order-delivery-date promising, customer order service, physical distribution, and other customer-contact-related activities. Demand management also takes into consideration other sources of demand for manufacturing capacity, including service-part demands, intra-company requirements, and pipeline inventory stocking. The objective of the demand management module is always to bridge the firm and the customer. However, companies differ in the types of uncertainty that affect them. Master production scheduling and demand management can facilitate the buffering against uncertainty. Distribution activities are planned using information developed in the demand management function. Distribution requirements planning (DRP) provides the basis for tying the physical distribution system to the manufacturing planning and control system. Customer delivery promise dates, inventory resupply shipments, interplant shipments, and so on are used to develop short-term transportation schedules. Information used for master production schedules can be integrated with distribution planning and used to plan and control warehouse resupply. A DRP system thus helps management to anticipate future requirements in the field, closely match materials supply to demand, effectively deploy inventory to meet customer service requirements, and rapidly adjust to the variability of the marketplace. DRP is a link between the marketplace, demand management, and master production scheduling. Plans derived from the DRP information and from the resultant shipping requirements are the basis for managing the logistics system. By planning future replenishment needs, DRP establishes the basis for more effective transportation dispatching decisions. These decisions are continually adjusted to reflect current conditions. Long-term plans help to determine the necessary transportation capacity. As actual field demands vary around the forecasts, DRP continually makes adjustments, sending products from the central warehouse to those distribution centers where they are most needed. If insufficient total inventory exists, DRP provides the basis for deciding on allocations. Some policies to manage such a situation are providing stock sufficient to last the same amount of time at each location, favoring the “best” customers, or simply saying accurately when delivery can be expected. Forecasts of demand are an important input to MPC systems. b) Inventory Planning and Control: Basically, inventory allows successive operations to be decoupled and changes in demand to be anticipated. In terms of customer service, maintaining final product inventories ensures the availability of a product at the time the customer requests it and minimizes the customer’s waiting time or the product’s delivery time. The inventory planning function is primarily concerned with the determination of appropriate levels of inventory. These are determined by minimizing the costs associated with inventories. These costs are often conflicting, so optimal inventory levels correspond to a trade-off that is acceptable to management. Inventory-related costs are numerous and need to be accurately defined to enable managers to carry out the inventory planning and control function correctly. A common measure of inventory performance, inventory turnover, relates inventory levels to the product’s sales volume. Inventory turnover is computed as annual sales volume divided by average inventory investment.
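As a minimal numerical illustration of this measure (the figures below are invented for the example only), inventory turnover follows directly from annual sales and the average inventory investment:

```python
# Hypothetical annual figures for one product family (illustrative values only).
annual_sales_volume = 1_200_000.0         # value of goods sold over the year
average_inventory_investment = 300_000.0  # average value of stock held over the same year

# Inventory turnover = annual sales volume / average inventory investment.
inventory_turnover = annual_sales_volume / average_inventory_investment
print(f"Inventory turnover: {inventory_turnover:.1f} turns per year")

# The same figure can be read as roughly the average time a unit of stock spends in inventory.
months_of_cover = 12 / inventory_turnover
print(f"Average inventory cover: {months_of_cover:.1f} months")
```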
High inventory turnover suggests a high rate of return on inventory investment. A common measure of customer service is the fill rate, which is the percentage of units immediately available when requested by customers. The level of customer service can also be measured by the average length of time required to satisfy backorders, or by the percentage of replenishment order cycles in which one or more units are back ordered. To cope with these sources of uncertainty, safety stocks are created. Two criteria are often used to determine safety stocks: the probability of stocking out in any given replenishment order cycle, or the desired level of customer service in satisfying product demand immediately out of inventory (the fill rate). c) Production Planning: The process of production planning provides a plan for how much production will occur in the next time periods, during an interval of time called the “planning horizon” that usually ranges from six months to two years. Production planning also determines expected inventory levels, as well as the work force and other resources necessary to implement the production plans. Production planning is based on an aggregate view of the production facility, the demand for products, and even of time (using monthly time periods, for example). The production plan provides a direct and consistent dialogue between manufacturing and top management, as well as between manufacturing and the other functions. The plan must therefore necessarily be expressed in terms that are meaningful to the firm’s non-manufacturing executives. An important linkage exists between production planning and demand management. This module quantifies every source of demand against manufacturing capacity, such as interplant transfers, international requirements, and service parts. The match between actual and forecast demand is monitored in the demand management module. When actual demand conditions do not correspond to the forecast, the necessity for revising the production plan increases. Resource planning is directly related to production planning, since, in the short term, the available resources represent a set of constraints on production planning. In the longer run, to the extent that production plans call for

more resources than are available, financial considerations are needed. If the production rate is maintained constant and large inventories are used to absorb the demand fluctuations (a “production smoothing” strategy), inventory holding costs and costs due to possible customer dissatisfaction become high. Conversely, if the production rate is continually adjusted to be synchronized with demand rate fluctuations (a “chase” strategy), overtime and work-force adjustment costs arise and facility utilization is poor. There are many ways to change the production rate, including subcontracting, hiring, and overtime, and the specific approaches to be used depend on the available alternatives. The production plan must be disaggregated into specific products and detailed production actions. d) Master Production Scheduling: A master production schedule (MPS) provides the basis for making customer delivery promises, utilizing plant capacity effectively, attaining the firm’s strategic objectives as reflected in the production plan, and resolving trade-offs between manufacturing and marketing. The most important decisions concern how to construct and update the MPS. This involves processing MPS transactions, maintaining MPS records and reports, having a periodic review and update cycle (rolling through time), processing and responding to exception conditions, and measuring MPS effectiveness on a routine basis. On a day-to-day basis, marketing and production are coordinated through the MPS in terms of order promising. This is the activity by which customer order requests receive shipment dates. When customer orders create a backlog and require promise dates that are unacceptable from a marketing viewpoint, trade-off conditions are established for making changes. Three classic types of MPS approaches have been identified depending on the environment: make-to-stock, make-to-order, and assemble-to-order. Each approach depends on the unit used for the MPS: end items, specific customer orders, or some group of end items and product options, respectively. The make-to-stock (MTS) company produces in batches, carrying finished goods inventories for most, if not all, of its end items. The MPS determines how much of and when each end item is to be produced. Firms that make to stock usually produce consumer products as opposed to industrial goods, but many industrial goods, such as supply items, are also made to stock. As the MPS unit, all use end items, but many tend to group these end items into model groupings until the latest possible time in the final assembly schedule. e) Material Requirements Planning (MRP): The output of the MPS process is a master schedule for final assembly/production. The master schedules for all components that feed into the end item need to be determined. These master schedules are usually defined in terms of when the components need to be released to manufacturing or a vendor to satisfy the end-item master schedule. MRP is concerned with both production scheduling and inventory control. It provides a precise scheduling (priorities) system, an efficient and effective materials control system, and a rescheduling mechanism for revising plans as changes occur. It keeps inventory levels at a minimum while assuring that required materials are available when needed. The major objectives of an MRP system are simultaneously to:

• ensure the availability of materials, components, and products for planned production and for customer delivery;

• maintain the lowest possible level of inventory; and

• plan manufacturing activities, delivery schedules, and purchasing activities.
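To make these objectives concrete, the core MRP calculation nets gross requirements against on-hand inventory and scheduled receipts period by period and converts any shortfall into a planned order offset by the lead time (the dependent-demand logic behind the gross requirements is explained in the next paragraph). The following is a minimal single-item sketch with invented figures, ignoring lot sizing and safety stock:

```python
# Minimal single-item MRP netting sketch (hypothetical data, no lot sizing or safety stock).
gross_requirements = [40, 0, 60, 30, 0, 80]   # demand per period from the MPS / parent items
scheduled_receipts = [0, 50, 0, 0, 0, 0]      # orders already released and due to arrive
on_hand = 35                                  # starting inventory
lead_time = 2                                 # periods between order release and receipt

planned_receipts = [0] * len(gross_requirements)
planned_releases = [0] * len(gross_requirements)

for t, gross in enumerate(gross_requirements):
    available = on_hand + scheduled_receipts[t] + planned_receipts[t]
    net = max(0, gross - available)           # net requirement after netting against stock
    if net > 0:
        planned_receipts[t] += net            # plan a receipt to cover the shortfall...
        release_period = max(0, t - lead_time)
        planned_releases[release_period] += net   # ...released lead_time periods earlier
        available += net
    on_hand = available - gross               # carry the remaining stock into the next period

print("planned order receipts :", planned_receipts)
print("planned order releases :", planned_releases)
```

In a full MRP run the same netting would be repeated level by level through the bill of material, so that each parent item's planned order releases become gross requirements for its components.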

In the MRP logic, the concept of dependent demand is fundamental. The demand (gross requirements) for subassemblies or components depends on the net requirements for their use in the end item. To perform correct calculations, bill-of-material, inventory, and scheduled-receipt data are all necessary. With these data, the dependent demand can be exactly calculated. It does not need to be forecasted. On the other hand, independent demand items are subject to demand from outside the firm and their needs have to be forecasted. The three major inputs of an MRP system are the master production schedule, the inventory status records, and the product structure records. The master production schedule outlines the production plan for all end items. The product structure records contain information on all materials, components, or subassemblies required for each end item. f) Capacity Requirements Planning: Capacity planning decisions are usually adopted starting from an overall plan of resources, proceeding to a rough-cut evaluation of a particular MPS’s capacity needs, and moving to a detailed evaluation of capacity requirements based on detailed material plans. Resource planning and rough-cut capacity planning have been briefly described previously. There has to be a correspondence between the capacity required to execute a given material plan and the capacity made available to execute the plan. Without this correspondence, the plan will either be impossible to execute or be executed inefficiently. g) Production Activity Control:

Production activity control (PAC) concerns the execution of detailed material plans. It deals with the planning and release of individual orders to both the factory and suppliers. PAC also takes care, when necessary, of detailed scheduling and control of jobs at the work centers on the shop floor (shop-floor monitoring and control systems, scheduling, and dispatching), and it takes care of purchasing and vendor scheduling (vendor scheduling and follow-up). An effective PAC system can thus ensure the co-ordination of production rate with demand rate. The shop-floor monitoring and control system (SFMCS) communicates priorities between the firm’s planning system and the shop floor, evaluates job progress on the shop floor, and gathers shop performance information for management control. SFMCS functions can be grouped into three areas: shop planning, shop monitoring, and shop controlling. Shop planning deals with the preparation of shop packet information and the determination of a job’s due date/release to the floor. Shop monitoring involves the collection of data on the status of a job as it moves on the shop floor and on work center performance. Shop controlling determines the priorities of open-order jobs and communicates these priorities to the shop floor in order to coordinate production activities. Scheduling can be defined as the allocation of resources over time to perform a certain collection of tasks. Detailed scheduling of the various elements of a production system is crucial to the efficiency and control of operations. Orders have to be released and translated into one or more jobs with associated due dates. The jobs often are processed through the machines in a work center in a given sequence. Queueing may occur when jobs have to wait for processing on machines that are busy; preemption may occur when high-priority jobs arrive at busy machines and have to proceed at once. The scheduling process therefore has to interface with shop floor control. Events that happen on the shop floor have to be taken into account, as they may have a considerable impact on the schedules. iii) More Advanced Approaches for Co-ordination: The increasing complexity of the production environment is basically caused by the high innovation rate in manufacturing processes and by market globalization. As the complexity increases, manufacturing management requires more advanced approaches. In manufacturing systems, artificial intelligence and multimedia tools are applied to different issues such as manufacturing planning and control. These tools aim to support several kinds of problem-solving and decision-making tasks, providing manufacturing systems with intelligence. Intelligent manufacturing systems make use of different technologies such as expert systems, neural networks, and multimedia. Expert systems are suitable tools for manufacturing planning systems because they can support decision making to solve problems that are dynamic and have multiple criteria. These types of problems occur in quality, material shortages, vendor deliveries, forecast errors, and in the timing of new product roll-outs. Expert systems represent an attractive and reasonable set of tools able to provide alternatives for intelligent managerial choice. Expert systems are usually adopted in interactive decision support systems (DSS). Artificial neural networks (ANNs) have been receiving increasing attention as tools for business applications.
ANNs do not seem suitable for administrative functions such as customer order processing or inventory control; they can, however, support master production scheduling when demand forecasts are not complex. 2.5.0 Critical Overview of TQM Literature: Total Quality Management (TQM) and Total Productive Maintenance (TPM) have gained considerable acceptance in the Indian manufacturing industry to take on the challenge of the transition from a protected economy to global competition. These two improvement drives are being adopted and adapted for raising the performance standards of Indian companies to world-class levels. TQM and TPM are considered complementary to each other and are therefore being implemented simultaneously by many companies to achieve synergy (Seth and Tripathi48). A combined application brings out significantly higher improvements than the individual drives. The study is based on data collected through a questionnaire as a research instrument and statistical analysis using Microsoft Excel 2000. Empirical evidence is provided on the comparative contributions of the two drives to improving business performance in the context of the Indian manufacturing industry. It also tries to establish a synergistic effect of TQM and TPM when implemented in tandem. Robust computer-aided simulation and modeling tools help to visualize, analyze and optimize complex production processes with a reasonable amount of time and investment. A review of the literature shows that simulation and modeling have not been extensively applied in just-in-time (JIT) manufacturing environments. Also, there remains a lack of a comprehensive mechanism to identify the most significant JIT drivers for the purpose of system process optimization. A systematic modeling and simulation approach for JIT performance optimization (Sandanayake et al46) tries to close this gap by applying computer-based simulation tools and linear mathematical modeling to identify the impact of selected key JIT parameters on performance in an automotive component-manufacturing environment. Research shows that variables such as inconsistent task distribution, variation in operator

performance, misconception of the total quality management philosophy and lack of set-up time elimination plans disrupt ideal JIT production. In this study, ProModel simulation and modeling software is used to model and simulate different experimental scenarios in order to understand and quantify the impact of selected input key JIT variables on objective functions (i.e. process time and takt time). The outcome is a robust mathematical model that highlights the significance of JIT drivers in manually operated mixed-model assembly lines. Since the early 1980s, when Japanese manufacturing firms in a number of industries (including auto, electronics, and machinery) achieved high levels of international competitiveness, Japanese manufacturing practices, particularly those associated with just-in-time manufacturing (JIT), have attracted considerable attention in North America. The transfer of JIT to the United States is characterized by special production management practices involving inventory and quality control, industrial relations, and supplier-manufacturer relationships. Because so many different aspects of plant operation are involved, the transfer of JIT requires a substantial effort on the part of U.S. manufacturers. Despite this barrier, anecdotal evidence suggests that substantial transfer of Japanese production methods has taken place and that this transfer has had a significant impact on the performance of U.S. manufacturing plants. However, there is little empirical evidence of this process that is based on broad samples of plants and workers from various manufacturing industries (Nakamura et al33). Using a sample of U.S.- and Japanese-owned manufacturing plants in the United States in three different industries, it is shown that the implementation of JIT has improved many of the performance measures for these U.S. manufacturing plants. Researchers and practitioners have acknowledged the need to understand causal relationships among various elements of total quality management (TQM). Modeling TQM as an organizational innovation and using the innovation diffusion perspective from the information systems and organizational innovation literature, Ahire and Ravichandran1 theorize that TQM implementation translates top management’s quality intent into plant-level operational performance through a four-stage process of adoption, adaptation, acceptance, and use. Top management adopts the TQM philosophy in the first stage. This commitment influences the adaptation of the organizational members’ ability and attitude to the new quality management philosophy. In the acceptance stage, the organizational members demonstrate acceptance of the new quality focus through cooperative teamwork, relationships with suppliers, and quality-related learning. The diffusion of the new philosophy is confirmed through the routinization of core quality improvement through effective design, tracking, assurance, and improvement of quality. The four-stage transformation process leads to plant-level measures of product and process quality. They tested this framework on a sample of 407 plants in the automobile parts supplier industry and found good support for the model. Their results suggest that firms should ensure appropriate technical and behavioral preparation of employees and suppliers before and concurrent with actual TQM implementation. Furthermore, the results also suggest that firms should implement TQM in an integrated fashion covering all sociotechnical elements detailed in the framework.
Total Quality Management (TQM) is an integrative management philosophy aimed at continuously improving the quality of products and processes to achieve customer satisfaction. The TQM literature is replete with practitioner-oriented ‘do-everything-right’ articles and case studies. There is no complete agreement on the operating system elements of TQM (Joseph et al22). Empirical research on the development of an instrument for TQM implementation in business units in India identified 150 measures of quality management. After a pre-test, 111 measures were used to develop a questionnaire. These items were empirically tested with data collected from 50 respondents. A factor analysis uncovered ten underlying dimensions of TQM with a total of 106 items. These factors and items were found to be reliable and valid. An analysis of the TQM practices followed by maintenance and repair workshops, which are set up to provide close maintenance and repair support for a wide variety of highly complex and sophisticated equipment in the diverse terrain and weather conditions of India, identified ten primary elements of TQM (Sahu1 et al45). The survey thus focused on ten critical factors (organizational commitment, human resource management, quality policy, role of the quality department, quality information system, operating procedures, technology utilization, supplier focus, service design, and training), giving a comprehensive 92-item instrument for explaining and predicting quality management practices. Their work reveals that Indian repair workshops are gearing up to respond quickly to the growing awareness of high quality amongst equipment users. In order to strengthen management technology strategy, a new management technology principle, New JIT, based on TMS, TDS, TPS and TQM-S, has been developed (Amasaka3). In developing “Global Marketing” that can win the global competition for quality and cost, the key for domestic and foreign companies is to successfully achieve “Global Production”, which enables simultaneous production start-up (the same quality and production at optimal locations) throughout the world. The significance of strategically applying New JIT, through a global production strategy activity called AWD6P/J, for epoch-making innovation of the work environment has been analyzed and verified at Toyota. While many vehicle assembly shops depend on a young, male workforce,

innovation in optimizing an aging workforce is a necessary prerequisite of TPS, a production strategy of New JIT. Elements necessary for enhancing work value, motivation and work energy, including working conditions and work environment (amenities and ergonomics), were investigated through an objective survey and analyzed from labor science perspectives. In the context of TQM, it is essential that organizations identify a few key critical success factors, which should be given special attention to ensure the successful implementation of a TQM program. The concept of critical success factors (CSFs) and their use in supporting planning efforts originated from the approach associated with the development and implementation of management information systems (Wali et al56). Such factors are considered conducive to the success of TQM implementation. Based on an exploratory study of Indian organizations engaged in manufacturing and services, CSFs have been identified. Concurrent engineering (CE) tools are intended to increase the concurrency of multidisciplinary design by integrating various enabling technologies such as computer-aided design, computer-aided manufacturing, group decision support systems, expert systems, and communication networks. If the long-term viability of CE depends on effectively developing and deploying CE tools, the assumptions about how CE design tasks are most successfully performed and the roles of tools in facilitating that work should be carefully reviewed (King and Majchrzak23). Human factors assumptions made by the CE tool development community have been identified and compared to conclusions drawn from the existing literature on the role of technologies in performing technical work. This comparison suggests that the assumptions made by the CE tool development community are likely to inhibit CE tools from successfully enabling the CE process. Recommendations for remedying this state of affairs are offered in the form of restated assumptions that are consistent with documented behaviors of people using similar technologies, together with potential development strategies for CE tool developers. Developments in Indian business have revealed that ISO 9000 has been a big hit with corporate India. Increasing competition, both local and global, is making it difficult for Indian companies to continue in business without focusing on the needs of customers and initiating continuous improvement. Customers in India and abroad demand assurance that the products and services they are paying for will meet their specifications. This growing emphasis on quality in products and services is forcing Indian industries to adopt internationally recognized quality management systems (QMS) such as ISO 9000 (Sedani and Lakhe47). Critical factors for attaining ISO 9000 registration have been identified by an exhaustive survey of Indian SMEs located in central India. The survey also leads to the conclusion that the ISO 9000 registration process is an important step towards achieving Total Quality Management. During the past two decades, Total Quality Management (TQM) programs have been implemented in many organizations. There has been a paucity of systematic empirical research to prescribe what factors are really crucial to the successful implementation of TQM programs (Mohanty and Lakhe32). A study was carried out in an attempt to identify the critical factors for TQM implementation through survey-based research in Indian industry.
Meanings and operational measures of such critical factors are articulated and developed by involving industry managers as the appropriate subjects. The measures are subjected to internal consistency and reliability tests. A factor model is evolved which may facilitate the articulation of global perspectives, the understanding of business imperatives and the undertaking of strategic initiatives to implement TQM programs across different industry sectors. Multiple simultaneous change initiatives are in general more difficult to implement successfully than are individual change initiatives. This is because multiple initiatives, at a minimum, compete for scarce attention, time and resources and often actually conflict in their objectives or in the behavior required of employees. However, CIM and TQM complement each other and are best done together (Clemson and Alasya8). It has been shown that companies that currently lag their competitors in quality and CIM can implement both at the same time as a catch-up strategy. It is important to explore the cultural, managerial, and human resources dimensions of implementing JIT (Meredith et al30). The cultural and human factors might affect implementation strategies. This requires an examination of the characteristics of the changes required for industries to consider JIT. Then specific methods for implementing JIT can be developed. Two models are proposed to integrate statistical process control (SPC), engineering process control (EPC) and Taguchi’s quality engineering (TQE) (Duffuaa et al12). The models employ the concept of Taguchi’s quadratic loss function to determine whether to take an EPC action by comparing the cost of the action and the cost of quality. A case study is used to compare these two models with the model in the literature where SPC and EPC have been integrated. The results have shown that the first model resulted in about a 25% saving and the second model resulted in an even greater saving of about 30% for the case under consideration. To be competitive in the Private Branch Exchange (PBX) business, it is necessary to offer the customer leading-edge, customized products at the lowest cost and with the shortest interval. Based on this business environment, the AT&T Denver Works embarked on an aggressive program to significantly strengthen its manufacturing position by

implementing and integrating a mixture of Material Requirements Planning (MRP) and Just-In-Time/Total Quality Control (JIT/TQC) systems. This task is especially challenging because of the high complexity and high demand volatility of the product. Furthermore, MRP and JIT/TQC systems can be in conflict if applied incorrectly (George and Larry18). An attempt has been made to review the criteria used in selecting production control systems and to discuss some of the analytical techniques used to implement these systems and their hybrids to achieve the "best of both approaches". It is found that though ISO 9000 provides basic guidelines for documenting work procedures and performs a valuable service as a common language of quality, to be fully effective in achieving the firm’s strategic objectives over time, ISO 9000 needs to be integrated with TQM (Liao et al26). The debate about the impact of ISO 9000/1994 on performance has been raging since its inception. While there is general agreement regarding the positive impact of TQM on performance, there has been less agreement among academics about the impact of ISO 9000/1994. Perhaps in response to such debate, the new ISO 9001/2000 has appeared, purporting to be more in line with the TQM philosophy. As of now, how this 2000 version actually affects performance is yet to be explored (Costa et al9). A survey evaluated its impact on company performance with a sample of 713 Spanish industrial companies. The authors also examined whether the 2000 version of ISO is taking companies closer to the implementation of TQM. The study departed from past studies methodologically by considering performance as a formative construct rather than a reflective construct. Based on the mean and covariance structural (MACS) analyses, it was concluded that ISO 9001/2000 certified companies do not perform noticeably better than ISO 9000/1994 or non-certified companies. However, it has been found that ISO 9001/2000 certified companies apply TQM at a higher level than ISO 9000/1994 certified companies, although whether they actually perform better is less clear. TQM and vendor development efforts must precede the launch of major JIT programs (Mahadevan27). The automobile industry in the country has made significant improvements in areas such as multi-skilling of the workforce, setup time reduction and small lot sizes. Hence such factors do not belong to the concern category. However, the current study indicates that the immediate priorities for this sector are TQM, TPM and JIT purchasing. These three factors appear to be of concern across different classification schemes and thus constitute the basic requirements for successful JIT implementation in any firm. Managers need to prioritize TQM programs ahead of TPM. JIT and TQM are considered two sides of the same coin. While JIT provides an organizational framework for the exposure of waste and problems, TQM programs provide an organizational framework to solve these problems. Hence, it is important for managers to understand that neither JIT nor TQM can “stand alone” in the long run. The major thrust of JIT is to identify problems. It surfaces the barriers and constraints that are causing waste; TQC corrects them. Without an effective means of overcoming the barriers and constraints, JIT would simply pile up missed opportunities, and very quickly everyone would become frustrated at the lack of progress. JIT exposes the problems, the ones creating waste, for TQC to attack. It is possible for a company to install TQC without JIT.
JIT, however, ensures that TQC is working on the right priority problems (Goddard19). It has been seen that, together, JIT and TQC are effective tools in attacking waste. Not just in the factory, but all activities within a company are challenged: is it necessary; can we eliminate it; if not, can we simplify it; if we must do it, let us excel at it. Wasteful activities exist in all departments, and all employees should be actively involved in applying JIT/TQC. McKnight29 has described how a particular business unit implemented various JIT techniques inside the factory and promoted these techniques to outside suppliers. The focus is on how the internal manufacturing layout was modified to support a true pull system, along with a discussion of how various supplier delivery methods were integrated into Abbott’s manufacturing process. The quest for competitive advantage through quality excellence in recent years has led US companies to focus more and more on Total Quality Management. The Malcolm Baldrige National Quality Award (MBNQA) criteria, ISO 9000 standards, and the SEI Capability Maturity Model (CMM) have been found to be effective tools for evaluating and improving quality systems. Some companies are using these models to drive their quality initiatives. Although a company can have an excellent approach to TQM, the deployment and management of the system are critical factors for overall success. Fallah15 reviews quality system models, introduces AT&T’s Total Quality Approach (TQA), and describes the TQA implementation strategy. The Just-in-Time (JIT) production system has been implemented with widely varying degrees of success by many North American manufacturing companies. Effective implementation and management of a manufacturing system such as Just-in-Time requires a good conceptual model of the system. Safayeni and Duimering44 have presented a theoretical view of the JIT system, in which the impact of JIT on organizations is considered in terms of an increase in interdependence in the organization as a result of the removal or reduction of inventory buffers in the system. It is further theorized that the impact of increased interdependence amongst organizational functions will ultimately affect the structure of the organization by changing it from a functionally organized structure towards a product

based structure. In particular, the manner in which organizations manage interdependencies, and reduce and cope with variability, at both the shop-floor level and throughout the organizational system, has direct implications for the success of JIT implementation. This theoretical view is illustrated using examples drawn from empirical studies which investigated the relationship between a large automotive manufacturer and its JIT suppliers, the JIT implementation difficulties experienced by a high-tech manufacturer, and the impact of JIT on the structure of organizations. An advanced production management principle, the New Japan Production Model, called Advanced TPS, is proposed to further advance TPS (the Toyota Production System); it involves the systematization of Japanese production management methodology for strategic production. The New Japan Production Model, a new management technology principle proposed and verified in previous studies, was developed by establishing a Global Production Technology and Management Model based on New JIT, utilizing three core technologies (TMS, TDS, and TPS), which relate to hardware systems, and Science TQM, which relates to software systems. Formation of the model through utilization of these core technologies signifies the high linkage of business processes that enables a speedy production cycle by using the “Intelligent Quality Control System, TPS-QAS”, the “Highly Reliable Production System, V-MICS”, the “Renovating Work Environment, TPS-IWQM” and “Bringing up Intelligent Operators, V-IOS”. The effectiveness of the proposed New Japan Production Model was verified by Amasaka4 at Toyota Motor Corporation. In the midst of the rapid advancement of “globalization”, i.e. worldwide quality competition, Japanese manufacturers are striving for the simultaneous achievement of QCD (Quality, Cost and Delivery). Against this background, for the purpose of sharing intellectual information and strengthening cross-department cooperation, the study carried out by Yamaji and Amasaka57 proposes a “New Japan Quality Management Model”, mainly in connection with the Quality Assurance Division and the TQM Promotion Division. Further, the effectiveness of this model is verified at Toyota; as a result, QCD was achieved simultaneously through the collaboration of the QA Division and the TQM Division. In today's climate of renewed interest in Total Quality Management (TQM) there exists the question "So what is new?" Quality in the past has often been an after-the-fact inspection process, whereas today's quality effort is to eliminate quality problems through a work culture that prevents problems through design, planning, and good management practices. Some enterprises are having difficulty achieving the improved quality that the TQM process is touted to provide. Bachert5 argues that there are specific actions that can be taken to improve the situation and discusses the need for an enterprise to begin the development of a TQM process by better understanding its structures, functions, and performance in the context of a total integrated operation. From this knowledge base the enterprise can set objectives, define strategies, and plan an effective application of the TQM process and the use of resources that match the enterprise's strengths and weaknesses. The methodology, techniques, and tools for analyzing, planning, and change management that empower an enterprise to effectively apply TQM have been elaborated.
The intense competition in the current marketplace has forced firms to re-examine their methods of doing business. Economic liberalization and globalization of the economy are becoming a worldwide phenomenon; however, the survival of industry and its economic growth depend on the productivity level. This is particularly true in developing countries like India because of higher population growth, higher interest rates, rising inflation, domestic and international competition, scarcity of raw materials, fiscal deficits, etc. Singh et al49 have carried out an empirical study of the impact of quality management practices such as Just-in-Time, 5S tools, suggestion schemes, workers' participation, quality circles and ISO certification on the performance of SMEs. The performance parameters incorporated in the study were manpower and asset utilization, inventory management, quality aspects, cost aspects, time performance and purchasing procedure. A comprehensive questionnaire was developed and circulated to the different firms and responses were collected for analysis. On the basis of the literature review and the survey of the industry, objectives were identified, and the impact was validated on the basis of correlation analysis.

Dyck et al13 presented a rationale for understanding JIT as an interdependent set of elements which must be properly meshed into a coherent system. They advocated documenting the interaction between the JIT-K (kanban) system and the level of quality by measuring the system characteristics as the level of quality changes. These system characteristics include system output, utilization rates and in-process inventory at work stations and buffers. The study measured the effect of the number of kanbans per stage on the system and provides a strategy for the implementation of JIT allowing for small but continuing quality improvements over time (a toy simulation in this spirit is sketched below).

Total quality management (TQM) has been developed around a number of critical factors. However, TQM is much more than a number of critical factors; it also includes other components, such as tools and techniques for quality improvement. Tarí and Sabater54 carried out an empirical study in order to verify the importance of these tools and techniques for TQM improvement and their effect upon TQM results in 106 ISO-certified firms in Spain.
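To make the kind of kanban experiment described by Dyck et al13 concrete, the following is a minimal, illustrative Python sketch and not their actual model: the number of stages, kanbans per stage, machine availability and defect rate are all assumed values, and the simulation merely shows how throughput and in-process inventory can be tracked as the kanban count and quality level change.

import random


def simulate_line(num_stages=4, kanbans_per_stage=3, defect_rate=0.05,
                  availability=0.9, periods=20_000, seed=42):
    """Toy simulation of a serial kanban-controlled line.

    Each stage may hold at most `kanbans_per_stage` finished units in its
    output buffer (its kanban count).  In every period a stage that has
    input material, output space and an available machine (probability
    `availability`) processes one unit; the unit is scrapped with
    probability `defect_rate` (the quality level).  Returns average
    throughput and average work-in-process per period.
    """
    random.seed(seed)
    buffers = [0] * num_stages        # finished units waiting after each stage
    shipped = 0
    wip_total = 0

    for _ in range(periods):
        # Sweep from the last stage backwards so that buffer space freed
        # downstream in this period is not reused upstream in the same period.
        for s in reversed(range(num_stages)):
            has_input = (s == 0) or buffers[s - 1] > 0   # raw material is unlimited
            has_space = buffers[s] < kanbans_per_stage
            if has_input and has_space and random.random() < availability:
                if s > 0:
                    buffers[s - 1] -= 1                  # pull one unit from upstream
                if random.random() > defect_rate:        # good unit produced
                    buffers[s] += 1                      # scrapped otherwise
        shipped += buffers[-1]                           # everything at line end ships
        buffers[-1] = 0
        wip_total += sum(buffers)

    return shipped / periods, wip_total / periods


if __name__ == "__main__":
    for k in (1, 2, 4):
        for d in (0.01, 0.05, 0.10):
            throughput, wip = simulate_line(kanbans_per_stage=k, defect_rate=d)
            print(f"kanbans/stage={k}  defect rate={d:.2f}  "
                  f"throughput={throughput:.3f}/period  avg WIP={wip:.2f}")

Running the sweep for several kanban counts and defect rates is the kind of factorial experiment the study describes: output, utilization and buffer contents can then be compared as quality improves.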


Developing countries like India find themselves in a highly competitive global business environment. Though on a limited basis, the literature reports some JIT implementation in India. Wakchaure et al55 have reviewed the JIT literature from India to identify JIT practices and implementation issues. The survey results also compared the Indian and Japanese manufacturing philosophies and highlighted the factors responsible for slow implementation in India.

Mann28 provides a review of the views of quality practitioners on TQM at that time and shows the most commonly used quality improvement activities (65 tools and techniques) that had been implemented by TQM companies. The research contributed to the understanding of TQM by presenting the quality activities of which it can be composed, as well as the areas of the organization the quality activities primarily address. The research can assist organizations in determining which quality activities to use to improve performance in specific areas of operation.

Low-volume automotive plants, challenged in their search for leanness in producing quality throughput, are pressured to be more efficient in reducing waste and more effective in achieving higher asset turnover. In essence, these categories of plants need to approach the JIT philosophy insightfully during implementation. Goldman et al20 highlight some cost-effective measures to improve five key areas when optimizing JIT implementation: not overdoing standard operating procedures; knowing what, where and when to trigger kanban; systematically replacing unsuitable parts; empowering line control; and making the lot traveler pull the system.

The TQM literature suggests that hard TQM has a profound impact on organizational performance. However, most empirical studies have examined the impact of each dimension of TQM on performance separately. Rahman and Bullock41 argued that it is more appropriate to investigate the direct impact of soft TQM on hard TQM, and then assess the direct impact of hard TQM on performance. Analysis of 261 Australian manufacturing companies revealed significant positive relationships between soft TQM and hard TQM elements. In addition to its direct effects, soft TQM also has an indirect effect on performance through its effect on hard TQM.

During the examination process for the Deming Prize, an intense emphasis is placed on auditing the systems and processes in place to improve product quality. Specifically, the effective implementation of SPC is a fundamental element of the examination process. SPC implementation included a well-structured and documented SPC plan; a comprehensive SPC training program for engineers, managers and production employees; a team approach; and senior management involvement and reviews. The key elements of the SPC plan are product and process quality metrics, process control, process improvement and capability-index improvement (a simple capability-index calculation is sketched below). Donnell and Singhal10 put forward the details of this plan and the resulting improvements in product quality. The SPC implementation and product quality improvement exceeded the expectations of the JUSE (Union of Japanese Scientists and Engineers) examiners.
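The capability-index element mentioned above can be illustrated with the standard Cp and Cpk formulas; the measurements and specification limits in the sketch below are hypothetical and are not taken from Donnell and Singhal10.

from statistics import mean, stdev


def capability_indices(samples, lsl, usl):
    """Return the process capability indices Cp and Cpk.

    Cp  = (USL - LSL) / (6 * sigma)              -- potential capability
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  -- capability allowing for centring
    where mu and sigma are estimated from the sample measurements.
    """
    mu = mean(samples)
    sigma = stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk


if __name__ == "__main__":
    # Hypothetical shaft diameters (mm) against an assumed 9.95-10.05 mm specification.
    diameters = [10.01, 9.98, 10.02, 10.00, 9.99, 10.03, 10.01, 9.97, 10.00, 10.02]
    cp, cpk = capability_indices(diameters, lsl=9.95, usl=10.05)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # values of about 1.33 or more are commonly targeted

Tracking such indices over time, alongside control charts, is how the kind of SPC plan described above demonstrates continuing improvement in process capability.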
One of the foremost challenges facing manufacturing industry nowadays is the large-scale integration of enterprise systems. Based on an analysis of the processes and operating characteristics of JIT automotive supply logistics, and within a reference model for enterprise logistics systems based on the ISO 9001 standard, the architecture and processes for JIT automobile supply logistics were identified, analyzed and modeled with IDEF9000 by Piao et al35. The model serves not only as a technique for integrating and simulating complex JIT supply logistics systems but also as an approach that allows the transformation of informal knowledge into pragmatic, formalized and structured knowledge that can be spread and shared throughout the organization.

Eker and Pala14 made an empirical investigation into the use of multiple performance measures in manufacturing organizations. Specifically, the relationship between multiple performance measurement systems and competition factors, JIT practices and TQM practices was examined through data collected from 122 manufacturing firms among the Turkish top 500 companies in 2005. The results show a relationship between the use of a multidimensional performance measurement system and these factors, and that firms with a high market position are those using JIT and TQM more than others.

The impact of Just-in-Time (JIT) implementation and ISO 9000 certification (as specified by the original standards of the early 1990s) on the quality management efforts of manufacturing firms has been studied by Dreyfus et al11. Responding firms in the study were grouped into four categories based on their ISO 9000 and JIT orientation: i) firms that are ISO 9000 certified but have not implemented JIT (ISO firms); ii) firms that are ISO 9000 certified and have implemented JIT (ISO-JIT firms); iii) firms that have implemented JIT but are not ISO 9000 certified (JIT firms); and iv) firms that have not implemented JIT and are not ISO 9000 certified (traditional firms). These groups were compared along 13 plant-level total quality management (TQM) implementation elements and five TQM outcome measures using a MANCOVA procedure. The analyses resulted in distinct sets of firms reflecting the impact of the ISO-JIT orientation on TQM implementation and TQM outcomes. The results support the contingency view that a firm's ability to implement effective TQM practices is enhanced: i) marginally by ISO 9000 efforts; ii) significantly by JIT implementation; and iii) most by conjoint ISO-JIT efforts (though not much more significantly than JIT implementation alone).


These insights have significant practical implications for firms investing in JIT implementation, ISO 9000 certification and TQM implementation.

Over the last two decades, many organizations around the world have adopted Total Quality Management (TQM) in some form, and rigorous attempts have been made to identify the critical elements of TQM. These elements can be classified into two broad categories: soft TQM and hard TQM. Empirical studies have indicated that only a handful of the soft TQM dimensions contribute to organizational performance. The elements of soft TQM, such as training and education, loyalty, leadership, teamwork and empowerment, are essentially 'people' aspects. The coverage of such elements in the management literature is high and, in fact, broadly speaking, management theory and soft TQM are nearly identical. With rapid change and uncertainty in the market and greater emphasis on core competencies, organizations are transforming themselves into modular corporations, and thus the importance of the elements of traditional soft TQM is rapidly diminishing. Rahman42 tries to answer a fundamental question: what is the future of TQM?

Just-in-time purchasing is an important approach for shortening supply chain response time efficiently and enabling agile operation of the supply chain. Ideal JIT purchasing means delivery only when demand occurs, in small quantities and at high frequency, with a high level of quality assurance and without unnecessary inspection waste. From the supply chain perspective, the success of JIT purchasing depends on effective coordination and cooperation between the upstream and downstream enterprises in the supply chain system. Reflecting the different powers and strategies of purchaser and supplier in JIT purchasing, a Stackelberg game model and a cooperative model of the purchaser-supplier relationship are discussed by Yan58. The equilibrium solutions of the two models are compared and analyzed, and a coordination method is put forward which realizes an allocation of costs so that cooperation between purchaser and supplier can be achieved. The results can help the purchaser and supplier to determine their corresponding purchasing and supply strategies respectively (a simplified numerical illustration is sketched below).

The study presented by Prajogo and Sohal37 examines the fit of total quality management (TQM) practices in mediating the relationship between organization strategy and organization performance. By examining TQM in relation to organization strategy, the study seeks to advance the understanding of TQM in a broader context. It also resolves some controversies that appear in the literature concerning the relationship between TQM and differentiation and cost-leadership strategies, as well as quality and innovation performance. The empirical data for the study were drawn from a survey of middle and senior managers, and the analysis was conducted using the structural equation modeling (SEM) technique by examining two competing models that represent full and partial mediation. The findings indicate that TQM is positively and significantly related to differentiation strategy, and that it only partially mediates the relationship between differentiation strategy and three performance measures (product quality, product innovation and process innovation). The implication is that TQM needs to be complemented by other resources to realize the strategy more effectively in achieving a high level of performance, particularly innovation.
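The decentralized-versus-cooperative contrast analysed by Yan58 can be made concrete with a deliberately simplified numerical sketch. This is not Yan's model: the cost figures, the supplier-as-leader assumption and the contractual fee ceiling are all illustrative assumptions. The buyer's holding cost falls with delivery frequency, the supplier's delivery cost rises with it, and the supplier acts as the Stackelberg leader by setting a per-delivery fee.

def buyer_cost(n, fee, demand=1200.0, holding=2.0):
    """Buyer's annual cost: cycle-stock holding plus per-delivery fees for n deliveries."""
    return holding * demand / (2 * n) + fee * n


def supplier_cost(n, setup=60.0):
    """Supplier's true annual cost of making n deliveries (setup and transport)."""
    return setup * n


def buyer_best_response(fee, n_values):
    """Delivery frequency the buyer (follower) chooses for a given per-delivery fee."""
    return min(n_values, key=lambda n: buyer_cost(n, fee))


if __name__ == "__main__":
    n_values = range(1, 101)           # feasible deliveries per year
    fees = range(0, 301)               # candidate fees; 300 acts as a contractual ceiling

    # Stackelberg: the supplier (leader) sets the fee anticipating the buyer's
    # best response, so as to maximize its own profit (fee minus true cost).
    fee_star = max(fees, key=lambda f: (f - 60.0) * buyer_best_response(f, n_values))
    n_stack = buyer_best_response(fee_star, n_values)
    system_stack = buyer_cost(n_stack, 0.0) + supplier_cost(n_stack)   # fees are internal transfers

    # Cooperative: choose the delivery frequency that minimizes the joint cost.
    n_coop = min(n_values, key=lambda n: buyer_cost(n, 0.0) + supplier_cost(n))
    system_coop = buyer_cost(n_coop, 0.0) + supplier_cost(n_coop)

    print(f"Stackelberg: fee={fee_star}, deliveries/year={n_stack}, system cost={system_stack:.0f}")
    print(f"Cooperative: deliveries/year={n_coop}, system cost={system_coop:.0f}")
    # The gap between the two system costs is what a coordination scheme
    # (e.g. a side payment reallocating costs) can recover so that both parties gain.

In this toy game the leader pushes the fee to its ceiling and the buyer responds with infrequent deliveries, while joint optimization prefers more frequent, smaller deliveries at a lower total cost; the difference between the two system costs is the surplus a cost-allocation agreement can split so that both parties benefit, which is the essence of the coordination argument.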
Stoll and Zubas52 summarized the implementation of Total Quality Management (TQM) at the Raytheon Electromagnetic Systems Division and included the lessons learned. Particular attention was given to the TQM start-up process, demonstrating the criticality of management commitment and initial planning, and results to date were discussed.

Is total quality management (TQM) a predecessor of enterprise resource planning (ERP) systems implementation? This question draws a lot of interest from business managers. Many firms intend to implement both TQM and ERP systems to meet market competition, and the question is which system should be implemented first. Using US manufacturing companies that have a focus on TQM and have implemented ERP systems, Li et al25 examined the relationships among TQM, ERP implementation, operations performance, customer satisfaction and firm performance, and subsequently provided a better understanding of the synergistic relationship between TQM and ERP implementation. Structural equation modeling was applied to analyze the data from 154 manufacturing companies in the US. They emphasized that TQM is a philosophy that stresses process improvement, whereas an ERP system is an IT mechanism that implements enterprise-wide process management. Their conceptual development and study suggested that ERP implementation can be successful if it is preceded by a TQM focus.

Popular views on reengineering include: that reengineering has supplanted TQM; that reengineering alternates with TQM, allowing for continuous improvement between periods of radical change; and that reengineering has peaked and is on its way out. Fallah and Weinman16 examined these concepts and views, explored the relation of reengineering to TQM, and demonstrated that these views are incorrect or, at best, incomplete.

2.6.0 Conclusions: Based on the various TQM principles outlined in this chapter, the following points can be concluded:


• In today’s business environment, the industry must, in effect, keep “both hands on the wheel” to move forward successfully. Hoshin Kanri provides a planning structure for bringing critical processes up to the desired level of performance, exceeding customers’ expectations and staying competitive through a long-range strategic plan. Relevant detailing has accordingly been provided in this chapter on “Managing Business in a Competitive Business Environment through the Hoshin Process of Long-Range Strategic Planning”.

• Globalization is posing several challenges to the manufacturing sector, and the design and operation of manufacturing systems are of great importance. In this context, companies need to synchronize their manufacturing activities with demand. In view of this, “Achieving Co-ordination between Production Rate and Demand Rate in a Manufacturing System” has been outlined in detail in this chapter (a simple takt-time illustration of this coordination is sketched after this list).

• TQM is a strategic competitive philosophy with significant implications for CIM design and management. The basis of TQM is customer satisfaction, so it is important that all entities within the organization synchronize their operations to achieve this objective. CIM, a major tool for achieving competitive advantage, should be synergized with the TQM philosophy to attain this ultimate goal.

• The Taguchi philosophy is also helpful in the accomplishment of TQM objectives.
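As a small illustration of the coordination point above (the bullet on production rate and demand rate), the standard takt-time calculation matches the production pace to the demand rate. The figures below are assumed purely for illustration and are not drawn from the surveyed studies.

import math


def takt_time(available_minutes, demand_units):
    """Takt time: the pace (minutes per unit) at which units must be completed to match demand."""
    return available_minutes / demand_units


def stations_required(cycle_time_minutes, takt_minutes):
    """Minimum number of parallel stations needed so the bottleneck keeps pace with demand."""
    return math.ceil(cycle_time_minutes / takt_minutes)


if __name__ == "__main__":
    # Assumed figures: 450 productive minutes per shift, demand of 300 units per shift,
    # and a 4.2-minute cycle time at the bottleneck operation.
    takt = takt_time(450, 300)                       # 1.50 minutes per unit
    print(f"Takt time = {takt:.2f} min/unit")
    print(f"Parallel stations needed = {stations_required(4.2, takt)}")

Producing faster than the takt time builds unnecessary inventory, while producing slower misses demand; sizing capacity against takt is therefore one simple, widely used way of coordinating the production rate with the demand rate.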