Taguchi methods

Taguchi methods are statistical methods developed by Genichi Taguchi to improve the quality of manufactured goods and, more recently, also applied to biotechnology (Rao et al. 2008; Rao et al. 2004), marketing and advertising. Taguchi methods are considered controversial among some traditional Western statisticians, but others accept many of his concepts as useful additions to the body of knowledge.

Taguchi's principal contributions to statistics are:
#Taguchi loss function;
#The philosophy of "off-line quality control"; and
#Innovations in the design of experiments.

Contributions

Loss functions

Taguchi's reaction to the classical design of experiments methodology of R. A. Fisher was that it was perfectly adapted for seeking to improve the mean outcome of a process. As Fisher's work had been largely motivated by programmes to increase agricultural production, this was hardly surprising. However, Taguchi realised that in much industrial production, there is a need to produce an outcome "on target", for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider "cost to society". Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers. In the wider economy the Coase Theorem predicts that they prevent markets from operating efficiently. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as where we "deny that losses exist". As we diverge from nominal, losses grow until the point where "losses are too great to deny" and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, "unknown and unknowable", but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

#Larger the better (for example, agricultural yield);
#Smaller the better (for example, carbon dioxide emissions); and
#On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function on the following grounds:

*It is the first symmetric term in the Taylor series expansion of any reasonable, real-life loss function, and so is a "first-order" approximation;
*Total loss is measured by the variance. As variance is additive, it is an attractive model of cost; and
*There was an established body of statistical theory around the use of the least-squares principle.

The squared-error loss function had also been used by John von Neumann and Oskar Morgenstern in the 1930s.
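In symbols, with y the measured value, m the target and k a cost coefficient (fixed, for example, by the loss incurred at a specification limit), the quadratic loss and the two arguments above read as follows; this is the standard textbook form of the Taguchi loss function:

    % Quadratic loss about the target m
    L(y) = k\,(y - m)^2

    % Any smooth loss with a zero minimum at m has L(m) = 0 and L'(m) = 0,
    % so the quadratic term is the first to survive the Taylor expansion:
    L(y) = L(m) + L'(m)(y - m) + \tfrac{1}{2}L''(m)(y - m)^2 + \cdots
         \approx \tfrac{1}{2}L''(m)\,(y - m)^2

    % Expected loss over a production run: variance plus squared bias, the
    % sense in which total loss is measured by the variance once the mean
    % is on target.
    \operatorname{E}[L(Y)] = k\left(\sigma^2 + (\mu - m)^2\right)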

Though much of this thinking is endorsed by statisticians and economists in general, Taguchi extended the argument to insist that industrial experiments seek to maximise an appropriate "signal-to-noise ratio", representing the magnitude of the mean of a process compared to its variation. Most statisticians believe Taguchi's "signal-to-noise ratios" to be effective over too narrow a range of applications, and they are generally deprecated.
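As an illustration only, the three standard S/N formulas (expressed in decibels, one per situation listed above) can be computed as in the following Python sketch; the function names are ours:

    import math

    def sn_larger_is_better(y):
        # Larger-the-better: -10 log10( mean of 1/y^2 )
        return -10 * math.log10(sum(1 / v**2 for v in y) / len(y))

    def sn_smaller_is_better(y):
        # Smaller-the-better: -10 log10( mean of y^2 )
        return -10 * math.log10(sum(v**2 for v in y) / len(y))

    def sn_nominal_is_best(y):
        # On-target, minimum-variation: 10 log10( ybar^2 / s^2 )
        n = len(y)
        ybar = sum(y) / n
        s2 = sum((v - ybar) ** 2 for v in y) / (n - 1)
        return 10 * math.log10(ybar**2 / s2)

    # Replicated measurements of a hole diameter (mm) machined to a 10 mm target:
    print(round(sn_nominal_is_best([10.02, 9.98, 10.05, 9.96]), 1))  # 47.9 dB

In each case a larger ratio is better, so maximising the S/N ratio folds the mean and the variation into a single figure of merit.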

Off-line quality control

Taguchi realised that the best opportunity to eliminate variation is during the design of a product and its manufacturing process (Taguchi's rule for manufacturing). Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

#System design;
#Parameter design; and
#Tolerance design.

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimise the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Tolerance design

With a successfully completed "parameter design", and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions (see Pareto principle).

Design of experiments

Taguchi developed much of his thinking in isolation from the school of R. A. Fisher, only coming into direct contact in 1954. His framework for design of experiments is idiosyncratic and often flawed, but contains much that is of enormous value. He made a number of innovations.

Outer arrays

Unlike the design of experiments work of R. A. Fisher, Taguchi sought to understand the influence that parameters had on variation, not just on the mean. He contended, as had W. Edwards Deming in his discussion of analytic studies, that conventional sampling is inadequate here as there is no way of obtaining a random sample of future conditions. In R. A. Fisher's work, variation between experimental replications is a nuisance that the experimenter would like to eliminate whereas, in Taguchi's thinking, it is a central object of investigation.

Taguchi's innovation was to replicate each experiment by means of an outer array, possibly an orthogonal array that seeks deliberately to emulate the sources of variation that a product would encounter in reality. This is an example of judgement sampling. Though statisticians following in the Shewhart-Deming tradition have embraced outer arrays, many academics are still sceptical (see Montgomery 1991).
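A minimal sketch of the crossed-array idea in Python: each control-factor setting in a (standard) L4 inner array is replicated across every condition in the outer array, and the settings are then ranked by the nominal-is-best S/N ratio. The response function here is invented purely for illustration:

    import math

    # Inner array: the standard L4(2^3) orthogonal array for three
    # two-level control factors A, B, C (coded -1/+1).
    inner = [(-1, -1, +1), (-1, +1, -1), (+1, -1, -1), (+1, +1, +1)]

    # Outer array: noise conditions chosen deliberately to emulate
    # real-world variation (judgement sampling, not random sampling).
    outer = [-1.0, 0.0, +1.0]

    def response(a, b, c, noise):
        # Hypothetical process: factor B controls sensitivity to noise.
        return 10.0 + 0.5 * a + (1.5 - 1.2 * b) * noise + 0.3 * c

    def sn_nominal_is_best(y):
        ybar = sum(y) / len(y)
        s2 = sum((v - ybar) ** 2 for v in y) / (len(y) - 1)
        return 10 * math.log10(ybar**2 / s2)

    # Replicate each inner-array run over the whole outer array and rank
    # the control settings by S/N; the B = +1 runs come out most robust.
    for a, b, c in inner:
        ys = [response(a, b, c, n) for n in outer]
        print((a, b, c), round(sn_nominal_is_best(ys), 1))

Because every control setting faces the same deliberately chosen noise conditions, the variation seen within each row is attributable to noise rather than to chance replication.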

Later innovations in outer arrays resulted in "compounded noise": combining several noise factors into just two outer-array conditions, one in which every noise factor is set in the direction that drives the output lower, and one in which every factor drives it higher. This still emulates the extremes of noise variation but requires far fewer test samples.
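In the sketch above this would mean, hypothetically, replacing a full factorial over the noise factors with two compound settings (the factor names and the identifiers N_minus/N_plus are ours):

    # Compound the noise factors into two extreme outer-array conditions.
    # Three two-level noise factors then need 2 runs per control setting
    # instead of 2**3 = 8.
    N_minus = {"temperature": -1, "humidity": -1, "tool_wear": -1}  # output low
    N_plus  = {"temperature": +1, "humidity": +1, "tool_wear": +1}  # output high
    compounded_outer = [N_minus, N_plus]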

Management of interactions

Many of the orthogonal arrays that Taguchi has advocated are saturated, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for "control factors" or factors in the "inner array". By combining an inner array of control factors with an outer array of "noise factors", Taguchi's approach provides full information on control-by-noise interactions. His concept is that those are the interactions of most interest in achieving a design that is robust to noise factor variation. In this sense, the Taguchi approach provides more complete interaction information than typical fractional factorial experiments.

*Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics and by transforming the data. That notwithstanding, a confirmation experiment offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the likelihood of control factor-by-control factor interactions is greatly reduced, since energy is additive.

*Western statisticians argue that interactions are part of the real world and that Taguchi's arrays have complicated alias structures that leave interactions difficult to disentangle. George Box and others have argued that a more effective and efficient approach is to use sequential assembly.
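The aliasing complaint is visible even in the smallest saturated design: in the L4(2^3) array used earlier, the column for factor C is identically the elementwise product of columns A and B, so C's main effect cannot be separated from the A-by-B interaction. A two-line check in Python:

    # In the saturated L4 array, column C equals A*B row by row, so the
    # estimate of C's main effect is confounded with the A-by-B interaction.
    L4 = [(-1, -1, +1), (-1, +1, -1), (+1, -1, -1), (+1, +1, +1)]
    assert all(c == a * b for a, b, c in L4)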

Analysis of experiments

Taguchi introduced many methods for analysing experimental results including novel applications of the analysis of variance and "minute analysis". Little of this work has been validated by Western statisticians.

Assessment

Genichi Taguchi has made seminal and valuable methodological innovations in statistics and engineering, within the Shewhart-Deming tradition. His emphasis on "loss to society", techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been massively influential in improving manufactured quality worldwide. Much of his work was carried out in isolation from the mainstream of Western statistics and, while this may have facilitated his creativity, much of the technical detail of "Taguchi methods" and their benefits to experimentation and research is only now being studied in the West.

Bibliography

*Bagchi, T P & Kumar, M (1992) Multiple criteria robust design of electronic devices, "Journal of Electronic Manufacturing" vol 3, pp. 31–38
*León, R V; Shoemaker, A C & Kacker, R N (1987) Performance measures independent of adjustment: an explanation and extension of Taguchi's signal-to-noise ratios (with discussion), "Technometrics" vol 29, pp. 253–285
*Moen, R D; Nolan, T W & Provost, L P (1991) "Improving Quality Through Planned Experimentation" ISBN 0-07-042673-2
*Montgomery, D C (1991) "Design and Analysis of Experiments" John Wiley & Sons
*Nair, V N (ed.) (1992) Taguchi's parameter design: a panel discussion, "Technometrics" vol 34, pp. 127–161
*Rao, R S; Kumar, C G; Prakasham, R S & Hobbs, P J (2008) The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal, "Biotechnology Journal" vol 3, pp. 510–523
*Rao, R S; Prakasham, R S; Prasad, K K; Rajesham, S; Sarma, P N & Rao, L V (2004) Xylitol production by Candida sp.: parameter optimization using Taguchi approach, "Process Biochemistry" vol 39, pp. 951–956

See also

*Quality management
*Six sigma
*Tolerance (engineering)
