Problem Of The Month
January 1998 - Six Sigma

Every time you turn around someone is talking about six sigma!      

  • What is six sigma?        
  • Why is it important to know about the subject?        
  • How is it used? 

You can:
       1) Page down for this month's problem statement.
       2) Return to the list of monthly problems by clicking here.
       3) Bypass the background information and go directly to the problem statement
           by clicking here.

Walter A. Shewhart, Ph.D., wrote an important book in 1931 called "Economic Control of Quality of Manufactured Product". The book is available from ASQ, and the republished edition contains a foreword by Dr. W. Edwards Deming.

Shewhart's book established the concepts for statistical process control (SPC). Shewhart argued in the preface that "...the object of industry is to set up economic ways and means of satisfying human wants and in so doing to reduce everything possible to routines requiring a minimum amount of human effort". Shewhart argued for a routine for manufacturing and established a criterion for knowing when you must take action to redirect the process. Shewhart said, "Deviations in the results of a routine process outside such limits indicate that the routine has broken down and will no longer be economical until the cause of trouble is removed." Deming argued (in the dedication section of the republished book) that, as the "...Japanese learned in 1950, productivity moves upward as the quality of process improves."

Shewhart established these issues for control:

  1. Use past experience to predict (within limits) how the process will perform in the future
  2. The future process will perform with some variability in the output
  3. The future process will provide a probability for falling within limits
  4. Special causes of variation (assignable causes of variation) must be identified and weeded out at their roots, and this must occur before the process can reach the state of statistical control
  5. The future process must be driven only by chance causes, which are very difficult to remove because they are large in quantity and small in their individual impact; in the aggregate, the variations are left to chance
  6. Variation from chance causes can be quantified with a statistical term called the standard deviation, symbolized by the Greek letter sigma, which describes dispersion in the results of things measured. For the bell-shaped normal curve, the interval X-bar ± 3*s spans six sigma (6s) and includes 99.73% of the area under the curve, so the probability of an occurrence falling within it is 99.73%. Small standard deviations usually indicate data are closely clustered about the arithmetic mean. Large standard deviations usually indicate data are spread out and widely dispersed.
  7. The measure of central tendency can be established with the arithmetic mean called X-bar.
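The items above can be checked numerically. The following Python sketch (Python is my choice here; the original page used Mathcad and WinSMITH tools) computes X-bar and s from a hypothetical, randomly generated sample and compares the observed fraction inside X-bar ± 3*s against the theoretical 99.73% area under the normal curve:

```python
import math
import random

# Hypothetical sample from a routine process (illustrative data only).
random.seed(1)
data = [random.gauss(100.0, 10.0) for _ in range(10000)]

n = len(data)
x_bar = sum(data) / n                                          # arithmetic mean (X-bar)
s = math.sqrt(sum((x - x_bar) ** 2 for x in data) / (n - 1))   # sample standard deviation (sigma)

# Observed fraction of data falling inside X-bar +/- 3*s
inside = sum(1 for x in data if abs(x - x_bar) <= 3 * s) / n

# Theoretical area under the normal curve within +/- 3 sigma
theory = math.erf(3 / math.sqrt(2))  # ~0.9973

print(f"X-bar = {x_bar:.2f}, s = {s:.2f}")
print(f"observed inside 3s: {inside:.4f}, theory: {theory:.4f}")
```

The observed fraction agrees with Shewhart's 99.73% figure to within sampling noise.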

Shewhart argued the advantages of controlled conditions were:

  1. Reduction in inspection costs
  2. Reduction in cost of rejects
  3. Controlling the quality of components to tight limits controls the quality of the finished unit for minimum variability, and this is an advantage in the marketplace for maximizing the quantity of production
  4. Control of the items produced is especially important for items that must be destroyed during an inspection/test, so that the rest of the batch can be trusted to faithfully reproduce the performance of the tested product
  5. Reduction in tolerance limits for the items produced

Shewhart recognized fluctuations in both the mean (X-bar) and the standard deviation (sigma = s). He argued that both criteria (X-bar and s) must be controlled to set limits on variability in products, and that product limits be set at least at X-bar ± 3*s to achieve a high percentage of successes. Shewhart used the limits of X-bar ± 3*s to set control limits for his charts; when data were acquired outside of this range, he concluded that corrective action was required because the chance of such an event occurring was very small.

Shewhart also addressed how much data should be acquired for the measurements. The variation in the mean is ΔX-bar = Z*(s/n^0.5), where Z reflects how much risk you're willing to take, s is the estimated standard deviation, and n is the number of data points. For example, suppose you will accept a 5% error in the mean (where X-bar = 874.89) with 90% probability of success (a 10% chance of failure) when s = 85.31. For 90% probability, the Z value is 1.645, from the area under the normal curve, and ΔX-bar = 5%*874.89 = 43.74. The calculation is 43.74 = 1.645*85.31/n^0.5, and solving for n gives 10.29, or about 10 samples required.
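The sample-size arithmetic above can be sketched in a few lines of Python (Python is an assumption on my part; the original calculations were done by hand or in Mathcad). It simply rearranges ΔX-bar = Z*(s/n^0.5) to solve for n, using the values from the text:

```python
import math

# Sample-size estimate from the text: how many measurements so the mean
# is known to within a 5% error at 90% confidence (values from above).
z = 1.645            # Z value for 90% probability under the normal curve
s = 85.31            # estimated standard deviation
x_bar = 874.89       # estimated mean
delta = 0.05 * x_bar # allowable error in the mean = 43.74

# Rearranged from delta = z * s / sqrt(n)
n = (z * s / delta) ** 2
print(f"delta = {delta:.2f}, n = {n:.2f}")  # ~43.74 and ~10.29
print(f"samples required: about {round(n)}")
```

Note that rounding 10.29 up to 11 would be the conservative choice; the text rounds to 10.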


The Problem
What does six sigma really mean in today's modern society?

Consider the situation where the mean (X-bar) is 100 and the standard deviation (s) is 10 for the following example.

The normal curve has the following appearance:

This plot was made in WinSMITH Weibull as a probability plot, exported to WinSMITH Visual as a probability density function plot, exported from WinSMITH Visual as a graphics file with an HGL extension, and imported into PowerPoint, where details were added to the graph to show how much area is covered by each standard deviation. If you're interested in making "Board Of Directors" graphics, then you need to read about the HGL graphic conversion details for PowerPoint.

For today's manufacturing, the minimum and maximum tolerance limits are established at ±6*s. Furthermore, the mean (X-bar) is allowed to float above and below the center point of the tolerance range. The reason for float in the mean is that you never really know the location of the mean until after you've produced your product, and thus the mean can never really be located at the true center point of the tolerance zone. This condition results in the figure shown below, which was also produced in WinSMITH Weibull and WinSMITH Visual.

The tolerance band is ±6*s = 12*s = 12*10 = 120 units wide. The probability density function (pdf) curves can shift either right or left by 1.5*s = (1/4)*(120/2) = 15 units, which would put the shifted X-bar at 85 or 115 at the extremes. Of course the objective is to keep the process centered, but such is not always the case.

When the curve is shifted to the left by 1.5*s, the left tail of the curve is cut at a Z-score of (6-1.5) = 4.5, where the area under the tail is 3.397674356*10^(-6), and the right-hand tail that lies outside the tolerance zone at a Z-score of (6+1.5) = 7.5 is 3.186340081*10^(-14). Thus the area under the shifted curve is 0.999996602, and the chance for failure rounds to 0.0000034. These high-precision calculations were made using Mathcad software.
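The same tail areas can be reproduced with standard-library Python instead of Mathcad (a sketch of my own, using the one-sided normal tail Q(z) = 0.5*erfc(z/sqrt(2))):

```python
import math

def upper_tail(z):
    """Area under the standard normal curve beyond z (one-sided tail)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Process mean shifted 1.5 sigma toward the lower tolerance limit.
left_tail = upper_tail(6.0 - 1.5)    # tail cut off at Z = 4.5
right_tail = upper_tail(6.0 + 1.5)   # tail cut off at Z = 7.5

fallout = left_tail + right_tail
print(f"left tail  = {left_tail:.9e}")               # ~3.398e-06
print(f"right tail = {right_tail:.9e}")              # ~3.19e-14
print(f"defects per million = {fallout * 1e6:.2f}")  # the famous 3.4
```

The right-hand tail is eight orders of magnitude smaller than the left, so the 3.4-per-million figure is set almost entirely by the 4.5-sigma tail.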

Why is it important to know about the subject of six sigma?

The value of 0.0000034 says you should expect about 3.4 failures out of a million parts, as advertised by the Six Sigma Academy, which has been active with training at ASQ, General Electric, Motorola, etc. The reason you should know about this subject is to modernize your processes to achieve this small number of failures---the motivation is simply money. Money you can make, rather than money you can lose.

How is six sigma used?

Suppose you have a physical tolerance for manufacturing a hole. The size of the hole is 1-inch and the tolerance for the hole is ±0.001-inches. Then you need a process where 12*s = 2*0.001 = 0.002. Thus the process must be capable of a standard deviation of 0.000167-inches or less. The mean hole size can float within ±1.5*s = ±0.00025-inches, which is the same as saying the average hole size can range from 0.99975-inches up to 1.00025-inches. When no special causes exist, we would expect this process to produce 3.4 out-of-tolerance parts in each million parts produced.
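A minimal Python sketch of the hole-tolerance arithmetic (the variable names are my own; the numbers come from the example above):

```python
# Six-sigma capability for a 1-inch hole with a +/-0.001-inch tolerance.
nominal = 1.0
tol = 0.001

band = 2 * tol                 # total tolerance band = 0.002 inches
s_required = band / 12         # the band must span 12 sigma
mean_float = 1.5 * s_required  # allowed drift of the mean from nominal

print(f"required sigma <= {s_required:.6f} inches")  # 0.000167
print(f"mean may float within [{nominal - mean_float:.5f}, "
      f"{nominal + mean_float:.5f}] inches")
```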

The standard deviation is an absolute measure of dispersion that expresses variation in the same units as the original data. You cannot really understand dispersion in a set of data until you know:

  1. The standard deviation
  2. The mean
  3. How the standard deviation compares to the mean

The coefficient of variation provides a relative measure of dispersion compared to the mean, and the unit of measure is %. The equation for the coefficient of variation is s/X-bar. The coefficient of variation for the hole example above and its tolerance is 0.000167/1 = 0.0167%, which is difficult to achieve in many applications.

The relative amount of scatter defined by the coefficient of variation is an important concept. For example, if s=10 and X-bar= 10,000 then the amount of scatter is insignificant and the coefficient of variation shows 10/10000=0.001 or 0.1%, whereas if s=10 and X-bar=20, then the amount of variation is huge and the coefficient of variation shows 10/20=0.5 or 50%.
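The two contrasting cases above can be written as a one-line Python function (a sketch of my own for illustration):

```python
def coefficient_of_variation(s, x_bar):
    """Relative dispersion: standard deviation as a fraction of the mean."""
    return s / x_bar

# The two contrasting cases from the text.
cv_small = coefficient_of_variation(10, 10_000)  # insignificant scatter
cv_large = coefficient_of_variation(10, 20)      # huge scatter

print(f"{cv_small:.1%}  {cv_large:.1%}")  # prints: 0.1%  50.0%
```

The same absolute scatter (s = 10) is negligible against a mean of 10,000 but dominant against a mean of 20, which is exactly why the relative measure matters.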

The coefficient of variation will be important in the Problem Of The Month for February '98.

Other pages you may want to visit concerning similar issues are:

·         Production Output/Problems

·         Coefficient of Variation

·         Nameplate Capacity

·         Production Reliability Example With Nameplate Ratings

·         Process Reliability Plots With Flat Line Slopes

·         Process Reliability Line Segments

·         Papers On Process Reliability As PDF Files For No-charge Downloads
- New Reliability Tool for the Millennium: Weibull Analysis of Production Data
- Process Reliability and Six-Sigma
- Process Reliability Concepts



Refer to the caveats on the Problem Of The Month Page about the limitations of the following solution. Maybe you have a better idea on how to solve the problem. Maybe you'll find where I've screwed up the solution and can point out my errors as you check my calculations. E-mail your comments, criticism, and corrections to: Paul Barringer by clicking here.

Thanks to Christer Olsen of Scania Trucks in Sweden for finding a typo and error in the Mathcad calculation. (6/4/99):

First, the incorrect sentence: When the curve is shifted to the left by 1.5*s, then the left tail of the curve is cut at the Z-score of (6-1.5) = 4.5 where the area under the tail is 2.397674356*10^(-6) and the right hand tail that lies outside the tolerance zone at Z-score of (6+1.5) = 7.5 is 3.190891673*10^(-14).

Now the correct sentence should read: When the curve is shifted to the left by 1.5*s, then the left tail of the curve is cut at the Z-score of (6-1.5) = 4.5 where the area under the tail is 3.397674356*10^(-6) and the right hand tail that lies outside the tolerance zone at Z-score of (6+1.5) = 7.5 is 3.186340081*10^(-14).

I solved the Mathcad problem by using integration with limits, and Christer used the Mathcad built-in function "cnorm(z)" to get a better answer.

Technical tools are only interesting toys for engineers until the results are converted into a business solution involving money and time. Complete your analysis with a bottom line stated in dollars and time so you have answers that will interest your management team!

Last revised 4/20/2004
© Barringer & Associates, Inc. 1999
