Tell people that you have a Six Sigma certification and you’re likely to draw a few blank stares. Your friends and family may be in the know, but without a background in mathematics, most people are unlikely to know the methodology or why it is named after a statistical term.
Thinking through how to describe Six Sigma to the uninitiated not only helps you explain what you do; it also deepens your own understanding of exactly how the methodology measures and improves process performance.
Why Call It Six Sigma?
Six Sigma is a statistical term used to measure the number of defects that processes create. The term implies high-quality performance because a process performing at a Six Sigma level allows only 3.4 defects per one million opportunities.
The higher the sigma level, the better the quality of the product or service and the fewer the defects. Organizations performing at Six Sigma quality have an advantage over those performing at three, four or even five sigma levels.
To make Six Sigma performance less theoretical, consider a real-life example.
Suppose that a bank’s new accounts department processes 360,200 applications per year (about 1,000 every day), and that there are nine different ways for each application to be processed incorrectly. Different sigma levels of quality would lead to the following numbers of defects.
Three Sigma quality – This level of performance produces a defect-free product 93.32% of the time. At this level, 770 applications would be processed incorrectly and require rework every day.
Four Sigma quality – This level of performance yields a defect-free product 99.379% of the time. With Four Sigma quality, 73 applications would need to be corrected every day.
Five Sigma quality – Five Sigma performance produces defect-free products and services 99.977% of the time. Every week the bank would need to correct 13 application errors.
Six Sigma quality – Six Sigma performance produces a defect-free product 99.99966% of the time, allowing only 3.4 errors per one million opportunities. Just 10 applications would need to be corrected during the entire year.
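The yields quoted above come from the standard sigma conversion table, which assumes the conventional 1.5-sigma long-term shift: the defect rate at a given sigma level is the upper tail of the normal distribution at (sigma − 1.5). A minimal sketch of that conversion in Python, using only the standard library (the helper name is my own):

```python
from statistics import NormalDist

def sigma_to_dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities for a given sigma level,
    assuming the conventional 1.5-sigma long-term shift."""
    # Probability that a single opportunity is defective.
    defect_rate = 1 - NormalDist().cdf(sigma_level - shift)
    return defect_rate * 1_000_000

for level in (3, 4, 5, 6):
    dpmo = sigma_to_dpmo(level)
    yield_pct = 100 * (1 - dpmo / 1_000_000)
    print(f"{level} sigma: {dpmo:>10,.1f} DPMO, {yield_pct:.5f}% defect-free")
```

Running this reproduces the familiar table values: roughly 66,807 DPMO at three sigma, 6,210 at four, 233 at five, and 3.4 at six.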
Four Sigma and Six Sigma levels of performance both have error-free rates above 99%. However, the large volume of applications in this example makes all the difference: with numbers this big, the Four Sigma process makes 18,710 more errors per year than the Six Sigma process.
How to Calculate Your Process’s Baseline Sigma
A key factor in determining a process’s sigma level is the defects per million opportunities (DPMO). Six Sigma professionals can measure a process’s DPMO and gauge its level of performance using the following information:
- The number of units the process produces
- The number of defect opportunities per unit
- The total number of defects
Once you have this information, calculate the defects per opportunity (DPO) by dividing the total number of defects by the total number of opportunities (the number of units multiplied by the number of defect opportunities per unit). Then multiply DPO by 1,000,000 to determine DPMO.
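The calculation above takes only a few lines. A short sketch, applied to the bank example's volumes with a hypothetical defect count (the 21,000 figure is invented purely for illustration):

```python
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    # DPO: total defects divided by total opportunities (units x opportunities each).
    dpo = defects / (units * opportunities_per_unit)
    return dpo * 1_000_000

# Bank example: 360,200 applications, 9 defect opportunities per application.
# The defect count of 21,000 is hypothetical.
print(dpmo(defects=21_000, units=360_200, opportunities_per_unit=9))
```

This works out to roughly 6,478 DPMO, which would place the process just below Four Sigma quality on the conversion table.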
The defects per million opportunities figure can then be plugged into a conversion chart or an Excel spreadsheet to determine the process’s sigma level of quality.
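The chart lookup can also be done in code by inverting the normal distribution, again assuming the conventional 1.5-sigma shift baked into published conversion charts (the function name is my own):

```python
from statistics import NormalDist

def dpmo_to_sigma(dpmo: float, shift: float = 1.5) -> float:
    """Short-term sigma level for a given DPMO, with the 1.5-sigma shift."""
    defect_rate = dpmo / 1_000_000
    # Invert the normal CDF to find the z-value matching this defect rate,
    # then add the shift back to report the short-term sigma level.
    return NormalDist().inv_cdf(1 - defect_rate) + shift

print(round(dpmo_to_sigma(3.4), 1))     # Six Sigma territory
print(round(dpmo_to_sigma(66_807), 1))  # Three Sigma
```

A DPMO of 3.4 maps back to a sigma level of 6.0, and 66,807 maps back to 3.0, matching the table used in the bank example.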
Calculating the process’s baseline sigma is the first step toward understanding how well it’s performing, and how much work will be required to achieve Six Sigma quality.