
Six Sigma and Simulation

April 4, 2014 By: James Kosloski

Ask a structural engineer "What is Six Sigma?" and he may reply: “σx, σy, σz, σxy, σyz, σxz.” Hopefully, though, we know that the sigma in a Six Sigma analysis has nothing to do with stress; it is the standard deviation. In practice, it means that a design must meet all of its requirements 99.99966% of the time.

In a manufacturing sense this is fairly straightforward: the manufacturing process must be controlled so that fewer than 3.4 defective units are produced out of every 1 million. Machining tolerances must be defined so that they can be met 99.99966% of the time. Scrapping a product is expensive; therefore manufacturers want to set tolerances loose enough that their processes can meet this requirement.
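The 3.4-per-million figure can be checked directly from the normal distribution. By convention it assumes a long-term 1.5σ drift of the process mean, so the defect rate of a "six sigma" process is the one-sided tail beyond 6 − 1.5 = 4.5 standard deviations. A quick check using only the Python standard library:

```python
import math

def tail_probability(z: float) -> float:
    """One-sided upper-tail probability of a standard normal beyond z."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Six sigma with the conventional 1.5-sigma long-term shift:
defects_per_million = tail_probability(6.0 - 1.5) * 1e6
print(f"{defects_per_million:.1f} defects per million")   # about 3.4

yield_pct = 100.0 * (1.0 - tail_probability(4.5))
print(f"{yield_pct:.5f}% yield")                          # about 99.99966%
```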

This is where simulation comes into play. We cannot simply open up tolerances on dimensions to suit manufacturing capabilities without understanding what impact those changes will have on the design. On the other hand, we don’t want to set tolerances so tight that they cannot possibly be met on the manufacturing side.

The issue, of course, is that we tend to build our models using nominal dimensions, average material properties, and average loads. How do we assess whether variation in several of these parameters will still result in an acceptable design? One’s first reaction might be to say, “I will just model the worst case, and if that meets my criteria then I am all set.” There are two flaws in that logic:

1. Determining the worst case is not necessarily simple.  Does worst case mean modeling the smallest dimension within the specified tolerance, or the largest?  If we are modeling a pressure vessel wall, then the smallest thickness would likely be the worst case; in an interference fit, the largest dimension may be the worst case.  Understanding how different design quantities interact is critical in assessing a design across the full range of potential variation.

2. Even if you could determine the worst-case value for every possible variable in the model, this would not be the right thing to do.  The chance that every variable winds up being manufactured at its worst possible extreme simultaneously is extremely small, much less than the 3.4 out of 1 million that a six sigma process requires.  So why design for a situation that will never occur in reality?
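The second point is easy to quantify. If each of n independent inputs sits at its one-sided 3σ extreme with probability of roughly 0.00135, the chance of all of them doing so at once collapses geometrically (the numbers below are purely illustrative):

```python
import math

# One-sided probability of a normally distributed input exceeding 3 sigma:
p_single = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # about 0.00135

# Joint probability that n independent inputs are all at that extreme at once:
for n in (1, 3, 5):
    print(f"{n} inputs at worst case simultaneously: {p_single ** n:.2e}")
```

With just five independent inputs the joint probability is on the order of 10⁻¹⁵, many orders of magnitude below the 3.4 × 10⁻⁶ that six sigma demands.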

This is where probabilistic analysis comes into play.  If we know the statistical distribution of the variation in all of our input parameters, then we can run a series of analyses, a design of experiments (DOE), which will determine the probability that a design will not meet the requirements.
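As a minimal sketch of the idea, the Monte Carlo loop below estimates a failure probability for a hypothetical thin-walled pressure vessel, where hoop stress is p·r/t and pressure and wall thickness are treated as normally distributed inputs (a DOE samples the input space more systematically, but the principle is the same; all numbers here are invented for illustration):

```python
import random

random.seed(0)

ALLOWABLE = 180.0   # MPa, hypothetical allowable hoop stress
r = 500.0           # mm, nominal radius (held fixed here)
N = 100_000

failures = 0
for _ in range(N):
    p = random.gauss(2.0, 0.05)   # MPa: nominal pressure with load scatter
    t = random.gauss(6.0, 0.10)   # mm: nominal thickness with tolerance scatter
    hoop = p * r / t              # thin-wall hoop stress
    if hoop > ALLOWABLE:
        failures += 1

print(f"estimated probability of exceeding allowable: {failures / N:.4%}")
```

Note that the nominal design (2.0 · 500 / 6.0 ≈ 167 MPa) passes comfortably, yet a small but non-negligible fraction of the sampled builds does not; that fraction is exactly what a six sigma assessment needs to bound.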

Of course most engineers’ reaction to this is: “That is too difficult to set up” or “I don’t have time to run all the analyses required for a design of experiments”.  With modern finite element software and computing capabilities, this is no longer really an issue.

ANSYS Workbench, for example, allows one to make inputs parametric with a simple click of a check box.  A statistical distribution can be defined for each input and a design of experiments can be run to examine how different combinations of the variations in the input affect the design.  Once that is done, the probability that a design falls outside of the requirements can be determined.
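To make the DOE itself concrete, here is a sketch of building a full-factorial design table by hand; the parameter names and levels are hypothetical, and in a tool like ANSYS Workbench this table is generated for you once inputs are flagged as parametric:

```python
from itertools import product

# Hypothetical parametric inputs, each with three levels spanning its variation:
levels = {
    "wall_thickness_mm":  [5.9, 6.0, 6.1],    # tolerance band
    "pressure_MPa":       [1.9, 2.0, 2.1],    # load variation
    "youngs_modulus_GPa": [195, 200, 205],    # material scatter
}

# Every combination of levels is one design point (one solver run):
design_points = [dict(zip(levels, combo)) for combo in product(*levels.values())]

print(len(design_points), "runs")   # 3 * 3 * 3 = 27
print(design_points[0])
```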

Since combinations of inputs required to define the DOE are known up front, with enough resources, all of the analyses could be run simultaneously.  ANSYS provides licensing solutions that allow users to multiply their existing licenses so that many analyses can be run simultaneously in a cost-effective manner.  There are also several cloud computing solutions that provide hardware to support this effort.
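Because every design point is known up front and independent of the others, the runs can be dispatched concurrently. A toy sketch with the standard library, where `solve()` is a stand-in for launching an actual FE analysis:

```python
from concurrent.futures import ThreadPoolExecutor

def solve(thickness_mm: float) -> float:
    """Placeholder for a real solver run: hoop stress p*r/t, p=2 MPa, r=500 mm."""
    return 2.0 * 500.0 / thickness_mm

# Nine hypothetical thickness design points across the tolerance band:
points = [5.8 + 0.05 * i for i in range(9)]

with ThreadPoolExecutor(max_workers=4) as pool:
    stresses = list(pool.map(solve, points))   # runs dispatched concurrently

print(f"max stress over the DOE: {max(stresses):.1f} MPa")
```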

Therefore, with the right combination of hardware and software, it is possible to determine whether a design meets six sigma requirements in about the same time, and with about the same effort, as a single analysis at nominal dimensions. The resulting benefits in product quality and reduced manufacturing cost can be significant for any organization. I welcome observations from readers who have implemented this technique, and their experiences with it.