
Engineering Advantage

Is Your FEA Process Verified & Validated?

December 12, 2014 By: Nick Veikos

I have recently written a few articles for R&D Magazine and Design News about how best to integrate simulation into a product development process. A critical piece of this integration is verification and validation, sometimes referred to in quality circles as “V&V”. Verification and Validation form the backbone of any quality assurance plan related to simulation. Without proper attention to these items, it is unlikely that a simulation plan will succeed, and there is a 50-50 chance that simulation will do more harm than good by leading the design process in the wrong direction.

Yes, I said it; simulation can actually lead you astray – if you don’t have the right checks and balances in place. Just so we are on the same page, let’s first clarify what is meant by V&V. I will use structural analysis using FEA procedures as a specific example because it is what I am most familiar with. The overall concepts are easily extended to CFD analysis or other analysis types.

Verification is the process by which we check that the FEA was conducted properly and Validation is the process to check whether the (hopefully verified) results reflect reality. I came across the following definition a long time ago, which helps me clarify the difference: Verification is how we see if we have solved the problem correctly and Validation is how we see if we solved the correct problem.


Two things that we know about finite element analysis are that it is approximate and it is not robust. Small errors in modeling, data input, and boundary conditions can lead to very large errors in the results. Even worse, they can cause relatively small errors which are difficult to identify, but which have significant impact on performance and service life.

For example, forgetting to assign nodal temperatures in a structural model may only affect the stresses by 10% - not enough to raise a flag when comparing FEA results to a hand calculation, but more than enough to change fatigue life by a factor of two. Yes, I am guilty - this happened to me many years ago as a “rookie” analyst. Luckily for me, and a lot of helicopter pilots, the error was found during a subsequent analysis audit.
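The stress-to-life sensitivity behind that story can be sketched with Basquin's relation, where fatigue life scales as stress raised to a negative exponent. The exponent value below is hypothetical, chosen purely for illustration (real values are material-specific), but it shows how a modest stress error compounds into a large life error:

```python
# Sketch: sensitivity of predicted fatigue life to a small stress error,
# using Basquin's relation N ~ S**(-b). The exponent b = 7.3 is a
# hypothetical value for illustration; real exponents are material-specific.
b = 7.3

def life_ratio(stress_error_factor, b=b):
    """Ratio of life predicted at the erroneous stress to life at the true stress."""
    return stress_error_factor ** (-b)

# A stress understated by 10% (factor 0.9) overpredicts life by roughly 2x:
print(round(life_ratio(0.9), 2))  # → 2.16
```

This is why a 10% stress error that looks harmless against a hand calculation can still halve (or double) a fatigue life estimate.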

Many years later, I was asked to perform an independent review of a finite element comparison of two different shaving systems. Although the two were similar in design, one performed much better than the other in the analysis. The results seemed counterintuitive, but the analyst had prepared an elaborate theory to explain the unusual phenomenon and defend his results. What he forgot to do was check the model input, which would have revealed that he had mistyped a thickness for some of the shell elements in the model. Mystery solved!

These types of errors are inevitable in finite element models – the more complex the model, the higher the probability that they will occur. The important thing is to put a process in place to catch them before they do any damage – the Verification process.

The Verification process can take many different forms, with details depending on the type of analysis to be conducted, the parts to be analyzed, the accuracy required, and the level of risk involved. The analysis must begin with a clear statement of the analysis goals, the required output and accuracy, and the key assumptions to be used.

In the context of these necessary prerequisites, some typical items to check as part of the Verification process for a static structural analysis might include:

  • Geometry – do key model dimensions agree with the actual part dimensions?
  • Does the FE model mass and CG compare well with the actual part?
  • Are the material properties correct and are they properly associated to model regions?
  • For non-isotropic materials, are the principal axes correctly aligned?
  • Are the correct types of elements being used, consistent with their underlying assumptions – e.g., long, slender geometry for shells and beams?
  • Are element properties like shell thickness or beam inertia correct and properly associated to model regions?
  • Is the mesh sufficiently refined to produce the required accuracy?
  • Are there cracks in the mesh?
  • Do the elements pass shape testing criteria?
  • Are nodal temperatures mapped correctly?
  • Are the applied loads correct – location, magnitude, and direction?
  • Are the correct constraints applied – location, magnitude, and direction? Are they sufficient to preclude rigid body motion?
  • Are constraint equations properly defined?
  • Are assemblies properly connected together?
  • Does the model pass free thermal expansion and rigid body motion checks?
  • Does a 1G static load produce the expected reaction forces?
  • Have the errors and warning messages been reviewed and reconciled?
  • Do the reaction forces balance the applied loads in each direction?
  • Are the deformations and stresses physically believable – magnitude and direction? Do they compare well with hand calculations?
  • Are the results consistent with the assumptions – e.g., small deformation or small strain?
  • Are stresses continuous across elements?
  • Are stress gradients through one element excessive?
  • Are natural stress boundary conditions satisfied – e.g., stress normal to a free surface close to zero?

The above list is far from exhaustive, but it provides a decent start for basic verification. Additional items should be added depending on the specifics of the analysis.
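Some of the checks above, such as confirming that the reaction forces balance the applied loads in each direction, lend themselves to simple automation. Here is a minimal sketch of that idea; the function name, tolerance, and the load values are all hypothetical, and in practice the arrays would be extracted from the solver output:

```python
# Sketch: automated reaction-force balance check for a static FE solution.
# All names, values, and the tolerance are hypothetical placeholders; in
# practice the per-axis force sums would come from the solver output file.

def check_force_balance(applied, reactions, rel_tol=1e-3):
    """Return True if summed reactions balance applied loads in each direction."""
    for axis, (f_app, f_rxn) in enumerate(zip(applied, reactions)):
        residual = f_app + f_rxn                     # equilibrium: should be ~0
        scale = max(abs(f_app), abs(f_rxn), 1.0)     # avoid dividing by zero
        if abs(residual) / scale > rel_tol:
            print(f"Axis {axis}: force imbalance of {residual:.4g}")
            return False
    return True

# Hypothetical example: 5000 N applied in -Y, reactions summed from the solver.
applied_loads = [0.0, -5000.0, 0.0]     # total applied force per axis (N)
reaction_forces = [0.0, 5000.2, 0.0]    # summed nodal reactions per axis (N)
print(check_force_balance(applied_loads, reaction_forces))  # → True
```

Scripting checks like this, so they run on every solve rather than only when someone remembers, is one way to make the Verification process routine instead of heroic.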

What kinds of verification do you do in addition to the items above?

More details on verification items can be found in our FEA Best Practices training class and in Reference 1, below. Once the analysis has been properly verified, you can be reasonably confident that the problem was solved correctly. The next step is to see if the correct problem was solved. That, of course, refers to Validation, which will be the subject of my next post.

1. Beattie, G.A. “Management of Finite Element Analysis: Guidelines to Best Practice.” Publication number R0033, Glasgow: NAFEMS, 1995.