WHAT IS PROCESS TOLERANCE?
Whether it’s the weight of an ingredient or the thickness of paint, no two products are exactly the same. These differences are caused not only by measurement uncertainty but also by tolerance built into the procedure. Process tolerance is part of an age-old struggle between product quality and cost. In the simplest terms, tolerance can represent “good enough.” The phrase “good enough” may send chills down your spine and make you retort, “Only perfection is good enough,” but the use of tolerance is absolutely necessary. You’ll never see it in a marketing campaign, but even high-quality luxury brands rely on this “good enough” factor in virtually every step of their process.
In practice, Process Tolerance defines the minimum and maximum deviation from a target that is allowed in the production of goods. This deviation is generally represented by a +/- value following a reference amount. For example, the procedure to make a sauce may call for 10.0 (+/- 0.5) oz of basil. This means the end products can contain between 9.5 and 10.5 oz and still be considered in tolerance. It’s worth noting that the split does not need to be even; a tolerance can have different upper and lower limits, such as 10.0 (+0.2 -0.5) oz.
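To make the notation concrete, here is a minimal Python sketch of an in-tolerance check that handles both symmetric and asymmetric limits. The function name and the sample measurements are purely illustrative, not part of any standard or library.

```python
# A minimal in-tolerance check (all names and values are illustrative).

def in_tolerance(measured, nominal, plus, minus):
    """Return True if `measured` falls within nominal +plus / -minus."""
    return (nominal - minus) <= measured <= (nominal + plus)

# Symmetric tolerance: 10.0 (+/- 0.5) oz
print(in_tolerance(9.7, 10.0, plus=0.5, minus=0.5))    # True
print(in_tolerance(10.6, 10.0, plus=0.5, minus=0.5))   # False

# Asymmetric tolerance: 10.0 (+0.2 -0.5) oz
print(in_tolerance(10.3, 10.0, plus=0.2, minus=0.5))   # False (over the +0.2 limit)
print(in_tolerance(9.6, 10.0, plus=0.2, minus=0.5))    # True  (within the -0.5 limit)
```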
WHY IS IT IMPORTANT?
Don’t be fooled. Tolerance is not a symptom of being cheap or lazy; it is the exact opposite. Since consistently hitting a desired target exactly is impossible, an acceptable guideline must be established to ensure quality and safety. Generally, a tighter tolerance will increase quality while making the process slower, more expensive, and more prone to waste, with costs often rising sharply as the limits tighten. These costs can greatly impact a company’s success and the price consumers pay.
HOW TO DETERMINE YOUR PROCESS TOLERANCE
When determining a process tolerance, the goal is to balance product quality and consistency with production costs. This is normally achieved by combining Worst-Case Tolerance, Statistical Tolerance, and other factors unique to your process into the same analysis.
Worst-Case Tolerance:
This method sets the upper and lower limits to the absolute largest and smallest ranges based on a recipe or design. It mostly ignores the real-world process and simply focuses on specific “correct” limits. Since this guarantees everything manufactured falls within the ideal or functional range, Worst-Case tends to be the safest way to determine tolerance. Unfortunately, Worst-Case Tolerance can set unrealistic goals and lead to tighter tolerances, increased costs, and increased waste.
Statistical Tolerance:
Statistical Tolerance is determined based on the desired target, standard deviation, and distribution. In simpler terms, this method sets the upper and lower limits based on the spread of amounts actually recorded while aiming for the target amount. Using Statistical Tolerance tends to be far more realistic than Worst-Case Tolerance but can lead to additional quality issues. If left uncontrolled, the process can slowly drift away from the center, or an assumption can be made that this is the best quality that can be produced.
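As a rough illustration of the statistical approach, the sketch below derives limits from the mean and standard deviation of recorded measurements using the common mean +/- 3 sigma convention. The sample weights and the choice of three standard deviations are assumptions for the example, not values from any real process.

```python
import statistics

# Hypothetical basil weights (oz) recorded while aiming for a 10.0 oz target.
samples = [10.1, 9.9, 10.2, 9.8, 10.0, 10.3, 9.7, 10.1, 9.9, 10.0]

mean = statistics.mean(samples)    # process average
sigma = statistics.stdev(samples)  # sample standard deviation

# A common convention: set statistical limits at the mean +/- 3 standard deviations.
k = 3
lower_limit = mean - k * sigma
upper_limit = mean + k * sigma

print(f"mean = {mean:.2f} oz, sigma = {sigma:.2f} oz")
print(f"statistical tolerance: {lower_limit:.2f} oz to {upper_limit:.2f} oz")
```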
Other Possible Considerations:
Effect on the final product
Cost of ingredient/component
Chance of error
Rework costs
Effects on production speed
Equipment limitations
Let’s look at our imaginary sauce recipe from earlier. If we conduct a taste test and conclude the ideal sauce has 10.0 oz of basil, this becomes our reference point. During our test, we also find the sauce tastes virtually the same between 9.8 oz and 10.2 oz. The initial instinct may be to set the tolerance to +/- 0.2 oz. This would be setting the process tolerance strictly on Worst-Case Tolerance.
As discussed earlier, it is also important to account for Statistical Tolerance. For example, let’s imagine an average of 30% of all sauce produced falls outside of the +/- 0.2 oz range. Let’s also imagine 100% falls within a +/- 0.5 oz range, and the sauce contains 10.0 oz of basil on average. If the sauce tastes slightly different but receives an overall positive rating between 9.5 oz and 10.5 oz, is it worth reworking or throwing out 30% of the sauce produced? If the answer is no, it may be a good idea to expand the tolerance to +/- 0.5 oz.
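To see how numbers like these can arise, here is a small simulation sketch. The assumed process spread is chosen so that roughly 30% of fills miss the +/- 0.2 oz window, matching the imaginary figures above; everything in it is hypothetical.

```python
import random

random.seed(0)

TARGET = 10.0      # ideal basil weight (oz)
SIGMA = 0.19       # assumed spread of the filling process (oz)
BATCHES = 100_000  # number of simulated batches

fills = [random.gauss(TARGET, SIGMA) for _ in range(BATCHES)]

def share_out_of_tolerance(values, plus, minus, target=TARGET):
    """Fraction of fills outside target +plus / -minus."""
    misses = sum(1 for v in values if v > target + plus or v < target - minus)
    return misses / len(values)

# With this assumed spread, roughly 30% miss +/- 0.2 oz while nearly all land within +/- 0.5 oz.
print(f"outside +/- 0.2 oz: {share_out_of_tolerance(fills, 0.2, 0.2):.1%}")
print(f"outside +/- 0.5 oz: {share_out_of_tolerance(fills, 0.5, 0.5):.1%}")
```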
The two traditional methods are only a guide, not the absolute answer. They may not adequately capture other considerations specific to your process. What if expanding your tolerance to +/- 0.7 oz could increase production speed by 10%? What if the price of basil skyrockets? Do you essentially give away money by adding more than the 10.0 oz needed for an ideal taste? The point is, it’s complicated. Ultimately, it is up to you to determine what is most valuable and feasible for your company.
YOU’VE DETERMINED A TOLERANCE. NOW WHAT?
Once an ideal tolerance has been set, it is critical to make sure your equipment can repeatably and accurately hold it. For years, the rule of thumb has been the 10:1 rule, or Rule of 10. This rule dictates that a measurement instrument must be 10 times as accurate as the characteristic it is measuring. However, as quality and technology advance, tolerances are becoming so tight that following the 10:1 rule is not always possible. When the 10:1 rule is not feasible, the minimum recommendation is to have measuring instruments that can accurately measure your Worst-Case tolerances.
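As a quick sanity check, the 10:1 rule can be expressed as a simple comparison between the total tolerance band and the instrument’s accuracy; the band and accuracy figures below are hypothetical.

```python
# A minimal sketch of the 10:1 rule of thumb (values are hypothetical).

def meets_rule_of_10(tolerance_band, instrument_accuracy, ratio=10):
    """True if the instrument is at least `ratio` times finer than the tolerance band."""
    return instrument_accuracy <= tolerance_band / ratio

# A 10.0 (+/- 0.5) oz spec has a total tolerance band of 1.0 oz.
band = 1.0
print(meets_rule_of_10(band, instrument_accuracy=0.1))   # True  (exactly 10:1)
print(meets_rule_of_10(band, instrument_accuracy=0.25))  # False (only 4:1)
```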
It is worth noting that just because a piece of equipment displays a value does not mean it is accurate to that value. If you need help determining your process tolerance or assessing your measurement equipment, we offer free services like our Tool Roundup Program to guide you on that journey.