
Author: Geym

Feb. 04, 2024


Many companies have trouble getting product accepted to specification simply because they fail to apply the ten-to-one rule when choosing a measurement instrument to verify conformance to those specifications.

Simply stated, the “Rule of Ten” (or “ten-to-one” rule) says that the discrimination (resolution) of the measuring instrument should divide the tolerance of the characteristic to be measured into ten parts. In other words, the gage or measuring instrument should be ten times as accurate as the characteristic to be measured. Many believe this applies only to the instruments used to calibrate a gage or measuring instrument, when in reality it applies to the choice of instrument for any measuring activity. The whole idea is to choose an instrument that is capable of detecting the amount of variation present in a given characteristic.
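As a quick worked example, the rule reduces to a one-line calculation (the tolerance figure below is hypothetical):

```python
def required_resolution(tolerance: float, ratio: int = 10) -> float:
    """Rule of Ten: the gage should divide the tolerance into `ratio` parts."""
    return tolerance / ratio

# A characteristic toleranced at +/-0.005" has a total tolerance of 0.010",
# so the chosen gage should resolve 0.001" or finer.
print(required_resolution(0.010))  # 0.001
```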

If we were to plot on a run chart the values obtained from a gage whose resolution is one-to-one, or even two-to-one, relative to the part tolerance, the graph would show an almost straight line. This is because the instrument is not capable of detecting the inherent normal variation that exists in the parts.
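A small simulation makes the effect visible (all values here are made up): the same fifty parts are read at three resolutions, and the coarse gages collapse the normal process variation into one or two repeated readings, the flat line described above.

```python
import random
import statistics

random.seed(1)
tolerance = 0.010  # total tolerance of the characteristic (made up)
# True part deviations from nominal, with normal process variation:
true_sizes = [random.gauss(0.0, tolerance / 6) for _ in range(50)]

def read(value: float, resolution: float) -> float:
    """Simulate a gage display that rounds to its resolution."""
    return round(value / resolution) * resolution

for resolution in (tolerance, tolerance / 2, tolerance / 10):
    readings = [read(v, resolution) for v in true_sizes]
    print(f"resolution {resolution:.4f}: "
          f"{len(set(readings))} distinct readings, "
          f"spread {statistics.stdev(readings):.5f}")
```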

To achieve reliable measurement, the instrument needs to be accurate enough to accept all good parts and reject all bad parts; conversely, the gage should neither reject good parts nor accept bad ones. The real problem arises when your company uses an instrument that is only accurate enough to measure in thousandths and accepts parts on that basis, while the customer uses gages that discriminate to ten-thousandths and rejects parts sent to them for being .0008” over the specification limit.
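That supplier/customer disagreement can be made concrete with a short simulation. As an assumption, gage error is modeled here as uniform within half the display resolution; the limit and part size are hypothetical:

```python
import random

random.seed(2)
usl = 1.0000        # upper specification limit (hypothetical)
true_size = 1.0008  # the part really is .0008" oversize

def read(value: float, resolution: float) -> float:
    """One reading: true size plus gage error, rounded to the display."""
    observed = value + random.uniform(-resolution / 2, resolution / 2)
    return round(observed / resolution) * resolution

trials = 10_000
supplier = sum(read(true_size, 0.001) <= usl for _ in range(trials)) / trials
customer = sum(read(true_size, 0.0001) <= usl for _ in range(trials)) / trials
print(f"supplier (thousandths gage) accepts:     {supplier:.0%}")  # roughly 20%
print(f"customer (ten-thousandths gage) accepts: {customer:.0%}")  # 0%
```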

Any company that controls its processes through statistical tools will find it very difficult to meet acceptable SPC index levels if the data it collects is based on numbers from gages that cannot reflect the normal variation present in the process.
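For reference, the capability indices such SPC programs report are simple functions of the measured data, so gage resolution feeds straight into them. A minimal sketch, with made-up readings and specification limits:

```python
from statistics import mean, stdev

readings = [1.001, 0.999, 1.002, 1.000, 0.998, 1.001, 1.000, 0.999]
usl, lsl = 1.005, 0.995  # hypothetical specification limits

mu, sigma = mean(readings), stdev(readings)
cp = (usl - lsl) / (6 * sigma)               # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability including centering
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
# If the gage read every part the same, sigma would be zero and
# the indices would be meaningless -- the flat run chart again.
```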

One statistical tool used to test the worthiness of a gage to control the production process is the Gage R&R (Repeatability & Reproducibility) study. Repeatability is the ability of one operator to achieve the same results when measuring the same dimension over repeated trials. Reproducibility is the ability of multiple operators to achieve the same results when using the same gage to measure the same dimension on the same parts over repeated trials.

A gage is judged acceptable for the task at hand when the results of the study meet the following criteria:

10% of the total tolerance (or process variation) or less = the gage is acceptable.
11-30% = acceptable only depending on the application, and must be closely monitored.
31% or more = the gage is unacceptable for use on this application.
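As a sketch of where that percentage comes from, here is a minimal version of the AIAG average-and-range method, computed against tolerance. The study data is made up (2 operators, 3 parts, 2 trials), and K1 = 0.8862 / K2 = 0.7071 are the published constants for 2 trials and 2 operators:

```python
import math
from statistics import mean

# Made-up study data: trials[operator][part] -> list of repeat readings
trials = {
    "A": {1: [1.002, 1.001], 2: [0.998, 0.999], 3: [1.003, 1.003]},
    "B": {1: [1.001, 1.000], 2: [0.999, 0.997], 3: [1.004, 1.002]},
}
n_parts, n_trials = 3, 2
K1, K2 = 0.8862, 0.7071  # AIAG constants for 2 trials / 2 operators

# Repeatability (EV): average range of each operator's repeat readings
ranges = [max(r) - min(r) for op in trials.values() for r in op.values()]
EV = mean(ranges) * K1

# Reproducibility (AV): spread between operator averages, net of repeatability
op_means = [mean(x for part in op.values() for x in part) for op in trials.values()]
x_diff = max(op_means) - min(op_means)
AV = math.sqrt(max((x_diff * K2) ** 2 - EV**2 / (n_parts * n_trials), 0.0))

GRR = math.sqrt(EV**2 + AV**2)
tolerance = 0.010  # hypothetical total tolerance of the characteristic
print(f"%GRR = {100 * GRR / tolerance:.1f}%")  # judge against the thresholds above
```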

It is often quite difficult to pass a gage R&R study even when the ten-to-one rule is used. So, to give yourself the advantage to begin with, start by choosing a gage that is accurate to one-tenth of the tolerance of the characteristic to be measured.

To the uninitiated, the terms accuracy, repeatability and precision might appear to mean much the same thing. However, in the world of instrumentation there are distinct differences between them, making it important to understand what each one means and how it relates to the others.


Let’s start by looking at their definitions.


Accuracy

The accuracy of an instrument is determined by the difference between a measured value and its actual (true) value. As no measurement is 100% exact, an element of inaccuracy needs to be considered, hence the reason accuracy figures are quoted with ‘±’. Ultimately, accuracy measures how close you come to the correct result. Your accuracy improves when your instruments or tools are calibrated properly.


Precision

Precision describes how closely repeated measurements of a value agree with each other, rather than with the actual value. Precision improves when using finely incremented tools that require less estimation; better equipment and improved procedures mean better precision.


Repeatability

Repeatability measures how close a particular result or set of data is to the same measurement made with the same device or instrument under the exact same circumstances. In other words, the measurement procedure, observer, device or instrument, testing conditions and location all need to be exactly the same, and testing needs to be conducted over a short space of time.
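These definitions reduce to two simple statistics on repeated readings of a known reference standard. A minimal sketch, assuming an invented certified value and invented readings:

```python
from statistics import mean, stdev

reference = 25.000  # certified value of the standard being measured (hypothetical)
# One operator, same device, same conditions, short time span:
readings = [25.002, 25.003, 25.001, 25.003, 25.002]

bias = mean(readings) - reference  # accuracy: closeness to the true value
spread = stdev(readings)           # precision (here, repeatability): agreement of the readings
print(f"bias = {bias:+.4f}, spread = {spread:.4f}")
# These readings are precise (tight spread) but not accurate (a consistent
# +0.002 bias), which is exactly the case that proper calibration corrects.
```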


Illustrating the difference

Getting to grips with these definitions and how they relate to each other can be quite complicated. To make things easier to understand, consider the simple diagrams below.

[Diagram: Are accuracy, precision and repeatability the same thing?]
