Temperature and Humidity Environmental Conditions for Calibration


Upload: mtcengineering

Post on 01-Nov-2014



If you base your considerations on the old MIL-STD-45662, section 5.3, Environmental controls, states: “Measuring and test equipment and measurement standards shall be calibrated and utilized in an environment controlled to the extent necessary to assure continued measurements of required accuracy giving due consideration to temperature, humidity, vibration, cleanliness, and other controllable factors affecting precision measurement. When applicable, compensating corrections shall be applied to calibration results obtained in an environment which departs from standard conditions.”

That is extremely vague, but the (then) NBS suggested 68 to 72 °F at 30% humidity.

With the move to ISO 10012-1 and ANSI Z540-1, additional considerations were added.

The most widely accepted standard conditions for almost all dimensional calibrations are 20 °C ± 3 °C (68 °F ± 5 °F) with <50% humidity. (Personally, I prefer 65 °F minimum to about 75 °F maximum, with <40% humidity.)

Now is where things get a bit tricky: to control uncertainties, adjustments must be considered as the temperature varies.

So, in a nutshell, gauge blocks and instruments grow as temperature increases. They grow (assuming steel gauge blocks and micrometer spindles) at a rate of 11.5 µm/m/°C (6.4 µin/in/°F), ± about 10%. Therefore, with each °C of departure from 20 °C (68 °F is considered nominal), we add an uncertainty of 2.3 µm/m (about 0.06 µm per 25 mm).

Example: an uncertainty on a micrometer of ±2.5 µm (0.0001") over a 25 mm (1") span would dictate that you not stray from 20 °C by more than about 5 °C (68 °F ± 9 °F). If your micrometers are less accurate, you can stray more; if they are more accurate, you need tighter environmental control. This all depends on the base uncertainty of the company calibrating your tools, because if they start high, it leaves less room for temperature error.
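The example above can be turned around into "how far may the temperature stray?" The sketch below is my own arithmetic, assuming uncompensated expansion of the full steel span and assuming (my choice, to roughly match the ~5 °C figure quoted above) that temperature is allotted half the uncertainty budget:

```python
# How far can the lab stray from 20 degC before thermal growth of the
# measured span consumes a chosen share of the stated uncertainty?

ALPHA_STEEL = 11.5e-6  # 1/degC

def allowable_dt_c(uncert_um: float, span_mm: float, budget_frac: float = 0.5) -> float:
    """Max deviation from 20 degC such that thermal growth of span_mm
    stays within budget_frac of uncert_um. budget_frac is an assumption."""
    growth_per_degc_um = ALPHA_STEEL * (span_mm / 1000.0) * 1e6  # um per degC
    return (uncert_um * budget_frac) / growth_per_degc_um

# The post's example: +/-2.5 um over 25 mm, half the budget for temperature:
dt = allowable_dt_c(2.5, 25.0)   # ~4.3 degC, in line with the ~5 degC above
```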

The biggest worry is usually the rate of change, because the mass of a gauge block and the mass of a set of calipers or a micrometer spindle are very different. If your environment can't stay within about 2 °C/hour, calibrations start giving false measurements. If the temperature of the instrument and the temperature of the standard differ by much, errors add up quickly (hence the little chunks of plastic on a micrometer, known as 'heat shields', meant to slow heat transfer from your hand to the micrometer). Calipers are more forgiving (because of the roughly 10:1 accuracy difference between a micrometer and a caliper), and you can generally get away with about 20 °C ± a city block.
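If you log your lab temperature, the ~2 °C/hour guideline above is easy to check automatically. A hypothetical helper (my own, not from the post):

```python
# Flag intervals in an hourly temperature log where the rate of change
# exceeds the ~2 degC/hour guideline discussed above.

def too_fast(readings_c: list[float], limit_c_per_hr: float = 2.0) -> list[int]:
    """readings_c holds one reading per hour; return the indices of readings
    whose hour-over-hour change from the previous reading exceeds the limit."""
    return [i for i in range(1, len(readings_c))
            if abs(readings_c[i] - readings_c[i - 1]) > limit_c_per_hr]

log = [20.0, 20.5, 23.0, 22.8, 20.1]   # degC, hourly readings
bad = too_fast(log)                     # flags the 20.5->23.0 and 22.8->20.1 swings
```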


Humidity does not affect caliper or micrometer calibration, although taking gage blocks into an environment with >60% humidity sets conditions that will make ANY tool rust quickly.

Here are some figures from a couple of readily available documents:

NCSL Recommended Practice RP-14 (March 1999), Guide to Selecting Standards-Laboratory Environments (NCSL International, Boulder, CO, USA).

ISA Recommended Practice ISA-RP52.1-1975 (June 1975), Recommended Environments for Standards Laboratories (ISA, Research Triangle Park, NC, USA).

TEMPERATURE NOTES:

Dimensional, Optical and Mass
  NCSL: 20 °C +/- 0.5 °C for general calibrations
  ISA: 20 °C +/- 1 °C overall and +/- 0.3 °C at the point of measurement

All other disciplines
  NCSL: 23 °C +/- 2 °C for general calibrations
  ISA: 23 °C +/- 1.5 °C

RELATIVE HUMIDITY NOTES:

Dimensional, Optical and Mass
  NCSL: 40% +/- 5% RH at 20 °C
  ISA: 45% RH maximum at 20 °C

All other disciplines
  NCSL: 40% +/- 5% RH at 23 °C
  ISA: 20 - 55% RH at 23 °C

Each of these recommended practices contains a lot of other information, of course. Both are targeted at "standards" laboratories; if you are in a company's production lab, this is the type of lab to which you would send your master standards. In both cases, the values above are the most relaxed, but they still might be tighter than what you want to maintain.
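If you monitor your environment, a reading can be checked against these limits programmatically. A minimal sketch (my own illustration), using the NCSL dimensional/optical/mass figures listed above:

```python
# Check a lab reading against the NCSL limits for dimensional/optical/mass
# work quoted above: 20 +/- 0.5 degC and 40 +/- 5 % RH.

NCSL_DIMENSIONAL = {"temp_c": (19.5, 20.5), "rh_pct": (35.0, 45.0)}

def in_spec(temp_c: float, rh_pct: float, limits=NCSL_DIMENSIONAL) -> bool:
    """True if both temperature and relative humidity fall inside limits."""
    t_lo, t_hi = limits["temp_c"]
    h_lo, h_hi = limits["rh_pct"]
    return t_lo <= temp_c <= t_hi and h_lo <= rh_pct <= h_hi

in_spec(20.2, 41.0)   # within both bands
in_spec(21.0, 41.0)   # fails: temperature is outside 20 +/- 0.5 degC
```

Swapping in a different limits dict covers the ISA figures or your own house limits.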


Measurements using gage blocks: 20 °C +/- 0.5 °C and 10 - 45% RH
Other dimensional/optical/mass: 23 °C +/- 3 °C and 20 - 60% RH
Electrical/Electronic/RF/Microwave: 23 °C +/- 5 °C and 20 - 60% RH

Other considerations are: Do you perform all calibrations in the same area? Is there an extreme difference between the inspection area and the manufacturing area? Do you have a method to monitor your environment so you can plan or schedule calibration accordingly?

The NIST web site is an excellent source of gage calibration data. Here are a couple of items that may be of great use: the State Weights and Measures Laboratory Handbook (NIST Handbook 143) and the Gage Block Handbook (NIST Monograph 180).

The final item to consider, which isn't written anywhere, is common sense. If it's raining cats and dogs and 98 degrees out, you would reconsider checking plastic parts with ±.001" tolerances. On the other hand, if you're measuring ±.030", the thermal-expansion influence would be negligible.
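That last point is easy to quantify. A sketch with my own assumed numbers (roughly 70 µin/in/°C is a typical ballpark coefficient for plastics, and the 10% threshold is the usual 10:1 rule of thumb, neither taken from the post):

```python
# Is thermal expansion significant next to the part tolerance on a hot day?

def expansion_in(length_in: float, alpha_per_c: float, temp_f: float) -> float:
    """Expansion, in inches, of a part at temp_f relative to 68 degF."""
    dt_c = (temp_f - 68.0) * 5.0 / 9.0        # degF departure -> degC
    return alpha_per_c * length_in * dt_c

growth = expansion_in(1.0, 70e-6, 98.0)       # ~0.0012" on a 1" plastic part
significant_at_001 = growth > 0.1 * 0.001     # exceeds 10% of a +/-0.001" tolerance
significant_at_030 = growth > 0.1 * 0.030     # negligible against +/-0.030"
```

At 98 °F the expansion alone is on the order of the whole ±.001" tolerance, which is exactly the author's point.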

Tony Casillas

Sr. Engineer