IR devices, such as the Fluke 61 and 65 IR thermometers, make it safe to take surface temperature measurements of items like rotating, hard-to-reach, electrically live, or dangerously hot targets. For preventive maintenance tasks, they cut measurement time to almost zero with the ability to take a surface temperature reading in less than one second.
IR thermometers can be used to conduct a wide variety of measurements.
Proper Use Of IR Technology
Although IR temperature measurement will never be as accurate as a calibrated contact temperature device, a typical reading will be within 2 degrees F of the absolute temperature when the instrument is properly applied.
Putting IR technology to use is easy but there are two critical parameters that must be understood to ensure proper and consistent temperature measurements with infrared-type devices:
Optical Resolution
Optical resolution refers to the sample area the IR meter is measuring at a given distance (see Figure 1). Optical resolution is also referred to as the "distance-to-spot-size ratio" or "field-of-view."
Know your application! A device with a 4:1 optical resolution cannot effectively be used to measure an item's temperature 15 feet away - even if the laser beam sight can go that far. On the other hand, the 4:1 ratio gives a smaller minimum spot size than the 10:1 (i.e., the 4:1 can sample a smaller spot than the 10:1).
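The distance-to-spot arithmetic above can be sketched in a few lines of Python. The function names are illustrative only; the numbers mirror the 4:1 and 10:1 ratios discussed in the text:

```python
def spot_diameter(distance, ratio):
    """Diameter of the area an IR thermometer samples at a given distance,
    for an instrument with the given distance-to-spot (D:S) ratio."""
    return distance / ratio

def max_distance(target_diameter, ratio):
    """Farthest distance at which the sampled spot still fits the target."""
    return target_diameter * ratio

# At 12 inches, a 4:1 instrument samples a 3-inch spot,
# while a 10:1 instrument samples a 1.2-inch spot.
print(spot_diameter(12, 4))    # 3.0
print(spot_diameter(12, 10))   # 1.2

# At 15 feet (180 inches), the 4:1 instrument's spot balloons to 45 inches,
# which is why it cannot reliably measure a small target at that range.
print(spot_diameter(180, 4))   # 45.0
```

If the spot is larger than the target, the reading blends in the temperature of whatever surrounds the target, which is exactly the erroneous-reading scenario described above.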
Try to determine how you are going to apply the IR thermometer before purchasing, and then buy the one that provides the appropriate optical resolution for the application. Many erroneous readings are taken because the technician unknowingly samples a larger area than the object he is trying to measure.
Emissivity
Emissivity indicates the ability of an object to emit infrared energy. Emissivity is based upon the material from which the object is constructed and the surface finish. Values can range from less than 0.1 for a highly reflective body to 1.0 for an ideal black body (see Figure 2).
Items such as soft-drawn copper are very smooth and shiny even under a microscope, while other objects such as lacquer paint appear quite porous under the microscope. The porous object will have a relatively high emissivity (typically 0.7 to 0.98), while new soft-drawn copper (shiny, not oxidized) will have a low emissivity (typically below 0.2). Shiny objects have a tendency to reflect IR energy from objects surrounding them, which dilutes the IR energy from the measured object. A porous body tends to absorb surrounding IR energy, thus emitting its IR energy without dilution (like a black body).
Low-cost IR measurement instruments (under $400) are typically fixed at 0.95 emissivity (the Fluke 61 and 65 IR thermometers have a 0.95 emissivity). To get an effective absolute temperature reading, the surface being measured must have an emissivity close to 0.95. This can be accomplished by measuring a surface that is not too reflective. A shiny surface can be coated with black paint, electrical tape, felt pen, or anything else that will be less reflective.
If a 0.95 fixed-emissivity IR instrument is used to measure an object whose emissivity is not close to 0.95, the reading will be incorrect.
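The size of that error can be sketched with a simplified radiometric model. This is an assumption-laden illustration, not the instrument's actual internal algorithm: it treats the detector as responding to total blackbody radiance (Stefan-Boltzmann, proportional to T to the fourth power in kelvin) and ignores reflected ambient radiation, which matters greatly for shiny targets:

```python
def corrected_temp_f(indicated_f, actual_emissivity, set_emissivity=0.95):
    """Estimate the true surface temperature from an IR reading taken with
    the wrong emissivity setting.

    Simplified model: set_e * T_indicated^4 = actual_e * T_true^4 (in kelvin),
    ignoring reflected ambient radiation. Illustrative only.
    """
    t_ind_k = (indicated_f - 32) * 5 / 9 + 273.15          # F -> K
    t_true_k = t_ind_k * (set_emissivity / actual_emissivity) ** 0.25
    return (t_true_k - 273.15) * 9 / 5 + 32                # K -> F

# When the target's emissivity matches the 0.95 setting, no correction applies.
print(round(corrected_temp_f(68.0, 0.95), 1))

# A shiny copper target (emissivity roughly 0.2) emits far less than the
# instrument assumes, so the indicated value badly understates reality
# (and in practice the reflected surroundings distort it further).
print(round(corrected_temp_f(68.0, 0.2), 1))
```

Even under this simplified model, the mismatch for bare copper is enormous, which is why the practical fix in the field is not math but tape or paint: give the surface an emissivity near 0.95 and read it directly.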
The Effect Of Incorrect Application
Understanding the optical resolution and emissivity ratings of your IR thermometer and the target you plan to measure will help you avoid inaccurate measurements. The example below illustrates how an incorrect application can lead to inaccurate results.
A technician needs to take a temperature measurement on an air conditioning system's new chill water line to calculate the efficiency of the system's heat exchanger. The technician has just purchased his first IR thermometer and is anxious to put it to use, so he decides to compare it against his digital contact thermometer.
Attempt # 1:
The technician already knows the proper line temperature of the chill water line is around 55 degrees F. He holds the IR thermometer 12 inches away from the line and reads a temperature of 72 degrees F. He moves the thermometer to within three inches of the target and the measurement on the IR thermometer lowers to 68 degrees F. A slight improvement, but still not close to the expected value.
In this first attempt, the technician has adjusted his distance from the target to fall within the instrument's optical resolution, but has not accounted for emissivity differences between the instrument rating and the material.
Attempt # 2:
Ten minutes later, the technician returns to the air conditioning unit with his IR instruction sheet and some black electrical tape in hand. He applies a few pieces of the electrical tape to cover the supply and discharge of the shiny copper chill water line. Upon remeasuring at a distance of three inches with the IR thermometer targeted at the electrical tape, the thermometer registers 56 degrees F, which is within the accuracy specifications for the instrument.
He then measures the discharge at the heat exchanger, which reads 72 degrees F. With the two temperature readings, he can now calculate the heat removal rate on the heat exchanger, which is the measure of the system's efficiency. The technician leaves the jobsite with a better understanding of IR technology and the constraints of optical resolution and fixed emissivity.
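The heat-removal calculation the technician performs can be sketched with the standard water-side formula Q = 500 x GPM x delta-T (BTU/hr), where the constant 500 folds together water's density, specific heat, and 60 minutes per hour. The article does not state the flow rate, so the 10 GPM below is a purely hypothetical value for illustration:

```python
def heat_removal_btu_per_hr(gpm, supply_f, return_f):
    """Water-side heat transfer rate in BTU/hr.

    Q = 500 * GPM * deltaT, the standard HVAC rule of thumb for water
    (500 = 8.33 lb/gal * 60 min/hr * 1 BTU/lb-F).
    """
    return 500 * gpm * abs(return_f - supply_f)

# Using the article's 56 F supply and 72 F discharge readings with a
# hypothetical 10 GPM flow: 500 * 10 * 16 = 80,000 BTU/hr.
print(heat_removal_btu_per_hr(10, 56, 72))  # 80000
```

Note that the accuracy of Q depends directly on the accuracy of both temperature readings, which is why the emissivity fix in Attempt #2 mattered: a 12-degree error on the supply reading would have roughly doubled the apparent delta-T.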
Reprinted with permission from the Fluke Corp. Application Note "Non-contact temperature measurements using IR thermometers." For more information, visit www.fluke.com.
Publication date: 05/30/2005