How Thermal Imaging Works


Thermal Imaging Heats Up
Sir William Herschel, the astronomer who discovered infrared wavelengths. He’s also credited with discovering the planet Uranus.
©Stock Montage/Getty Images

Thermographic cameras are high-tech, modern-day devices. But the discovery of infrared light came a long, long time ago.

In 1800, a British astronomer named Sir William Herschel discovered infrared. He did so by using a prism to split a ray of sunlight into its different wavelengths and then holding a thermometer near each color of light. He realized that the thermometer detected heat even where there was no visible light -- in other words, in the wavelengths where infrared exists.

Throughout the 1800s, a series of intrepid thinkers experimented with materials that changed in conductivity when exposed to heat. This led to the development of extremely sensitive thermometers, called bolometers, which could detect minute differences in heat from a distance.

Yet it wasn't until after World War II that infrared research really started heating up. Rapid advances took place, in large part thanks to the discovery of transistors, which improved the construction of electronics in a multitude of ways.

These days, infrared cameras fall into two categories: direct detection and thermal detection.

Direct detection imagers are either photoconductive or photovoltaic. Photoconductive cameras employ components that change in electrical resistance when struck by photons of a specific wavelength. Photovoltaic materials are also sensitive to photons, but instead of changing resistance, they change in voltage. Both photoconductive and photovoltaic cameras require intense cooling systems in order to make them useful for photon detection.
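To make that distinction concrete, here's a minimal Python sketch of the two readout styles. The constants and the simple linear responses are illustrative assumptions, not real detector specifications; the point is only that photoconductive sensors lose resistance as photons arrive, while photovoltaic sensors produce a voltage.

```python
# Toy readout models for the two direct-detection approaches described above.
# All constants here are illustrative assumptions, not real device specs.

def photoconductive_readout(photon_flux, dark_resistance_ohms=1e6,
                            sensitivity=0.5):
    """Photoconductive: incident photons lower the detector's resistance.

    Returns the effective resistance in ohms for a given photon flux
    (photons per second, arbitrary scale).
    """
    # More photons -> more charge carriers -> lower resistance.
    return dark_resistance_ohms / (1.0 + sensitivity * photon_flux)


def photovoltaic_readout(photon_flux, responsivity_volts=1e-6):
    """Photovoltaic: incident photons generate a voltage directly.

    Returns the output voltage in volts for a given photon flux.
    """
    # More photons -> larger photo-generated voltage.
    return responsivity_volts * photon_flux


if __name__ == "__main__":
    for flux in (0, 10, 100):
        print(f"flux={flux:>3}: "
              f"R={photoconductive_readout(flux):>10.0f} ohms, "
              f"V={photovoltaic_readout(flux):.2e} V")
```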

By sealing the imager's case and cryogenically cooling its electronics, engineers reduce the chance of interference and greatly extend the detector's sensitivity and overall range. These kinds of cameras are pricey, more prone to failure and expensive to fix. Most imagers don't have integrated cooling systems. That makes them somewhat less precise than their cooled counterparts, but also much less costly.

Thermal detection technology, by contrast, relies on sensors called microbolometers. These don't detect photons. Instead, they pick up on temperature differences by sensing thermal radiation from a distant object.

As microbolometers absorb thermal energy, their sensor elements rise in temperature, which in turn alters the electrical resistance of the sensor material. A processor can interpret these changes in resistance and use the data points to generate an image on a display. These arrays don't need any crazy cooling systems. That means they can be integrated into smaller devices, such as night vision goggles, weapons sights and handheld thermal imaging cameras.
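As a rough illustration of that processing chain, the Python sketch below turns an array of per-pixel resistances into an 8-bit grayscale image. The linear resistance model, the temperature coefficient and the calibration constants are simplified assumptions for this example; a real camera's readout and calibration are far more involved.

```python
import numpy as np

# Assumed toy constants: a real microbolometer's temperature coefficient of
# resistance (TCR) and calibration are device-specific.
R_AMBIENT_OHMS = 100_000.0   # sensor resistance at ambient temperature
TCR = -0.02                  # fractional resistance change per kelvin
                             # (vanadium-oxide bolometers have a negative TCR)

def resistance_to_temperature(resistance_ohms):
    """Invert the linear resistance model to recover temperature rise (K)."""
    # R = R_ambient * (1 + TCR * dT)  ->  dT = (R / R_ambient - 1) / TCR
    return (resistance_ohms / R_AMBIENT_OHMS - 1.0) / TCR

def to_grayscale(resistance_array):
    """Map an array of per-pixel resistances to an 8-bit thermal image."""
    temps = resistance_to_temperature(resistance_array)
    # Normalize: hottest pixel maps to white, coolest to black.
    span = temps.max() - temps.min()
    normalized = (temps - temps.min()) / span if span else np.zeros_like(temps)
    return (normalized * 255).astype(np.uint8)

if __name__ == "__main__":
    # Simulate a 4x4 detector array: a warm "object" in a cool scene.
    temps = np.full((4, 4), 0.1)   # 0.1 K rise across the background
    temps[1:3, 1:3] = 2.0          # 2 K rise where the warm object sits
    resistances = R_AMBIENT_OHMS * (1 + TCR * temps)
    print(to_grayscale(resistances))
```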
