A firefighter enters a burning building. Visibility drops to zero as thick smoke fills every corridor. Yet somehow, through the thermal imaging camera mounted on the helmet, the outline of a child becomes visible behind a couch. Minutes later, another firefighter points a thermal camera at a window and sees nothing but a reflection—the glass appears as a solid wall to the infrared sensor. What makes these two scenarios so different?
The answer lies in the fundamental physics of electromagnetic radiation and how different materials interact with different wavelengths of light. Understanding this distinction requires diving into the invisible world of infrared radiation that surrounds every object, every moment of every day.
The Invisible Spectrum
In 1800, astronomer William Herschel made an unexpected discovery while testing different colored glass filters for solar observations. Using a thermometer to measure the heating effect of different colors in a prism-separated spectrum, he noticed the temperature continued rising beyond the red end of visible light. He had discovered infrared radiation—the “invisible rays” that would eventually revolutionize everything from military targeting to building inspections.
Every object with a temperature above absolute zero emits infrared radiation. This isn’t a metaphor or an approximation—it’s a fundamental property of matter. The stovetop burner glows red when hot, but even at room temperature, it’s emitting infrared radiation that human eyes cannot detect. The relationship between temperature and radiation follows the Stefan-Boltzmann law, which states that the total energy radiated increases with the fourth power of temperature:
$$P = \varepsilon \sigma A T^4$$

where $P$ is the radiated power, $\varepsilon$ is emissivity, $\sigma$ is the Stefan-Boltzmann constant, $A$ is surface area, and $T$ is absolute temperature. This fourth-power relationship means that doubling the absolute temperature results in sixteen times more radiated energy.
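The fourth-power scaling is easy to verify numerically. A minimal sketch in Python (the constant is the CODATA value; the 300 K/600 K surfaces are just illustrative):

```python
# Stefan-Boltzmann law: P = ε·σ·A·T⁴
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m⁻²·K⁻⁴

def radiated_power(temp_k, area_m2=1.0, emissivity=1.0):
    """Total power radiated by a surface at absolute temperature temp_k."""
    return emissivity * SIGMA * area_m2 * temp_k**4

# Doubling the absolute temperature gives 2⁴ = 16× the radiated power.
p_300 = radiated_power(300.0)   # room-temperature surface, ≈ 459 W per m²
p_600 = radiated_power(600.0)
print(p_600 / p_300)            # → 16.0
```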
How Thermal Cameras Detect What Eyes Cannot
Traditional cameras capture reflected visible light. A red apple appears red because it reflects red wavelengths while absorbing others. Thermal cameras work differently—they capture emitted infrared radiation. This distinction explains why thermal cameras function in complete darkness: they don’t need an external light source because every object generates its own thermal signature.
Modern thermal cameras primarily use microbolometer arrays to detect this radiation. A microbolometer is essentially a grid of tiny thermal sensors, each pixel functioning as a miniature thermometer. When infrared radiation strikes a pixel, it heats the sensing material, causing a change in electrical resistance. This resistance change gets measured and converted into a temperature value, which then gets mapped to a color in the final image.
```mermaid
graph TB
    subgraph "Microbolometer Pixel Structure"
        A[IR Radiation<br/>8-14 μm] --> B[Absorbing Layer<br/>VOx or a-Si]
        B --> C[Resistance Change]
        C --> D[ROIC<br/>Readout Circuit]
        D --> E[Temperature Value]
        B -.- F[Suspended Bridge<br/>~2 μm gap]
        F -.- G[Thermal Isolation]
        G -.- H[Vacuum Package]
    end
```
The architecture of a microbolometer pixel reveals why thermal cameras achieve relatively low resolutions compared to visible-light cameras. Each pixel must be thermally isolated from its neighbors and the underlying readout circuit. The sensing material hangs suspended on a bridge-like structure approximately 2 micrometers above a silicon substrate, with a vacuum-sealed package to prevent heat loss through convection. This complex structure demands significantly more physical space than the photodiodes in conventional camera sensors.
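The resistance-to-temperature step above can be sketched with a toy linearized model. The nominal resistance R0 and the exact inversion are illustrative assumptions, not a real ROIC design; the TCR value matches the VOx figure discussed later in this article:

```python
# Illustrative model of one microbolometer pixel readout (not a real ROIC):
# absorbed IR heats the pixel, its resistance shifts by TCR per kelvin,
# and the readout inverts that shift to recover the temperature rise.

R0 = 100_000.0   # nominal pixel resistance at reference temperature, ohms (assumed)
TCR = -0.02      # temperature coefficient of resistance, ≈ -2% per kelvin for VOx

def resistance_at(delta_t_k):
    """Pixel resistance after the sensing element heats by delta_t_k kelvin."""
    return R0 * (1.0 + TCR * delta_t_k)

def delta_t_from_resistance(r_ohms):
    """What the readout does conceptually: invert the resistance change."""
    return (r_ohms / R0 - 1.0) / TCR

r = resistance_at(0.05)            # a 50 mK heating of the sensing element
print(delta_t_from_resistance(r))  # recovers the 0.05 K temperature rise
```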
The Wavelength Problem: LWIR vs. Glass
Thermal cameras operate primarily in the long-wave infrared (LWIR) band, typically 8-14 micrometers in wavelength. This specific range aligns with an atmospheric “window” where water vapor and carbon dioxide absorb minimally, allowing clear imaging over distance. More importantly, this wavelength range corresponds to the peak emission of objects at terrestrial temperatures: for a surface near 300 kelvin (27°C), Wien’s displacement law puts the peak emission wavelength at approximately 10 micrometers.
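Wien’s law itself is a one-line calculation. A quick check, using the standard displacement constant of roughly 2898 μm·K (the solar effective temperature of 5772 K is included only for contrast):

```python
# Wien's displacement law: λ_max = b / T, with b ≈ 2898 μm·K
WIEN_B_UM_K = 2897.77

def peak_wavelength_um(temp_k):
    """Wavelength of peak blackbody emission, in micrometers."""
    return WIEN_B_UM_K / temp_k

print(peak_wavelength_um(300.0))   # ≈ 9.66 μm — inside the 8-14 μm LWIR band
print(peak_wavelength_um(5772.0))  # ≈ 0.50 μm — the Sun peaks in visible light
```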
Here’s where the glass problem emerges. Ordinary glass is transparent to visible light (0.4-0.7 micrometers) but essentially opaque to LWIR radiation. The molecular structure of silicon dioxide—the main component of glass—absorbs infrared radiation beyond about 3 micrometers. When a thermal camera points at a window, the glass itself emits radiation based on its own temperature while reflecting radiation from the camera operator and surroundings. The result: thermal cameras cannot see through glass windows.
Smoke, however, presents an entirely different scenario. Smoke particles are typically between 0.1 and 1 micrometer in diameter—smaller than LWIR wavelengths. While these particles scatter and absorb visible light effectively (creating the opaque appearance), they have minimal interaction with the longer infrared wavelengths. The 10-micrometer infrared waves essentially flow around the smoke particles like ocean waves around a pier piling.
This size-dependent scattering relationship follows Mie scattering theory, where particles comparable to the wavelength of light cause maximum scattering. For visible light with wavelengths around 0.5 micrometers, smoke particles scatter light efficiently, creating visibility problems. For 10-micrometer infrared waves, the same smoke particles appear relatively insignificant.
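The usual way to quantify this is the Mie size parameter $x = 2\pi r / \lambda$: scattering is strongest when $x$ is near 1 and falls off sharply when $x \ll 1$. A sketch with a representative 0.25 μm smoke-particle radius (an assumed mid-range value, not a measured figure):

```python
import math

# Mie size parameter x = 2πr/λ: particles comparable to the wavelength
# (x ≈ 1) scatter strongly; particles much smaller than it (x << 1) barely do.

def size_parameter(radius_um, wavelength_um):
    """Dimensionless Mie size parameter for a spherical particle."""
    return 2.0 * math.pi * radius_um / wavelength_um

smoke_radius_um = 0.25  # representative smoke particle radius (assumed)
print(size_parameter(smoke_radius_um, 0.5))   # visible light: x ≈ 3.1 → strong scattering
print(size_parameter(smoke_radius_um, 10.0))  # LWIR: x ≈ 0.16 → nearly transparent
```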
The Two Dominant Sensor Technologies
Within the microbolometer world, two materials dominate the market: vanadium oxide (VOx) and amorphous silicon (a-Si). These aren’t arbitrary choices—they represent fundamentally different approaches with distinct trade-offs.
Vanadium oxide sensors, first developed by Honeywell in the late 1970s, offer higher thermal sensitivity. The temperature coefficient of resistance—the percentage change in resistance per degree of temperature change—reaches approximately -2% per Kelvin for VOx materials. This translates to noise-equivalent temperature differences (NETD) of 20-30 millikelvin, meaning these sensors can distinguish temperature differences smaller than 0.03°C under ideal conditions.
Amorphous silicon sensors emerged about a decade later and offer manufacturing advantages. They integrate more easily with standard CMOS fabrication processes, require lower deposition temperatures, and achieve faster response times. However, their NETD typically falls around 50 millikelvin, and they exhibit higher 1/f noise—a type of noise that increases at lower frequencies and can obscure subtle temperature differences.
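These sensitivity figures imply a remarkably small electrical signal. At the NETD limit, the fractional resistance change the readout must resolve is simply |TCR| × ΔT; plugging in the VOx numbers quoted above:

```python
# Fractional resistance change at the sensitivity limit: ΔR/R = |TCR| × ΔT.
# Values below are the VOx figures quoted in the text (TCR ≈ -2%/K, NETD ≈ 25 mK).

def fractional_resistance_change(tcr_per_k, delta_t_k):
    """Relative resistance change produced by a delta_t_k scene difference."""
    return abs(tcr_per_k) * delta_t_k

vox = fractional_resistance_change(-0.02, 0.025)  # a 25 mK scene difference
print(f"{vox:.6f}")  # 0.000500 → the ROIC must resolve a 0.05% resistance change
```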
| Property | Vanadium Oxide (VOx) | Amorphous Silicon (a-Si) |
|---|---|---|
| NETD (Sensitivity) | 20-30 mK | ~50 mK |
| Market Share | ~70% | ~13% |
| TCR | ~-2%/K | Lower than VOx |
| 1/f Noise | Lower | Higher |
| CMOS Integration | More complex | Easier |
| Response Time | Standard | Faster |
The market reflects these performance differences. VOx technology commands approximately 70% market share, with defense and high-end industrial applications preferring its superior sensitivity. Amorphous silicon holds about 13% of the market, primarily in cost-sensitive commercial applications.
The Emissivity Complication
A thermal camera pointed at a polished metal surface will display misleading temperatures. This phenomenon stems from emissivity—the measure of how efficiently a surface emits thermal radiation compared to a perfect blackbody.
A perfect blackbody has an emissivity of 1.0, emitting the maximum theoretically possible radiation at any temperature. Real materials range widely: asphalt approaches 0.98, human skin around 0.97, while polished aluminum drops below 0.05. Low-emissivity surfaces reflect more infrared radiation from their surroundings than they emit themselves, creating “thermal mirrors” that show the camera operator’s own heat signature rather than the surface temperature.
Professional thermographers must adjust camera settings for different materials or apply high-emissivity tape to problematic surfaces. Without these corrections, a thermal image of electrical connections might show false hotspots or miss genuine problems entirely.
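The correction thermographers apply can be illustrated with a simplified single-reflection radiometric model (atmospheric effects ignored; the 350 K aluminum surface and 293 K room are assumed example values): the camera receives $W = \varepsilon \sigma T_{obj}^4 + (1-\varepsilon)\sigma T_{refl}^4$, so recovering the true surface temperature requires knowing both the emissivity and the reflected background temperature.

```python
# Simplified single-reflection radiometric model (atmosphere ignored):
# received flux W = ε·σ·T_obj⁴ + (1-ε)·σ·T_refl⁴.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W·m⁻²·K⁻⁴

def received_flux(t_obj_k, emissivity, t_reflected_k):
    """Flux reaching the camera: emitted term plus reflected-background term."""
    return emissivity * SIGMA * t_obj_k**4 + (1 - emissivity) * SIGMA * t_reflected_k**4

def corrected_temperature(flux, emissivity, t_reflected_k):
    """Invert the model to recover the true surface temperature."""
    return ((flux / SIGMA - (1 - emissivity) * t_reflected_k**4) / emissivity) ** 0.25

# Polished aluminum (ε ≈ 0.05) at 350 K in a 293 K room:
w = received_flux(350.0, 0.05, 293.0)
apparent = (w / SIGMA) ** 0.25   # what a naive ε = 1 reading would report
print(round(apparent, 1))        # ≈ 297 K — dominated by the reflected room
print(round(corrected_temperature(w, 0.05, 293.0), 1))  # ≈ 350.0 K, the true value
```

This is why a hot, shiny surface can read as nearly room temperature: at ε ≈ 0.05, the reflected-background term swamps the emitted one.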
From Military Secret to Smartphone Accessory
The development of thermal imaging followed a familiar technology trajectory: military origins, gradual declassification, and eventual consumer accessibility. The US military developed single-element infrared detectors in the late 1950s, with Texas Instruments, Hughes Aircraft, and Honeywell leading early efforts. These initial systems required cryogenic cooling—often using liquid nitrogen or Stirling-cycle refrigerators—making them bulky, expensive, and slow to deploy.
Honeywell’s development of the uncooled microbolometer in the late 1970s changed everything. By eliminating the need for cooling systems, thermal cameras could become smaller, cheaper, and more reliable. The US government declassified this technology in 1992, opening the door for commercial applications.
The first thermal imaging camera designed specifically for firefighting appeared in 1998. By 2006, home inspectors and contractors had begun adopting the technology for building diagnostics. In 2014, smartphone-attachable thermal cameras entered the market, with modules like the FLIR Lepton shrinking the sensor to smaller than a dime.
Modern Applications and Remaining Limitations
Thermal imaging now serves roles far beyond its military origins. Building inspectors identify heat leaks and moisture intrusion. Electrical maintenance teams detect overheating connections before failures occur. Medical professionals use thermography for vascular screening and inflammation detection. Firefighters locate victims in smoke-filled rooms. Automotive systems incorporate thermal sensors for night-vision pedestrian detection.
Yet limitations persist. Thermal cameras cannot measure internal temperatures—they only see surface radiation. Atmospheric conditions affect performance, with high humidity attenuating LWIR signals. Reflective surfaces require careful technique. And the resolution gap between thermal and visible-light cameras remains substantial: while smartphone cameras exceed 12 megapixels, high-end thermal cameras typically max out around 1.3 megapixels (1280×1024), with many commercial units operating at 320×240 or lower.
The physics that enables thermal cameras to see through smoke while being blocked by glass—a consequence of wavelength-dependent interactions with matter—continues to define both their capabilities and their constraints. Understanding these principles allows users to deploy the technology effectively, recognizing that thermal imaging reveals temperature, not transparency, and that the invisible world of infrared radiation operates by its own rules.
References
- Herschel, W. (1800). Experiments on the Refrangibility of the Invisible Rays of the Sun. Philosophical Transactions of the Royal Society of London.
- Wikipedia contributors. (2026). Microbolometer. Wikipedia, The Free Encyclopedia.
- Wikipedia contributors. (2026). Thermography. Wikipedia, The Free Encyclopedia.
- FLIR Systems. (2020). How Do Thermal Cameras Work? FLIR Discovery.
- Axiom Optics. (2024). The Differences Between SWIR, MWIR, and LWIR Cameras.
- GST-IR. (2024). Uncooled Infrared Detectors: VOx Vs. α-Si.
- InterNACHI. (2024). The History of Infrared Thermography.
- Optris. (2024). NETD: Noise Equivalent Temperature Difference.
- Stefan, J. (1879). On the relationship between thermal radiation and temperature. Sitzungsberichte der mathematisch-naturwissenschaftlichen Classe der kaiserlichen Akademie der Wissenschaften.
- Boltzmann, L. (1884). Ableitung des Stefan’schen Gesetzes. Annalen der Physik.