Every object above absolute zero glows. Turn up the temperature and the glow gets brighter and bluer — its spectrum shifts toward shorter wavelengths. Here are the three laws that describe it — and the mystery those laws eventually solved.
Radiation is one of the three ways heat gets from A to B (the other two being conduction and convection). Unlike the other two, it doesn't need a medium — the Sun heats the Earth through 150 million km of empty space. The hot body emits electromagnetic waves, the cold body absorbs them, and energy flows from hot to cold. This is the topic of Physics 25, chapter 14.7.
An object that is perfect at absorbing and re-emitting EM radiation at every wavelength is called a blackbody. It has emissivity \(\varepsilon = 1\). Real objects are worse: polished silver is a poor emitter (\(\varepsilon \approx 0.02\)), tungsten filaments sit around \(\varepsilon \approx 0.5\), human skin in the infrared is almost ideal (\(\varepsilon \approx 0.97\)), and carbon black is essentially a blackbody (\(\varepsilon \approx 0.99\)).
Three famous laws govern thermal radiation from a hot surface. They fit together like this:
Stefan–Boltzmann law — how total radiated power scales with temperature:
$$ P \;=\; \sigma\,\varepsilon\,A\,T^{4}, \qquad \sigma = 5.67 \times 10^{-8}\;\tfrac{\mathrm{W}}{\mathrm{m^{2}K^{4}}}. $$

Wien's displacement law — where the peak of the spectrum sits:

$$ \lambda_\text{max}\, T \;=\; b, \qquad b = 2.898 \times 10^{-3}\;\mathrm{m\cdot K}. $$

Planck's law — the shape of the whole spectrum:

$$ B(\lambda, T) \;=\; \frac{2hc^{2}}{\lambda^{5}}\;\frac{1}{e^{hc/(\lambda k_B T)}-1}. $$

And for an object at temperature \(T\) surrounded by a cooler environment at \(T_\text{surr}\), the net rate of heat flow out of the object is

$$ \frac{Q_\text{net}}{t} \;=\; \sigma\,\varepsilon\,A\,\bigl(T^{4} - T_\text{surr}^{4}\bigr). $$

Each interactive block below lets you play with one of these laws. Stefan–Boltzmann tells you how much energy leaves. Wien tells you what color the glow is. Planck tells you the whole story — and in doing so, launched quantum mechanics.
Move the three sliders. The sphere on the left is a chunk of matter with surface area \(A\), emissivity \(\varepsilon\), and absolute temperature \(T\). Watch what \(P\) does as you change each one. The fourth-power dependence is dramatic: double the temperature and power jumps by a factor of 16. That's why the Sun (5800 K) shines so much more fiercely than red-hot iron (1000 K), even though the ratio of surface temperatures is only about six.
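The \(T^{4}\) scaling is easy to check numerically. A minimal sketch (the helper name `radiated_power` and the sample area/emissivity values are illustrative, not part of the interactive demo):

```python
# Stefan–Boltzmann: total radiated power P = sigma * eps * A * T^4.
SIGMA = 5.67e-8  # W / (m^2 K^4)

def radiated_power(T, area=1.0, emissivity=1.0):
    """Total power (W) radiated by a surface at absolute temperature T (K)."""
    return SIGMA * emissivity * area * T**4

# Doubling T multiplies P by 2^4 = 16:
print(radiated_power(600.0) / radiated_power(300.0))   # 16.0

# Sun (5800 K) vs red-hot iron (1000 K): temperature ratio ~5.8,
# but power-per-area ratio ~5.8^4, roughly 1130.
print(radiated_power(5800.0) / radiated_power(1000.0))
```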
How is the radiated power spread across wavelengths? Late-19th-century physicists tried to answer this using classical electromagnetism, dividing the EM modes inside a hot cavity evenly among wavelengths. That approach (the Rayleigh–Jeans law) gave
$$ B_\text{RJ}(\lambda, T) \;=\; \frac{2ck_BT}{\lambda^{4}}. $$

Fine at long wavelengths — it matched the experiments — but as \(\lambda \to 0\), it blows up to infinity. Why does Rayleigh–Jeans fail at short wavelengths? Classical equipartition hands every electromagnetic mode in the cavity the same average thermal energy \(k_BT\), no matter its frequency. But the number of available modes per unit wavelength grows as \(1/\lambda^{4}\) — squeeze to shorter \(\lambda\) and you get vastly more modes, each still carrying \(k_BT\). Summed up, the classical prediction sends the short-wavelength intensity to infinity. Real hot bodies obviously don't radiate infinite ultraviolet, so something was badly wrong with classical physics. This embarrassment was christened the ultraviolet catastrophe.
The quantum fix, as you'll see below, is that a short-wavelength mode costs a discrete photon of energy \(hc/\lambda\). When \(hc/\lambda \gg k_BT\), the Boltzmann factor \(e^{-hc/(\lambda k_B T)}\) becomes tiny — thermal kicks simply can't afford to populate those modes, so Planck's curve plunges toward zero instead of diverging.
Max Planck fixed it in 1900 by forcing the oscillators inside the body to have quantized energies \(E_n = nhf\). The extra rule suppresses the short-wavelength modes (a photon \(hc/\lambda\) becomes too energetic for the body's thermal kicks to excite), and out pops the correct spectrum:
$$ B(\lambda, T) \;=\; \frac{2hc^{2}}{\lambda^{5}}\;\frac{1}{e^{hc/(\lambda k_B T)}-1}. $$

This was the first appearance of the quantum constant \(h\) in physics. Planck did not believe his own result at first; it took Einstein's photoelectric paper five years later to convince him (and everyone else) that energy quantization was real, not a bookkeeping trick.
Drag the slider to change the temperature. Toggle the Rayleigh–Jeans overlay and watch it launch off the top of the chart at short wavelengths while Planck's curve politely tapers to zero. The vertical dashed line marks Wien's peak \(\lambda_\text{max} = b/T\), which slides toward shorter wavelengths as \(T\) rises.
Wien's displacement law connects the peak of the Planck spectrum to temperature:
$$ \lambda_\text{max} \;=\; \frac{b}{T}, \qquad b = 2.898 \times 10^{-3}\;\mathrm{m\cdot K}. $$

That's why the color of a glowing object tells you its temperature. A room-temperature rock (\(T = 300\) K) has \(\lambda_\text{max} \approx 9.7\;\mu\text{m}\), deep in the thermal infrared — invisible to our eyes. Put it in a forge and it starts to emit visible photons: first a dull red around 900 K, brightening to orange near 1400 K, yellow near 2500 K, and white by 5000 K. Past about 6500 K the peak slides into the blue end of the spectrum and the object looks blue-white, like a hot star.
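The peak positions quoted above come straight out of \(\lambda_\text{max} = b/T\). A quick sketch (the temperatures and labels are the examples from the text; the helper name is illustrative):

```python
# Wien's displacement law: lambda_max = b / T.
b = 2.898e-3  # Wien's constant, m*K

def peak_wavelength_nm(T):
    """Peak of the blackbody spectrum, in nanometres."""
    return b / T * 1e9

for label, T in [("room-temperature rock", 300),
                 ("dull-red forge glow", 900),
                 ("yellow-hot", 2500),
                 ("Sun's surface", 5800)]:
    print(f"{label:22s} T = {T:4d} K -> lambda_max = {peak_wavelength_nm(T):7.0f} nm")
```

At 300 K the peak lands near 9700 nm (thermal infrared), and at 5800 K near 500 nm, in the middle of the visible band.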
Drag the temperature slider below. The rock's surface color is computed from the Planck spectrum at that \(T\) (converted to RGB); it's the same color a real blackbody of that temperature would look to your eye.
Every object radiates and absorbs. Put a small rock in sunlight. The Sun pumps energy into it at some rate \(P_\text{in}\). The rock, at its own temperature \(T\), radiates energy back into space at \(\sigma\varepsilon A T^{4}\). The rock warms up until those two rates balance:
$$ P_\text{in} \;=\; \sigma\,\varepsilon\,A\,T_\text{eq}^{4} \;\;\Rightarrow\;\; T_\text{eq} \;=\; \left(\frac{P_\text{in}}{\sigma\,\varepsilon\,A}\right)^{1/4}. $$

That steady-state temperature \(T_\text{eq}\) is the one at which emission and absorption just cancel. If you boost the Sun's intensity, the rock has to get hotter (since \(P_\text{out}\) is steep in \(T\)) to match it. If you lower the rock's emissivity, it's worse at dumping heat, so again \(T_\text{eq}\) climbs. This is exactly the energy-balance argument behind greenhouse warming, pottery kilns, and the equilibrium temperature of planets.
Real numbers for a sanity check (Giancoli Physics 23, §14–8). At the top of Earth's atmosphere the Sun delivers the solar constant \(S \approx 1350\,\mathrm{W/m^{2}}\); on a clear day about \(1000\,\mathrm{W/m^{2}}\) reaches the ground. An object that faces the Sun at angle \(\theta\) from vertical intercepts \(P_\text{in} = S\,A\cos\theta\). A 0.1 m² rock in full sun at noon (\(\cos\theta \approx 1\)) therefore gets \(\sim 100\,\mathrm{W}\). Try plugging that in below: with \(\varepsilon = 0.9\) and \(A = 0.1\) m², \(T_\text{eq} \approx 370\) K — about right for asphalt on a hot day.
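The worked numbers above can be reproduced in a few lines. A minimal sketch (it assumes, as the text does, that the rock radiates from the same 0.1 m² that absorbs sunlight):

```python
# Radiative equilibrium: T_eq = (P_in / (sigma * eps * A))^(1/4).
SIGMA = 5.67e-8  # W / (m^2 K^4)

def equilibrium_temperature(P_in, emissivity, area):
    """Steady-state temperature (K) at which emission balances absorption."""
    return (P_in / (SIGMA * emissivity * area)) ** 0.25

# The text's example: ~100 W absorbed, eps = 0.9, A = 0.1 m^2.
T_eq = equilibrium_temperature(100.0, 0.9, 0.1)
print(f"T_eq = {T_eq:.0f} K")   # about 374 K

# Because T_eq goes as P_in^(1/4), quadrupling the input power
# only raises the temperature by a factor 4^(1/4), about 1.41:
print(equilibrium_temperature(400.0, 0.9, 0.1) / T_eq)
```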
The animation shows photons streaming from the Sun onto the rock and the rock's own (cooler, redder) re-emitted photons heading back to space. Toggle Let it reach equilibrium to run a slow simulation where the rock's temperature relaxes to \(T_\text{eq}\). Change the Sun's luminosity or the rock's emissivity to see the balance shift.