It looked like Mars, or the Southern Californian wasteland in Blade Runner 2049, or the deserts of Dune. Almost 100 wildfires have ravaged the western United States in the past month, scattering particles of ash and smoke into the air and forcing 500,000 people to evacuate their homes in Oregon alone. On Wednesday, residents across the West, already suffering from a pandemic, economic collapse, wildfires, and dangerously bad air quality, woke up to a dark, bronzed sky that nearly shut out all daylight. As the day wore on, the smoke thickened and receded, making the city seem red at some hours, amber at others. Masks to ward off the coronavirus now served double duty.

But as people tried to capture the scene, and the confusion and horror that accompanied it, many noticed a strange phenomenon: Certain photographs and videos of the surreal, orange sky seemed to wash it out, as if to erase the danger. “I didn’t filter these,” tweeted the journalist Sarah Frier, posting photos she took of San Francisco’s haunting morning sky. “In fact the iPhone color corrected the sky to make it look less scary. Imagine more orange.” The photos looked vaguely marigold in hue, but not too different from a misty sunrise in a city prone to fog. In some cases, the scene seemed to revert to a neutral gray, as if the smartphones that captured the pictures were engaged in a conspiracy to silence this latest cataclysm.

The reality is both less and more unnerving. The un-oranged images were caused by one of the most basic features of digital cameras: their ability to infer the colors in an image from the lighting conditions under which it was taken. Like the people looking up at it, the software never expected the sky to be bathed in orange. It’s a reminder that even as cameras have become a way to document every aspect of our lives, they aren’t windows on the world, but simply machines that turn views of that world into images.


Before digital cameras, film set the look of a photograph. But when digital photography arrived decades ago, color had to be re-created from scratch. Camera sensors are color-blind—they see only brightness, and engineers had to trick them into reproducing color using algorithms. A process called “white balance” replaced the chemical color rendering of film. Most cameras now adjust the white balance on their own, attempting to discern which objects in a photo ought to look white by compensating for an excess of warm or cool colors. But automatic white balance isn’t terribly reliable. If you’ve tried to take a smartphone photo of a scene with multiple types of light, such as a city sunset, you’ve probably watched the image change tones from redder to bluer as you frame or reframe it. The device struggles to figure out which subject should look white, and which exposure (the amount of light to capture) might best represent it.
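
To make the mechanism concrete, here is a minimal sketch of the “gray world” heuristic that many automatic white-balance routines are built on: it assumes the average color of a scene should come out neutral gray. The NumPy implementation, function name, and value ranges are illustrative assumptions, not how any particular phone does it.

```python
import numpy as np

def gray_world_white_balance(rgb: np.ndarray) -> np.ndarray:
    """Toy gray-world auto white balance.

    Expects an H x W x 3 array of floats in [0, 1]. The heuristic: the
    average color of a typical scene should be neutral, so any channel
    whose mean drifts from the overall mean is treated as a color cast
    from the light source and scaled back toward gray.
    """
    channel_means = rgb.reshape(-1, 3).mean(axis=0)        # mean R, G, B
    gray_target = channel_means.mean()                      # the "neutral" level
    gains = gray_target / np.maximum(channel_means, 1e-6)   # per-channel correction
    return np.clip(rgb * gains, 0.0, 1.0)
```

Point that arithmetic at a frame in which the scene itself is orange and it behaves exactly as designed: it reads the strong red-channel average as a cast to be removed and scales it down, which is roughly the washing-out people saw.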

Under the blood-red San Francisco sky, white balance had no reference against which to calibrate. Because everything was tinted red, the software treated the tint as a cast from the light source and shifted the whole scene back toward neutral. People felt confused, or even betrayed, when their phone cameras transformed the tiger sky into images that washed out the orange, or in some cases made it look mostly gray, like an overcast day.

When people started to figure out what was going on, they downloaded apps allowing them to set the white balance on their own. “Here’s what it really looks like out there in San Francisco,” Frier tweeted alongside revised versions of her earlier, viral images. But that’s not what’s really going on, either. You can’t ever “turn off” color correction in a digital camera, because its sensor doesn’t see color in the first place. Color is always constructed in a picture, never simply reproduced.
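
To see why there is nothing “underneath” the correction, consider how a sensor records a scene: each photosite measures only brightness behind one colored filter. The toy reconstruction below assumes the common RGGB Bayer layout and is only a sketch; real demosaicing is far more elaborate.

```python
import numpy as np

def naive_demosaic(mosaic: np.ndarray) -> np.ndarray:
    """Rebuild color from a single-channel RGGB Bayer mosaic.

    The sensor delivers one brightness value per photosite; color only
    exists after software combines neighboring photosites. Here every
    2x2 block (R G / G B) collapses into one RGB pixel. Assumes the
    mosaic's height and width are even.
    """
    r = mosaic[0::2, 0::2]                               # red-filtered sites
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0  # average the two greens
    b = mosaic[1::2, 1::2]                               # blue-filtered sites
    return np.stack([r, g, b], axis=-1)
```

Every color photograph from a digital camera has been through some version of that step, plus white balance, tone mapping, and more, before anyone sees it.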

The same was true of film cameras: Different stocks of film and development processes had their own renditions of color. Kodak Portra sought balanced skin tones, Fuji Velvia aimed for vibrancy, and ordinary color film was balanced for the tone of outdoor light (photographers call it color temperature; blame physics). That could make indoor photos look unnaturally yellow, but most people didn’t notice. A snapshot was a memory, and the colors would seem true enough days or weeks later, when you finally held it in your hands.

Today, some cameras and apps allow a user to choose a white-balance preset, such as “daylight.” But despite the seemingly descriptive name, the setting is really just a way for the camera to apply a specific color temperature, not a surefire way to make daytime images look right. Others have sliders that let a user dial in a color temperature directly, choosing an appearance that matches an ideal rather than a measurement. That’s not duplicitous—it’s what photography has always done.
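
A preset, in other words, is just a fixed correction chosen ahead of time instead of estimated from the scene. The sketch below shows the idea; the gain values and preset names are placeholders invented for illustration, not any camera’s actual numbers.

```python
import numpy as np

# Hypothetical per-channel gains standing in for white-balance presets.
# Real cameras derive gains from a color-temperature model of the light;
# these values exist only to show the mechanism.
PRESETS = {
    "daylight": np.array([1.00, 1.00, 1.00]),  # treat ~5500 K light as neutral
    "tungsten": np.array([0.75, 1.00, 1.45]),  # cool down warm indoor light
    "cloudy":   np.array([1.15, 1.00, 0.85]),  # warm up bluish overcast light
}

def apply_preset(rgb: np.ndarray, preset: str) -> np.ndarray:
    """Apply fixed gains instead of estimating white balance from the image."""
    return np.clip(rgb * PRESETS[preset], 0.0, 1.0)
```

Because a fixed preset never consults the image, under an orange sky it leaves the orange alone rather than correcting it away.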

I don’t live in a place whose sky is flushed by fire, so I asked the author Robin Sloan, who lives in Oakland, California, to take photos illustrating the phenomenon. The image on the left, below, is from the iOS camera. The one on the right was taken with the Halide app, which lets you change exposure settings manually, including white balance.

(Photo: Robin Sloan)

“I would say the reality is about halfway between them,” Sloan told me. He also shared another image taken with a Sony camera set to “daylight” white balance, which made the scene look much stranger than in person. The high contrast in that image, which appears below, tricks the eye into thinking the orange is brighter.

(Photo: Robin Sloan)

For Californians gawking at their fiery sky, an image might never be able to capture the embodied sensation of life beneath it. The smoke would have been moving in concert with the dynamics of the air, for example, causing the apparent colors to shift and dance in person. That phenomenon might be impossible to capture fully in a still image, or even a video. Likewise, the eerie claustrophobia of being surrounded by pure orange wouldn’t translate to a screen, much like a James Turrell installation looks less impressive photographed than in person. The images going viral on social media are evocative. But are they real? No, and yes.


Blaming cameras for their failures, or mounting a “that’s just how photography works” defense of them, can be tempting. But images and videos have never captured the world as it really is—they simply create a new understanding of that world from the light that objects emit and reflect.

People who practice photography as a craft think of their work as a collaboration with materials and equipment. They “make” images—they don’t “capture” them—much as an artist creates a painting with canvas, pigment, and medium, or a chef creates a meal with protein, vegetables, fat, and salt. But the equipment has become invisible to the rest of us—a window that steals part of the world and puts it inside our smartphones.

The irony is that software now manipulates images more than ever. Today’s smartphones do enormous amounts of processing beyond automatically adjusting white balance. Features such as “Portrait” mode, high dynamic range (HDR), and low-light capability strive to invent new styles of pictures. And then there are the filters—whose very name was borrowed from the optical instruments used to color-correct film. They make it plainly obvious that images are always manipulated. And yet, somehow, filters further entrenched the idea that images bear truth. An Instagram post tagged #nofilter makes an implicit claim against artifice: This is how it really looked. Yet there is no such thing as an unfiltered image, just differently filtered ones.

People have become more aware of the risks of accepting a computer’s account at face value. Computer-vision systems can exhibit racial and gender bias, for example, and those flawed assumptions can produce grave consequences when used to automate hiring or policing. Phone cameras failed to capture the scene amid the fires because their software rested on a once reasonable assumption: that images taken in daylight share roughly similar colors.

Nobody expected the noon sky to be orange, and even supposedly sophisticated equipment struggles to make sense of it. Maybe the current apocalypse in the West will abate and cameras will feel normal again. But maybe it won’t, and the equipment people use to account for their world will need adjustment or replacement. Everything is falling apart, it seems, even the sensors and software that run your camera.
