How Do Digital Cameras Work?

Not so long ago, the digital camera was the poor relation of chemical film. But in recent years, with advancing technology and falling prices, digital cameras have become ubiquitous. Huge numbers of people carry a camera wherever they go, either as part of their phone or as a pocket-size point-and-shoot device. High-end DSLR cameras have overtaken film for professional photography, and film cameras for still photography are becoming obsolete.

At the heart of every digital camera is an image sensor, which converts light transmitted through a lens into an electrical signal that can be stored and later read by a computer, which displays it as a photograph. Various technologies are used in image sensors, but by far the most popular is the charge-coupled device (CCD). A CCD is an array of capacitors that are sensitive to light; when you hear cameras advertised by their resolution, it's the number of these capacitors that is being referred to. As particles of light (photons) strike the capacitors, they free electrons. This builds up a charge that can then be read as a measure of light intensity.
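The photon-to-charge step described above can be sketched as a toy model. The numbers here (quantum efficiency, well capacity) are illustrative assumptions, not the specifications of any real sensor:

```python
# Toy model of one CCD pixel: a "well" that accumulates electrons.
# quantum_efficiency and full_well are assumed values for illustration.

def read_pixel(photons_hit, quantum_efficiency=0.5, full_well=20_000):
    """Convert photons striking one capacitor into a stored charge.

    Each photon has a chance (quantum_efficiency) of freeing an electron;
    the well saturates at full_well electrons, which is why very bright
    highlights "blow out" to pure white.
    """
    electrons = int(photons_hit * quantum_efficiency)
    return min(electrons, full_well)

# Brighter light -> more photons -> larger charge -> higher intensity.
dim, bright = read_pixel(1_000), read_pixel(50_000)
print(dim, bright)  # → 500 20000 (the bright pixel has saturated)
```

Reading out this charge, pixel by pixel, is what turns a pattern of light into numbers a computer can store.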

CCDs don't read the color of the light, just its intensity, so to produce color photographs there must be a way of separating the intensities of the various colors of incoming light. These colors are known as the additive primaries: red, green, and blue. All the colors you see in a digital photograph are built from them. One way to do this, and the most expensive, is to put three CCDs in each camera and use a prism to split the light, directing each color to a different sensor. A less expensive method is to use a colored lattice called a Bayer filter mosaic, which resembles a chessboard with three colors. Each 2 x 2 section of the lattice consists of a pair of green squares diagonally opposite each other, plus one red and one blue square. There are two green squares because of the way the human eye works; it judges intensity mostly from green light. The filter only allows light of one color through to each capacitor, so each capacitor measures the intensity of a single color. The camera, or the computer the pictures are uploaded to, then has to use a demosaicing algorithm to fill in the missing information. Ironically, the Bayer filter was developed at the labs of Eastman Kodak, which has recently filed for bankruptcy because its core business of making film is no longer viable.
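A minimal sketch of the Bayer mosaic and a naive neighbor-averaging demosaic can make this concrete. Real cameras use far more sophisticated interpolation; the tile layout and values below are illustrative assumptions:

```python
# One common Bayer tile arrangement:  G R
#                                     B G

def bayer_color(row, col):
    """Which color filter covers the sensor pixel at (row, col)?"""
    if row % 2 == 0:
        return "G" if col % 2 == 0 else "R"
    return "B" if col % 2 == 0 else "G"

def demosaic_channel(raw, row, col, want):
    """Estimate color `want` at (row, col): use the pixel's own reading if
    its filter matches, otherwise average the nearest pixels of that color."""
    if bayer_color(row, col) == want:
        return raw[row][col]
    h, w = len(raw), len(raw[0])
    samples = [raw[r][c]
               for r in range(max(0, row - 1), min(h, row + 2))
               for c in range(max(0, col - 1), min(w, col + 2))
               if bayer_color(r, c) == want]
    return sum(samples) / len(samples)

# A uniformly lit 4x4 sensor: each capacitor records only its filter's color.
raw = [[{"G": 100, "R": 50, "B": 25}[bayer_color(r, c)] for c in range(4)]
       for r in range(4)]

# Pixel (0, 0) sits under a green filter, so red and blue are interpolated.
print(demosaic_channel(raw, 0, 0, "G"),
      demosaic_channel(raw, 0, 0, "R"),
      demosaic_channel(raw, 0, 0, "B"))  # → 100 50.0 25.0
```

Every pixel in the finished photograph gets all three color values this way, even though the sensor only measured one of them at each site.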

The image quality of a digital camera is affected by more than just the resolution of the sensor. The size of the sensor has an impact too. Smaller cameras, like those in phones and point-and-shoot models, come with smaller sensors, usually around 6 mm across, which results in poor low-light performance. DSLR cameras have much larger sensors, up to 36 mm wide, which produce images better suited to professional photography, with less cropping and a narrower depth of field.
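The gap between those two sensor sizes can be put in numbers. A rough sketch, assuming a full-frame DSLR sensor of 36 x 24 mm and the roughly 6 mm phone sensor mentioned above:

```python
import math

def diagonal_mm(width_mm, height_mm):
    """Diagonal of a rectangular sensor, the usual size comparison."""
    return math.hypot(width_mm, height_mm)

full_frame = diagonal_mm(36, 24)  # "full frame" DSLR sensor, ~43.3 mm
phone = 6.0                       # assumed small-sensor diagonal, per the text

# The "crop factor" compares the two: a larger factor means a smaller
# sensor gathering less light, one reason for poor low-light performance.
print(round(full_frame, 1), round(full_frame / phone, 1))  # → 43.3 7.2
```

A crop factor around 7 means the phone sensor's area is dozens of times smaller, so each pixel receives far fewer photons in dim light.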

As digital photography advances, the marketing battle will shift away from resolution, since there is only so much detail the human eye can discriminate and higher resolutions will bring diminishing returns. Instead, manufacturers will focus on other aspects of camera design, like the mirrorless interchangeable-lens camera, which offers the same quality as a DSLR in a much slimmer body.

Live Science Staff