Modern Photography/Digital processing

The image sensor of a digital camera replaces traditional analog film as the camera's recording medium.

The sensor is often a permanent part of the camera, paired with an image processor that generates digital image data from captured input. The model of camera body chosen determines what sensor and processor are installed.

Comparison to film

Advantages

Lower operating cost
Less expensive to operate, with no continuing costs for film purchase and development.
Essentially unlimited storage options
Widespread availability of cost-effective, light and portable digital storage today.
Color shooting
Almost always captures in color, unlike film stocks, which were split between color and black-and-white.
Versatility
More versatile shooting, allowing the photographer to change settings between every shot without physically ejecting and replacing a roll of film.
Greater sensitivity range
Far more sensitive than commercially available film, with ISO ratings of 12800 not uncommon. (At the close of the 20th century, ISO 800 film was hard to find in much of the world; ISO 400 was typically the highest acquirable sensitivity, and sometimes only ISO 200.)

Disadvantages

Burn-out
There is a very strong tendency for bright areas such as sun-lit clouds to 'burn out' and become a solid block of perfect white, resulting in a loss of detail.
Lower dynamic range
The difference between the darkest shade before black and the brightest shade before white that can be recorded within a scene is known as the dynamic range. Film generally surpasses digital image sensors in this regard, though sensors are rapidly improving.
Susceptible to dust and damage
As the sensor is not changed between shots, dust particles may be visible across an entire batch of shots. Damage to individual pixels—whether from manufacturing defects, physical trauma, or overexposure—is permanent.
Require power and storage
While a film camera can be operated manually, with each image recorded on the film itself, image sensors require a source of power and a storage location for captured images.
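The dynamic range item above can be made concrete: photographers measure dynamic range in "stops", each stop representing a doubling of light. The sketch below assumes only that definition; the 4096:1 contrast ratio is an illustrative figure, not a measurement of any particular sensor or film.

```python
# Dynamic range expressed in stops: each stop is a doubling of light,
# so the stop count is the base-2 logarithm of the contrast ratio
# between the brightest and darkest recordable shades.
import math

def dynamic_range_stops(brightest, darkest):
    """Number of stops between the darkest and brightest recordable values."""
    return math.log2(brightest / darkest)

# An illustrative sensor spanning a 4096:1 contrast ratio covers 12 stops.
print(dynamic_range_stops(4096, 1))
```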

Format

Sensors come in a wide range of formats—the physical size of the sensor—from 4mm sensors used in smartphones to 8″×10″ sensors for large-format cameras. Camera models based on 35mm designs often come in either a "full frame" format of 36×24mm, or a "cropped frame" format of roughly 24×16mm. Compact cameras may have formats stated in confusing fractions such as 1/2.5″, all of which are much smaller than "full frame". Because the sensor is often a permanent feature of the camera, this limits the availability of other features such as the image processor and interchangeable lenses.

As with film formats, the sensor format directly affects the size of lenses that can be used with it, and the resulting field of view. Cameras can be miniaturized by pairing a small sensor with a lens of a small focal length, yet can still produce wide angle or telephoto shots. The larger lenses available for larger sensors offer advantages in fidelity, color rendition, range of focal lengths, and reduction of various distortions, but often at greater cost and weight. The lenses available for use on a camera are determined entirely by the lens mount on the camera body, not by the type of image sensor.

Photosites and pixels

A macro photo of a digital camera sensor.

The sensor is divided into millions of photosites, each one responsible for capturing light from the scene. Photosites are analogous to film grains, with larger ones offering increased sensitivity and dynamic range, at a cost of having fewer of them. Smaller photosites allow for more detailed images, but with increased image noise and vulnerability to damage. Unlike film grains, photosites are arranged in a rectangular grid.

The size of each photosite is consistent across the sensor, and correlates with how sharp detail in the image can be and with the depth of field created by the lens. A point of light focused by the lens need only be as small as a single photosite in order to be rendered sharply; finer detail cannot be resolved.

The image processor translates the sensor values into the pixels (picture elements) that form a digital image. A camera's pixel count is usually determined by the number of photosites. While it may seem logical for each photosite to directly register values for each pixel, this is not necessarily the case for all cameras.
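One common reason photosites do not map one-to-one onto pixel values is that most sensors place a color filter mosaic (typically the Bayer pattern) over the photosites, so each photosite records only one of red, green, or blue, and the image processor interpolates ("demosaics") the missing channels. The sketch below is a deliberately crude stand-in for real demosaicing algorithms, shown only to illustrate the idea; the function names are illustrative, not any camera's API.

```python
# A Bayer sensor records one color per photosite in a repeating RGGB
# mosaic; full-color pixels are interpolated from neighbors.

def bayer_color(row, col):
    """Color captured by the photosite at (row, col) in an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def naive_demosaic_block(block):
    """Average one 2x2 RGGB block of raw values into a single RGB pixel.
    Real processors interpolate per photosite (so pixel count can match
    photosite count); this crude block average halves resolution instead."""
    r = block[0][0]
    g = (block[0][1] + block[1][0]) / 2  # two green photosites per block
    b = block[1][1]
    return (r, g, b)

print(bayer_color(0, 0), bayer_color(0, 1), bayer_color(1, 1))
print(naive_demosaic_block([[200, 120], [130, 90]]))
```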

Pixel count alone is not indicative of image quality—cheap cameras may have fewer photosites than pixels, sensor format does not correlate to size or count of photosites or pixels, and smaller format sensors have disadvantages stated above. Despite this, manufacturers often state a pixel count in lieu of a sensor size or photosite count, and only professional models might state the size of the photosites.

Auxiliary sensors

The image sensor may contain sensors for other purposes.

Exposure sensors calculate how much light is present in the scene. They help determine settings for programmed/automatic exposure, or inform the shooter whether manual exposure settings will result in the desired image.

Focus sensors help determine whether the lens is properly focused on the subject. The shooter may be able to choose from different focus targets to allow the subject to be placed in different parts of the scene.

On DSLR cameras, the image sensor is blocked by the viewfinder mirror until the shot is taken. The auxiliary sensors will usually be placed elsewhere on the camera body, with a second mirror diverting light from the scene.

On mirrorless cameras, and DSLR cameras used in "Live preview" mode, the image sensor is exposed, and so the auxiliary sensors may be integrated into the image sensor itself. Less sophisticated cameras may use the actual image data instead, which reduces cost but lacks the utility of dedicated sensors.

Controls

Unlike film, the image sensor can be configured for different shooting situations.

The image sensor directly controls sensitivity, one of the three main exposure controls. (Shutter speed is controlled by the camera body, and aperture is controlled by the lens. The shooter may also have control over how much light is present in the scene.) Sensitivity, often given as an "ISO number", indicates how much light must be captured before the sensor is considered fully exposed. Sensitivity is indirectly related to image noise: unwanted variations in color and brightness. Higher sensitivity (corresponding to a higher ISO number) allows for quicker exposure but risks increased noise; lower sensitivity requires more time but reduces noise.
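The reciprocal relationship described above can be sketched numerically: doubling the ISO number raises sensitivity one stop, halving the exposure time needed for the same scene brightness and aperture. The helper below is illustrative only and ignores real-world reciprocity limits.

```python
# Keeping exposure constant while changing ISO: exposure time scales
# inversely with the ISO number (each doubling of ISO is one stop).

def exposure_time(base_time_s, base_iso, new_iso):
    """Shutter time preserving the same exposure when ISO changes."""
    return base_time_s * (base_iso / new_iso)

# 1/25s at ISO 100 becomes 1/100s at ISO 400 (two stops faster),
# at the cost of increased image noise.
print(exposure_time(1 / 25, 100, 400))
```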

The sensor can also adjust white balance along a wide range of "color temperature" along an amber–blue axis, as opposed to film, which was typically offered only in "daylight" or "incandescent" varieties. Various light sources can have strong color casts. Human vision compensates for these casts, while sensors must either interpret the scene or receive human input. Mismatched white balance may cause, for example, a shaded subject to appear blue due to light from the sky, or a candle-lit subject to appear amber. The sensor may also adjust for "tint" along a magenta–green axis. Fluorescent lighting can occasionally be strongly tinted green.
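One simple way a processor can "interpret the scene" for automatic white balance is the gray-world heuristic: assume the scene averages to neutral gray and scale each channel until the averages match. This is a textbook heuristic, not how any particular camera works, and the function below is purely illustrative.

```python
# Gray-world automatic white balance: estimate per-channel gains that
# neutralize a color cast, assuming the scene averages to gray.

def gray_world_gains(pixels):
    """Per-channel correction gains from a list of (r, g, b) pixels."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3
    return [gray / channel_avg for channel_avg in avg]

# A scene under warm (amber) light: red average high, blue average low.
# The correction cuts red (gain < 1) and boosts blue (gain > 1).
warm_scene = [(200, 150, 100), (180, 140, 110)]
print([round(g, 2) for g in gray_world_gains(warm_scene)])
```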

Some camera models offer special shooting modes:

  • Burst shooting allows a quick sequence of still images to be captured.
  • Bracketing allows exposure settings to be quickly changed during a burst, allowing for the creation of high dynamic range images.
  • Video modes allow for the capture of video instead of still images.
  • Bulb shooting allows for the sensor to be exposed for long periods of time (multiple seconds to multiple hours). The sensor must be protected from overexposure for bulb shots taken in daylight.
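Bracketing during a burst can be sketched as stepping the shutter time in whole stops around a base exposure, yielding darker, normal, and brighter frames that HDR software can merge. The stop offsets below are a common default chosen for illustration.

```python
# Exposure bracketing: each frame's shutter time is the base time
# shifted by a whole number of stops (powers of two).

def bracket_times(base_time_s, stops=(-2, 0, 2)):
    """Shutter times for a bracketed burst around a base exposure."""
    return [base_time_s * 2 ** s for s in stops]

# A 1/60s base exposure bracketed at -2/0/+2 stops:
# roughly 1/240s, 1/60s, and 1/15s.
print(bracket_times(1 / 60))
```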

Image mode and color space

Image sensors generally capture three channels of color—red, green, and blue—matching what human eyes are capable of seeing. Invisible wavelengths such as infrared and ultraviolet are typically filtered out to prevent the sensor from misinterpreting them as visible light. Monochrome images can be computed from 3-channel input by the image processor or image editing software. Some specialty cameras may capture a fourth color channel for additional color fidelity, a single channel for reduced cost, or a single channel with increased precision.

Currently available image processors generate 8–16 bits of color depth—the number of steps in between a completely dark pixel and a completely exposed one. As most monitors and popular image formats are limited to 8 bits per channel (24-bit color, sometimes erroneously called 32-bit color), many cameras offer JPEG output, a widely used compressed file format that allows the photograph to be shared immediately.
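The difference bit depth makes can be shown with a little arithmetic: an n-bit channel distinguishes 2^n steps between fully dark and fully exposed, and converting high-bit-depth sensor data down to 8 bits for JPEG discards the finer gradations. The conversion below is a simple linear rescaling for illustration; real processors apply tone curves first.

```python
# Channel bit depth determines how many distinguishable steps exist
# between a completely dark value and a completely exposed one.

def levels(bits):
    """Number of distinct values an n-bit channel can represent."""
    return 2 ** bits

def to_8bit(value, source_bits):
    """Linearly rescale a value from an n-bit channel down to 8 bits."""
    return value * 255 // (2 ** source_bits - 1)

print(levels(8), levels(14))   # 256 steps vs 16384 steps
print(to_8bit(8192, 14))       # a 14-bit mid-gray lands near 127
```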

Advanced cameras will generally offer more bit depth, which gives an advantage in editing precision and dynamic range. Such cameras can produce a raw format image—the raw, unaltered data as captured from the sensor. As raw format is often unique to a specific image processor model, the user will need to install a "codec" that allows their computer operating system to interpret the data, and suitable image editing software that is capable of working with larger bit depths and exporting more common file formats.

To ensure colors are consistent across all sorts of viewing devices, a color profile is assigned to each image. This will often be the sRGB color space, widely used by monitors and operating systems. Alternative color spaces can describe a wider range of color than can actually be viewed on a monitor, but which still carry advantages in color rendition for the professional photographer.
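Part of what a color profile specifies is a transfer function: sensor data is linear in light, while sRGB applies a nonlinear curve that devotes more of its 8-bit steps to shadows, where the eye is most discriminating. The function below implements the sRGB encoding curve from the published standard (IEC 61966-2-1); the surrounding scaffolding is illustrative.

```python
# The sRGB transfer function: converts a linear light value in [0, 1]
# to the nonlinear sRGB-encoded value in [0, 1]. Constants are from
# the sRGB standard (IEC 61966-2-1): a linear toe segment for very
# dark values, then a gamma-like power curve.

def linear_to_srgb(u):
    """Encode one linear channel value (0.0-1.0) into sRGB (0.0-1.0)."""
    if u <= 0.0031308:
        return 12.92 * u
    return 1.055 * u ** (1 / 2.4) - 0.055

# Midtones are lifted by the curve: linear 0.5 encodes to roughly 0.74,
# so more 8-bit steps are spent on shadows than on highlights.
print(round(linear_to_srgb(0.5), 3))
```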

Output

As mentioned before, image processors can output either JPEG or raw format images, and some professional models can output both.

The images must be transferred either to storage media such as a memory card or drive, or to a separate storage device via USB, a wired Ethernet network connection, or a wireless network connection such as Wi-Fi or Bluetooth. Otherwise, the camera may refuse to capture an image, or may simply purge the image after another shot is taken.