What Makes HDR Video So Special?
Written by Lynn Orlando
Published on April 17, 2020
Most people can appreciate the art of beautifully implemented cinematography, yet one might argue that those of us who are fans of classic cinema and television are especially attuned to the miracle that high dynamic range (HDR) brings to the modern video viewing experience. HDR delivers whiter whites, blacker blacks, and a wider range of colors in more vibrant hues than ever before. The colors are deeper and brighter, the lines are sharper and more distinct, and the overall digital representation of an object or scene is closer to that which can be seen by the human eye.
The Evolution of HDR
HDR is a style of imaging that captures scenes in a way that closely resembles how they are seen by the human eye. To do so, it achieves a greater tonal range by merging multiple images taken at varying exposure levels. The result is a composite in which both dark and bright details are preserved in a single image. The technique was developed for, and is most associated with, digital photography; however, it has since been adapted to several other technologies and formats.
As early as the 1850s, photographers began toying with the idea of combining different images to increase the luminance range of a single composition. In 1856, French artist Gustave Le Gray began producing a series of seascapes that would earn him world renown. Le Gray used two separate negatives, one for the sky and the other for the sea, combining them to recreate a scene with a dynamic range closer to what we see with our eyes. The process was coined combination printing, but it’s easy to see how this was the genesis for what we now know as HDR.
The next major step toward achieving modern HDR came when photographers began increasing the dynamic range of images through dodging and burning. Ansel Adams was extremely adept at this technique. By dodging, or decreasing the exposure time to lighten darker regions, and burning lighter regions with increased exposure, Adams was able to bring his famous landscapes to life.
What Is Dynamic Range?
To understand the “how” of HDR, let’s first look at “why” it’s so important. It all begins with dynamic range: the range of light, from darkest to brightest, that can be captured in a single image. Dynamic range is measured in stops, similar to f-stops on a camera. Each consecutive stop represents exactly double the amount of light of the previous one.
If you’ve ever taken a picture and wondered why the image looks different from what you saw with your eyes, then you already have some basic experience in observing dynamic range. What we observe in the world with our eyes contains far more detail because our eyes can process a wider range of light than a single camera exposure. In fact, our eyes can cover a dynamic range of roughly 16-20 stops, while a single camera exposure can cover only around 7 stops. HDR was developed to overcome that disparity and capture images closer to the way we actually see them.
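Because each stop doubles the light, the contrast ratio covered by a given dynamic range grows exponentially. A minimal sketch of that arithmetic (the function name here is illustrative, not a standard API):

```python
# Each stop doubles the light, so n stops span a contrast ratio of 2**n.
def contrast_ratio(stops: float) -> float:
    """Return the brightest-to-darkest light ratio covered by `stops` stops."""
    return 2.0 ** stops

# A single camera exposure (~7 stops) vs. the human eye (~16-20 stops):
print(f"camera exposure: {contrast_ratio(7):>12,.0f} : 1")   # 128 : 1
print(f"eye, low end:    {contrast_ratio(16):>12,.0f} : 1")  # 65,536 : 1
print(f"eye, high end:   {contrast_ratio(20):>12,.0f} : 1")  # 1,048,576 : 1
```

In other words, the eye’s roughly 13 extra stops over a single exposure translate to a contrast ratio thousands of times wider, which is why a lone photo so often loses either the shadows or the highlights.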
The Classic HDR Process
We’ve already learned that the HDR process involves merging multiple images taken at differing exposures. The source images are typically captured, in both analog and digital photography, by bracketing exposures. Bracketing means finding the correctly balanced exposure for a photo, then setting the camera to capture additional lighter and darker exposures. The amount of exposure value (in stops) between bracketed shots can also be adjusted. A bracket could read, for example, −2, 0, +2 or −3, −1, 0, +1, +3, with 0 being the balanced exposure and the other numbers corresponding to the stop values of the bracketed shots. Generally, the bracketed shots are symmetric around the middle exposure with regard to stop value.
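With aperture and ISO held fixed, each stop in a bracket simply doubles or halves the exposure time. A small sketch of how a bracket like −2, 0, +2 maps to shutter speeds (the function name and example metering are assumptions for illustration):

```python
# Sketch: turn a bracket of stop offsets into shutter speeds, assuming
# aperture and ISO stay fixed so each stop doubles the exposure time.
def bracket_shutter_speeds(base_seconds: float, offsets: list[float]) -> list[float]:
    """base_seconds is the metered '0' exposure; offsets are stop values."""
    return [base_seconds * (2.0 ** ev) for ev in offsets]

# A metered exposure of 1/125 s with a -2, 0, +2 bracket:
speeds = bracket_shutter_speeds(1 / 125, [-2, 0, 2])
for ev, s in zip([-2, 0, 2], speeds):
    print(f"{ev:+d} EV -> 1/{round(1 / s)} s")
# -2 EV -> 1/500 s,  0 EV -> 1/125 s, +2 EV -> 1/31 s
```

The same doubling logic applies if the photographer brackets with aperture instead, though varying shutter speed is the common choice because it leaves depth of field unchanged between shots.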
After the bracket is complete, the photos can be combined using editing software such as Adobe Photoshop or Lightroom, or any other similar imaging tool. When the bracketed exposures are merged, the dynamic range is extended, filling the image with an amount of detail closer to what our eyes can process and giving us our HDR image. Additional tuning of the develop settings is usually done to get the image just right.
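The merging step can be sketched with a toy exposure-fusion scheme: weight each bracketed pixel by how close it sits to mid-gray, so blown-out highlights and crushed shadows contribute little to the result. This is a minimal illustration, not the actual algorithm Photoshop or Lightroom uses; the function name and toy data are assumptions:

```python
import numpy as np

# Toy exposure fusion: weight pixels by their distance from mid-gray so
# clipped highlights and crushed shadows barely influence the merge.
def merge_exposures(images: list[np.ndarray]) -> np.ndarray:
    """images: same-sized arrays with values in [0, 1]; returns the fused image."""
    stack = np.stack(images)                   # shape: (n_exposures, H, W)
    weights = 1.0 - np.abs(stack - 0.5) * 2.0  # 1 at mid-gray, 0 at the extremes
    weights += 1e-6                            # avoid division by zero
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# A dark, balanced, and bright rendering of the same tiny 2x2 "scene":
dark = np.array([[0.0, 0.1], [0.2, 0.3]])
mid = np.array([[0.1, 0.4], [0.5, 0.7]])
bright = np.array([[0.4, 0.9], [0.95, 1.0]])
merged = merge_exposures([dark, mid, bright])
print(merged.round(3))
```

Real HDR merging is more involved (camera response curves, alignment, deghosting, tone mapping), but the core idea is the same: each exposure contributes the tonal region it captured best.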
As imaging technology advances, quality HDR images are easier to achieve. Algorithms enable cameras and editing software to process HDR images instantly. One of the major selling points for the newest generations of both iPhone and Galaxy smartphones is the multi-lens camera system that promises pictures “closer to eye sight.” Most new smartphones have cameras with HDR settings that increase detail through extended dynamic range.
HDR video is the newest and most relevant iteration of HDR-related technology at the moment. The basic concept carries over from still photography: a much wider range of luminance is included. The biggest and most important difference, however, is that HDR video also includes a wider range of color data. The resulting video is considerably more brilliant and vibrant than standard dynamic range video, even at 4K resolution (4K describes pixel count, not dynamic range). HDR video is also the format best positioned to keep up with the rapidly advancing brightness and color capabilities of new TVs and displays. HDR10, Dolby Vision, and HDR10+ (Samsung) are the most commonly used HDR video formats.
An increased dynamic range is always going to be the central concept behind HDR technology. The method by which the dynamic range is increased can change, but the desired result will always be the same: a wider range, more light and dark data, a more vibrant and defined image, and a more realistic viewing experience.