A color revolution we will actually notice: making sense of HDR and why you should give it a try


Hi all! It just so happens that the modern industry rarely settles on a single standard, and even when it does, individual companies often bolt their own proprietary crutches onto simple, well-understood common technologies. The last major "format war" was HD DVD vs. Blu-ray, which Blu-ray won. Now a new round of technology has rolled out new "guns": one of the main TV trends of 2016 is support for HDR (high dynamic range).

What it is, how it works, how it differs from the HDR used in photography and games, and which formats are fighting it out this time: that's what today's post is about.

HDR - what is it and why is it necessary?

HDR (High dynamic range) is a technology for expanding the dynamic range of an image.
The human eye adapts to the level of ambient light. When a photo or video is shot, the parameters of the optical system (the sensitivity of the film or sensor, the aperture and the shutter speed) determine how the frame is formed, and the recorded information covers a certain range of brightness: from what is conventionally treated as "black" (which may not be truly black; there simply was not enough reflected light to register a different shade) to what is treated as "white" (likewise, an object can reflect or emit more light than the sensor can capture at the current settings without objectively being white). The difference between the brightest and the darkest recordable shades is the dynamic range in question. If the scene has a greater dynamic range than the equipment can capture, the picture ends up with blown-out highlights and/or crushed shadows, that is, with lost brightness and color information.
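
To put a number on it, dynamic range is usually expressed as the ratio between the brightest and the darkest luminance that can be captured or shown, often counted in "stops" (each stop is a doubling). Here is a minimal sketch of that arithmetic; the luminance figures are illustrative examples, not measurements of any specific device:

```python
import math

def dynamic_range_stops(max_luminance_nits: float, min_luminance_nits: float) -> float:
    """Dynamic range as the base-2 logarithm of the max/min luminance ratio ('stops')."""
    return math.log2(max_luminance_nits / min_luminance_nits)

# Illustrative numbers only: a modest SDR-class panel vs. an HDR-grade panel.
print(dynamic_range_stops(300, 0.3))    # ~10 stops
print(dynamic_range_stops(1000, 0.05))  # ~14.3 stops
```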

Increasing the dynamic range widens the span of brightness the camera can perceive, but at the same time it reduces the apparent contrast of the image, so various local-contrast enhancement and tone-mapping algorithms are used to restore it. But if the problem of converting an extended dynamic range down to a normal one is already solved at the recording stage, why is HDR needed on TVs, and how does it work there?

A real-life example

Whenever a new format appears, a reasonable question is whether there is enough HDR content to justify buying a new TV. There is! First, the format is developing actively, and there will be far more of it in the future. Second, many modern gadgets (cameras and smartphones) can already shoot content in 4K HDR. Channels and services such as Amazon, Netflix and HBO are shooting their own series in 4K HDR, Amazon has launched streaming of films in this format for Prime subscribers, and HDR content will be available on Netflix in the near future, including in Russia. And if that still does not seem like enough, there is another bonus: on Sony XD93 series TVs, even an ordinary HD signal can be upscaled toward 4K HDR.

These TVs use the 4K processor X1™, designed to process a 4K HDR signal while simultaneously enhancing the detail of the image from any source, even one far from 4K quality: TV broadcasts, DVD and Blu-ray discs, online video and digital photographs. Of course, the processor cannot turn a standard-definition image into true 4K, but it can improve it significantly. It optimizes texture, sharpness and color while reducing noise: the system upscales every pixel, analyzing individual parts of each frame and matching them against a database of tens of thousands of reference images collected by Sony over years of film and TV production. This processing can correct and improve even a blurry picture. Last but not least, the effect comes from the XD93 series using 14-bit signal processing, which accordingly provides 14-bit color gradation even if the input was a standard 8-bit signal. As a result, the XD93 series works with 64 times more color levels than conventional TV displays, an impressive difference: a regular 8-bit signal provides 256 levels per channel, while a 14-bit signal provides 16,384. That bit depth translates into more than 4 trillion possible shades of color.
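
The arithmetic behind these figures is straightforward: an n-bit channel encodes 2^n gradations, and the total number of representable shades is that value cubed (one factor for each of R, G and B). A quick sketch that reproduces the numbers above:

```python
def levels(bits: int) -> int:
    """Number of gradations a single n-bit color channel can encode."""
    return 2 ** bits

for bits in (8, 10, 14):
    per_channel = levels(bits)
    total = per_channel ** 3  # one factor per R, G and B component
    print(f"{bits}-bit: {per_channel:,} levels per channel, {total:,} shades in total")

# 8-bit:  256 levels    -> ~16.8 million shades
# 10-bit: 1,024 levels  -> ~1.07 billion shades
# 14-bit: 16,384 levels -> ~4.4 trillion shades (and 16,384 / 256 = 64x more levels)
```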

At first glance, 14-bit gradation looks excessive: the eye, by and large, does not need that many shades and simply cannot distinguish them all. For a television image, however, it matters. Shades are distributed unevenly in a moving picture: most of them end up in brightly lit areas and far fewer in dark ones, while the human eye, evolutionarily, is more sensitive to shadows than to bright areas. That is a contradiction, and here the higher bit depth wins by sheer numbers: however uneven the distribution, there end up being 4 times more shades in the shadows, and those areas no longer look flat to the brain.

HDR on TV

TV HDR does essentially the same thing: it increases the number of luminance values allowed between the darkest darks and the brightest highlights, so that a properly encoded video stream can show more distinct tones within a frame.
Overall, the picture becomes more detailed, brighter and clearer thanks to increased micro-contrast in the "borderline" zones that, with a regular video stream on a regular TV, would look flat or low-contrast and would be lost entirely in motion. HDR support has to be present at several points in the chain at once. One: the TV must be able to decode and display such a signal. Two: the content must be encoded accordingly and carry the extra information. Three: the signal "transport" (the interface and cable) must handle the increased bitrate. And of course the content has to be shot, distributed and so on in HDR in the first place. Compatibility across all this diversity is supposed (in theory) to be ensured by the Rec. 2020 family of recommendations. Let's sort it all out in order.

TV

No, nobody is going to put true 30-bit panels into TVs; the matrices will remain 18- or 24-bit, and HDR content will be converted to a "normal" signal before it is displayed.
The point is that the new way of recording the color signal uses special transformations to encode the brightness component with a larger number of bits. A dedicated algorithm then converts the original signal into one "suitable" for the TV, which is how the picture gains detail without losing contrast. And here comes the first problem: there are two HDR TV standards, HDR10 and Dolby Vision. Samsung and Sony have bet on HDR10; most Hollywood studios and a number of other companies have bet on Dolby Vision, and the Dolby brand itself is far better known. Modern content providers have settled the issue in their own way: Netflix and Amazon simply support both formats, so you can choose in which form to receive HDR movies. LG decided not to take part in the format war at all and provides support for both technologies.
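
For the curious: the "special transformation" that both HDR10 and Dolby Vision build their brightness encoding on is the SMPTE ST 2084 perceptual quantizer (PQ). It maps absolute luminance, up to a nominal 10,000 nits, onto code values non-linearly, so the available bits are spent where the eye is most sensitive. A minimal sketch of the PQ encoding curve, with the constants as defined in the standard:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> normalized code value.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(luminance_nits: float) -> float:
    """Map absolute luminance (0..10000 nits) to a normalized PQ signal value (0..1)."""
    y = max(luminance_nits, 0.0) / 10000.0
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# Note how non-linear the allocation is: 100 nits (a typical SDR peak white)
# already lands around half of the code range.
for nits in (0.1, 1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> {pq_encode(nits):.3f}")
```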

Content and media

Supporting HDR video requires an appropriate recording format. Among the "old" codecs, H.264 can carry Rec. 2020 material (with some restrictions); among the "new" ones there is HEVC, also known as H.265. Naturally, the "extra" information takes up more space, so 4K HDR content is, for now, delivered mainly online. 4K Blu-ray exists, but the only (at the time of writing) widely available and relatively easy-to-buy player for such discs is made by Samsung, and we already know which format that industrial giant has bet on. Technically nothing prevents Dolby Vision support from being added to 4K Blu-ray, but for the moment we have what we have.

Data transfer

So we have the content and we have the TV. All that remains is to get the content to the TV. Here, for now, HDMI reigns supreme; TVs with DisplayPort support are few and far between. HDMI 2.0 theoretically supports 4K video with HDR, but with limitations: 24/25/30 fps for full color with 12-bit sampling, and 50/60 fps only with 4:2:2 or 4:2:0 chroma subsampling. At the moment HDMI 2.0 is the bottleneck of TV technology, and there is no point in talking about a real breakthrough in picture quality until a standard arrives that supports "honest" 4K at 120 Hz or 8K at 60 Hz.
HDMI's problems lie in the area of backward compatibility: HDMI allows you to transmit a signal compatible even with an analog black-and-white TV, which imposes certain restrictions on how the standard can evolve.
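
A rough back-of-the-envelope calculation shows where that ceiling comes from. HDMI 2.0 offers 18 Gbit/s, of which about 80% is usable for video because of 8b/10b coding, while the standard 4K timing (including blanking) is 4400 x 2250 pixels per frame. A simplified sketch under those assumptions (real HDMI accounting has more overheads than this):

```python
# Simplified HDMI 2.0 feasibility check: does a given 4K mode fit into the link budget?
LINK_GBPS = 18.0 * 0.8          # 18 Gbit/s raw; 8b/10b coding leaves ~14.4 Gbit/s for video
TOTAL_PIXELS_4K = 4400 * 2250   # CTA-861 4K frame including blanking intervals

def fits(fps: int, bits_per_pixel: int) -> bool:
    """True if the mode's raw data rate fits into the simplified HDMI 2.0 video budget."""
    gbps = TOTAL_PIXELS_4K * fps * bits_per_pixel / 1e9
    return gbps <= LINK_GBPS

print(fits(60, 3 * 8))    # True:  4K60 4:4:4  8-bit  (~14.3 Gbit/s)
print(fits(60, 3 * 12))   # False: 4K60 4:4:4 12-bit  (~21.4 Gbit/s)
print(fits(60, 2 * 12))   # True:  4K60 4:2:2 12-bit  (~14.3 Gbit/s, chroma halved)
print(fits(30, 3 * 12))   # True:  4K30 4:4:4 12-bit  (~10.7 Gbit/s)
```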

Color improvements

The next key parameter for HDR video is improved color. This applies to all the major HDR standards: every new 4K UHD TV with wide dynamic range must have 10-bit color processing. That already qualifies as deep color: instead of 256 gradations per RGB channel (red, green, blue), it offers 1,024 shades of each.

That works out to roughly 1.07 billion colors in total instead of the 16.7 million offered by older 8-bit TVs, so transitions between shades and tones on screen look far more realistic to the viewer.

With the HDR TV standards themselves, things are a little more complicated. A 4K HDR TV does not actually have to display every color of a 10-bit signal natively; it just has to be able to process the signal and build its image from that information. Call it indirect "10-bit color support".

In addition, the HDR TV standards require handling so-called P3 color (a color space from SMPTE that approximates the palette of color film), or at least 90% of it. P3 covers a noticeably wider slice of the visible spectrum than the Rec.709 gamut used by earlier 4K TV models.
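
To get a feel for the difference, you can compare the triangles the two gamuts span on the CIE 1931 xy chromaticity diagram, using the commonly published primaries. This is a crude two-dimensional comparison (it ignores brightness and the full color volume), but it shows the relative sizes:

```python
# Rec.709 vs. DCI-P3 gamut triangles on the CIE 1931 xy chromaticity diagram.
# Points are the commonly published (x, y) coordinates of the R, G, B primaries.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def triangle_area(p):
    """Area of the triangle spanned by three (x, y) points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, ap3 = triangle_area(REC709), triangle_area(DCI_P3)
print(f"Rec.709: {a709:.4f}  DCI-P3: {ap3:.4f}  ratio: {ap3 / a709:.2f}x")
# P3 spans roughly a third more of the xy diagram than Rec.709.
```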

Together, the parameters above make up what manufacturers call "Wide Color Gamut": broader coverage of the color spectrum and smoother transitions between all the shades within it. It is a further part of building a highly realistic display with high-quality HDR.

What to choose

If you're buying a TV in 2016 and want to get the most out of it, it makes sense to look at models with HDR support. For some reason manufacturers are shy about writing "HDR Ready" on the TVs themselves, so in the store you need to look for the Ultra HD Premium label. Samsung has such TVs in its 2016 9000 series; LG, for example, has its flagship OLEDs such as the 65EG6.

Among disc players there are two options on the market: the Samsung UBD-K8500 (with HDR10 support) and the Panasonic DMP-UB900. Foreign reviewers strongly recommend the Panasonic: it has better picture quality and should (in theory) support Dolby Vision, but for now it only works with the discs that are actually available, and those are mastered in HDR10.


In practice, all of these devices cost... a lot, and until HDR video formats become mainstream (and "trickle down" to 1080p resolution), prices are unlikely to fall or these features to appear in budget TV lines.

Purely theoretically, the relatively affordable Samsung KJ-7500 should also handle HDR (albeit without 4K playback, only Full HD), but we have not yet been able to verify this.

On another front, Nvidia has just introduced new desktop video cards of the Pascal family, the GTX 1080 and 1070, which support the new HDR formats, so game developers will be able to use "honest" HDR (rather than an imitation of it) in their products.

It is not yet known when these features will appear in games, but as soon as they do, you will be among the first to appreciate them.

Have questions? Or perhaps you're one of the lucky ones already enjoying the benefits of HDR TV? We'll see you in the comments!

HDR video standards

As is often the case with new developments in technology, there are now a number of different standards for HDR video. The two main formats are Dolby Vision and HDR10.

What is Dolby Vision format?

Dolby Vision is a proprietary HDR video format developed by Dolby. The standard uses dynamic metadata to adjust brightness levels from scene to scene and even frame to frame, which means Dolby Vision can preserve even more detail in the shadows of dark scenes and in bright ones. Both the Dolby Vision standard and HDR10 use the ITU-R Rec. BT.2020 color space, so HDR can reproduce more colors in addition to capturing more detail in shadows and highlights. Additionally, Dolby Vision uses its metadata to fine-tune the video to the hardware displaying it: lighting and color levels are adjusted based on values determined by Dolby and the TV or projector manufacturer, so the image is optimized for the specific capabilities of the device showing it. Dolby Vision is a licensed format, and compatible equipment must be certified by Dolby.

What is HDR10 format?

Unlike Dolby Vision, HDR10 uses static metadata: light and color levels are set once as absolute values for the whole video and do not change from scene to scene.

The HDR10 format defines a set of specifications that content and displays must meet; however, it does not adapt playback to the individual capabilities of a particular screen.

One of the advantages of HDR10 over Dolby Vision is that it is an open standard. This means that any company can use it without having to pay a license fee.

There are two other formats to be aware of: HDR10+ and Hybrid Log-Gamma (HLG).

What is HDR10+ format?

HDR10+ is a variant of HDR10 that Samsung designed to include dynamic metadata. Like Dolby Vision, it adjusts the displayed light levels for each scene or frame of video, but like HDR10 it does not take the characteristics of the specific screen into account.
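
The practical difference between the formats largely comes down to what metadata travels with the video. The sketch below is purely illustrative (simplified field names, not the actual bitstream syntax): static metadata in the spirit of HDR10 describes the whole title once, roughly the mastering-display values plus MaxCLL/MaxFALL, while dynamic formats such as Dolby Vision and HDR10+ add per-scene or per-frame adjustments on top:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticHDRMetadata:
    """One block for the whole title (HDR10-style, simplified for illustration)."""
    max_mastering_luminance_nits: float  # peak brightness of the mastering display
    min_mastering_luminance_nits: float
    max_cll_nits: float                  # brightest single pixel anywhere in the content
    max_fall_nits: float                 # highest frame-average light level

@dataclass
class SceneMetadata:
    """Per-scene adjustment (Dolby Vision / HDR10+ style, simplified for illustration)."""
    first_frame: int
    scene_peak_nits: float               # how bright this particular scene gets
    tone_mapping_hint: float             # hint the TV uses to compress highlights

@dataclass
class DynamicHDRMetadata:
    static: StaticHDRMetadata            # dynamic formats still carry a static baseline
    scenes: List[SceneMetadata]

# A TV limited to static metadata applies one tone curve to the whole film;
# a dynamic-metadata TV can re-derive the curve for every entry in `scenes`.
```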

What is Hybrid Log-Gamma (HLG) format?

Hybrid Log-Gamma was developed jointly by the UK's BBC and Japan's NHK. Designed for broadcast, HLG is backward compatible with standard dynamic range (SDR) displays. HLG carries no metadata; instead, it combines a gamma curve for the brightness range of SDR content with a logarithmic curve for the higher brightness levels shown on an HDR display.
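
The "hybrid" in the name is literal: below a threshold the HLG transfer curve behaves like a conventional gamma (square-root) curve shared with SDR, and above it the curve switches to a logarithmic segment for the extra highlight range. A minimal sketch of the HLG OETF with the constants from ITU-R BT.2100:

```python
import math

# ITU-R BT.2100 HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e (0..1) to an HLG signal value (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like segment, compatible with SDR
    return A * math.log(12 * e - B) + C  # logarithmic segment for HDR highlights

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"{e:.4f} -> {hlg_oetf(e):.4f}")
```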

What types of HDR are there?

Yes, there are several versions of HDR. HDR10 is currently the most common: an open standard used by a wide range of content producers. It is followed by Dolby's own variant, Dolby Vision, which allows even more detailed optimization. Other HDR standards include HDR10+, which is gaining popularity, and Advanced HDR, a standard created by Technicolor. They all work by sending metadata along with the content to tell your TV what brightness and color settings it needs and when to change them.


The difference between a standard image and a Dolby Vision image
