HDR for dummies. What is HDR10, HDR10+ and Dolby Vision on smartphones


Last updated: 5 months ago

In all our smartphone reviews, we always mention whether the screen of a particular model supports HDR and, if so, which version: HDR10, HDR10+ or Dolby Vision.

Even though this technology is becoming extremely popular not only in mobile gadgets but also in televisions, most people either misunderstand its essence and purpose or have no idea what it is at all.

In this article I will try to explain, as simply and yet as thoroughly as possible, what is special about HDR displays, how the technology works and which smartphones support HDR.

HDR has nothing to do with this!

Some people, seeing the letters HD in the abbreviation HDR, assume the technology is somehow related to screen resolution. You can even find questions online about which is better: HDR or 4K. In reality, HDR has nothing to do with resolution; HDR video can be Full HD, 4K or even 8K.

An even more common myth is that HDR means video in which the picture looks very bright and saturated, roughly like the AMOLED screens of 10 years ago with their oversaturated "acid" colors.

The reality is that in most cases standard video looks brighter than HDR, and forums are full of questions about how to disable HDR just to be able to see anything on a smartphone's screen.

After reading this article, you will have a good understanding of why this happens and how to properly watch content in HDR.

Now let's get to the heart of the matter.

What is HDR? Or maximum image realism

HDR (High Dynamic Range) is a technology that allows video to be displayed with high color bit depth, wide color gamut and extended dynamic range.

If this definition doesn’t tell you anything and even confuses you, great! Next we will analyze each phrase in detail.

But, basically, the main thing to understand is this: HDR video lets you see the picture exactly as its creator (the film studio) intended.

HDR video must accurately convey the atmosphere intended by the author. For example, in one episode of Stranger Things, there is a scene in a dark room where several people walk in with a camera and take pictures using flash.

On a regular smartphone, there is nothing special about this scene: you see a dark room and barely notice the camera flash. But on a smartphone with an HDR screen things are different. The room looks darker, and when the flash fires, the screen brightness momentarily jumps so high that you can't help but blink, as if it had gone off in your own room.

Moreover, sometimes during video playback, warning messages may appear on the screen indicating that the following scenes use bright special effects that may cause discomfort to light-sensitive viewers.

In other words, an HDR display allows you to reproduce video as realistically as possible and evoke the sensations from the picture that the author wanted to convey.

How is it possible to achieve such an effect?

HDR video and color bit depth

First of all, HDR technology uses a minimum of 10 bits to encode color, while the regular format uses 8-bit color. What does all of this mean?

The color of each pixel on the screen is formed from 3 primary colors: red, green and blue, which are mixed in the required proportion. If we do this kind of work for several million points, we get a colorful frame.

So, to store information about each of the 3 colors, a certain amount of data (bits) is used. One bit can only take two values: zero or one. And if we used just 1 bit to store color information, each dot would be either pure white (if bit is 1) or black (bit = 0).

Two bits would allow us to store 2 times more information. That is, we get the following set of values and the corresponding colors:

2 bits allow you to store 4 colors

With 3 bits it would be possible to store even more information (8 colors): 000, 001, 010, 011, 100, 101, 110 and 111. And so on.

Almost all modern smartphones have 8-bit displays. Accordingly, the darkest shade will have a code like 00000000, a slightly lighter one 00000001, and so on up to the brightest shade with the code 11111111 (this is a simplification, since in real video encoding the usable range does not start at the very first code). With just 8 bits we can store (encode) 256 different shades.

Since the color of one pixel is built from information about 3 primary colors, each of them can have 256 shades, which in total gives about 16.7 million colors (256 red shades * 256 green shades * 256 blue shades).

Using 10 bits we can get more than 1 billion different shades (1024 of each of the three primary colors), and 12 bits will give us more than 68 billion colors!
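
To make the arithmetic above concrete, here is a tiny Python sketch (purely illustrative) that computes how many shades per channel and how many total colors each bit depth gives:

```python
# Shades per channel and total colors for a given bit depth.
# 8, 10 and 12 bits are the depths discussed above: regular video,
# HDR10/HDR10+ and Dolby Vision respectively.

def color_counts(bits_per_channel):
    shades = 2 ** bits_per_channel   # shades of one primary color
    total = shades ** 3              # red x green x blue combinations
    return shades, total

for bits in (8, 10, 12):
    shades, total = color_counts(bits)
    print(f"{bits}-bit: {shades} shades per channel, {total:,} colors in total")

# 8-bit:  256 shades per channel,  16,777,216 colors (~16.7 million)
# 10-bit: 1024 shades per channel, 1,073,741,824 colors (~1.07 billion)
# 12-bit: 4096 shades per channel, 68,719,476,736 colors (~68.7 billion)
```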

The first requirement of HDR: the HDR10 and HDR10+ standards use 10-bit color depth, and the Dolby Vision standard (Dolby's version of HDR) uses 12-bit!

And now the important question:

Do you think it makes sense to increase the color depth to, say, 10 or even 12 bits?

At first glance - of course! After all, the more colors, the more colorful the image will be! Right? No.

To understand this, let's imagine that the following picture shows all the shades of red that the display is theoretically capable of displaying:

And now the task is to decide what color depth our screen should support. If you select a depth of 3 bits, then the screen can display a total of 8 colors. This is what they will roughly look like:

As you can see, we no longer have such a smooth gradient. The transitions between the first color (code 000), the second (001) and all the others are very clearly visible in the picture. And in the original picture the color transitions very smoothly from dark to light.

Solving this problem is very easy. It is enough to increase the color depth from 3 bits to 8! Now there will be 256 gradations from the darkest shade to the lightest. Is it possible in this case to see the difference between two adjacent colors? Perhaps someone will see it, but not always.

If you use a 10-bit screen that can display 1024 shades of red, then not a single person will notice the difference between two adjacent colors, since it will be negligible. In other words, a depth of 10 bits and above is already redundant. Our vision is not capable of distinguishing such slight changes in brightness.
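
To see why a low bit depth produces visible banding, here is a small sketch (my own illustration, not one of the article's figures) that quantizes a smooth brightness ramp to different bit depths and reports how coarse the steps become:

```python
# Quantize a smooth 0..1 brightness ramp to a given bit depth and count
# how many distinct bands the gradient ends up with. The bigger the jump
# between adjacent codes, the more visible the banding.

def band_count(bits, samples=1920):
    levels = 2 ** bits
    ramp = (i / (samples - 1) for i in range(samples))     # smooth gradient
    return len({round(v * (levels - 1)) for v in ramp})    # distinct codes used

for bits in (3, 8, 10):
    step = 1 / (2 ** bits - 1)
    print(f"{bits:>2}-bit: {band_count(bits):4d} bands, "
          f"adjacent shades differ by {step:.2%} of the full range")

#  3-bit:    8 bands, adjacent shades differ by 14.29% -> obvious stripes
#  8-bit:  256 bands, adjacent shades differ by 0.39%  -> looks smooth
# 10-bit: 1024 bands, adjacent shades differ by 0.10%  -> imperceptible
```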

But why then does HDR use 10 bits? This question brings us to the second key part of the definition: HDR, as stated at the beginning, uses a very wide color gamut.

HDR video and wide color gamut

Imagine that the following picture shows all the possible colors that the human eye can distinguish:


The entire visible color spectrum

Now look at what part of these colors the screen of an inexpensive smartphone can actually display (everything that is inside the white triangle):


sRGB color space

This is only about 36% of the colors we are able to see!

The white triangle in our example is called a color space (or gamut) and means a specific set of possible colors. If a device has an sRGB color gamut, it means it can only display colors that are within this set.

And no matter what color depth we use, only these colors will be encoded. Accordingly, for the sRGB space 8 bits are quite enough. Colors with such a number of shades will have very smooth transitions.

However, modern high-end smartphones are capable of displaying many more colors than what is included in the sRGB color space. Accordingly, other spaces were standardized: DCI-P3 and Rec.2020. Here's what they look like compared to sRGB:


Rec.2020, DCI-P3 and sRGB

Even though the Rec.2020 color space includes many more colors, it still cannot cover the entire spectrum of visible colors (to be more precise, Rec.2020 covers 75.8% of visible colors). But, in any case, such a screen will display a much more realistic picture. Note: not a more saturated picture, but a more realistic one.
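
To get a rough feel for how much bigger these gamuts are, the sketch below compares the areas of the sRGB, DCI-P3 and Rec.2020 triangles on the CIE 1931 xy chromaticity diagram using their published primaries. Note that the "percentage of visible colors" figures quoted above are computed against the full horseshoe of visible colors and depend on the diagram used, so this toy comparison only shows the relative sizes of the triangles:

```python
# Compare the areas of the sRGB, DCI-P3 and Rec.2020 triangles on the
# CIE 1931 xy chromaticity diagram (shoelace formula). A larger triangle
# means a wider set of displayable colors.

GAMUTS = {                    # (x, y) chromaticities of the R, G, B primaries
    "sRGB":     [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb = triangle_area(GAMUTS["sRGB"])
for name, points in GAMUTS.items():
    area = triangle_area(points)
    print(f"{name:8s} area = {area:.4f}  ({area / srgb:.2f}x sRGB)")

# Rec.2020 comes out roughly 1.9x larger than sRGB on this diagram.
```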

And now that the screen is capable of displaying more colors, we, accordingly, need greater color depth. The usual 8 bits will no longer be enough. To better visualize this, let's return for a second to the example with the color red:

Now we need to display not this range of colors, but a wider one:

And the wider the screen’s color space (the more colors it can display), the more bits are needed to convey all the shades with a smooth transition from one to another.

That is, we can draw the following conclusion: the color space (gamut) tells us how many colors the screen can display, while the color bit depth determines the number of gradations between those colors, i.e. how smooth the transition from one color to another is.

Second HDR requirement: The HDR standard (HDR10/HDR10+/Dolby Vision) uses the Rec.2020 color space.

The only problem is that at the moment there is no smartphone whose screen fully covers the Rec.2020 color space. The most expensive smartphones (iPhone 11 Pro Max, Samsung Galaxy S20 Ultra and other flagships) at best cover slightly more colors than the DCI-P3 space.

For example, the Samsung Galaxy Note10 screen covers about 113% of the DCI-P3 space.

Now let's summarize. As I said at the very beginning, HDR is a standard that allows you to display video with high bit depth, wide color gamut, and high dynamic range. By this point, the first two points should already be clear (bit depth and color gamut). All that remains is to figure out the dynamic range.

Some of you may already be wondering: if there are no smartphones on the market that support the Rec.2020 color space, how do they display HDR video at all? In fact, there are no smartphones with a true 10-bit screen either. Moreover, as we will see in a moment, even the best screens do not meet the third requirement at all.

Does all this mean that there really is no HDR on smartphones? We'll talk about this a little below.

HDR video and wide dynamic range

Dynamic range is the difference between the darkest and brightest pixel. Wide dynamic range means that the screen is able to show both very bright objects and dark ones in one frame. Moreover, details should be visible everywhere, and not just white and black spots.

Here, for example, is a frame from a film with a very low dynamic range (the frame is deliberately spoiled to demonstrate the effect):


Still from the movie “Chasing Bonnie and Clyde”

As you can see, there are no details on the shirts or against the sky - just white spots. There is also a problem with the display of details in the shadows - there are spots on the pants and suit again, only this time in black.

If the screen had higher dynamic range, this shot would look completely different:


High dynamic range example

Now all the details have appeared in those areas where there were only white and black spots, and in general the picture looks more natural and pleasant.

So, the ability to show very bright and very dark areas in the same frame depends on the maximum brightness of the device's display: the higher the brightness, the wider the supported dynamic range.

There are no problems with black color on AMOLED screens, since at minimum brightness the pixel is simply turned off and we have the deepest black color possible. But with maximum brightness everything is not so simple.

Brightness (more precisely, luminance) is measured in candelas per square meter (from the Latin candela, meaning candle): the luminous intensity of an ordinary candle is roughly 1 candela (cd). Another unit called the nit (nt) is also often used; it is exactly the same thing as cd/m2, just under a different name, so 1 nt = 1 cd/m2.

The screen brightness of every smartphone is different. For example, the maximum brightness of the Samsung Galaxy A51 display is about 400 nits (in automatic mode in the sun it can reach up to 636 nits), while the Redmi Note 8 Pro manages 460 nits (or 640 nits in auto mode in bright sunlight).

But expensive flagship models have even brighter displays. Moreover, they are capable of producing a peak brightness in HDR mode that is not available during normal use. For the iPhone 11 Pro, peak brightness in HDR reaches 1300 nits, and for the Galaxy S20 it is 1342 nits!

However, one should not be deluded by such high values. The fact is that in AMOLED screens, each pixel itself emits light when voltage is applied to it. To display a white picture, each pixel, consisting of 3 sub-pixels (red, green and blue), must glow brightly. If we want to display a black picture, all pixels must be turned off.

Given that the maximum power the display may draw is limited, more power can be allocated to specific pixels by dimming or turning off the others. Accordingly, the peak brightness of the HDR displays in the smartphones listed above is only achieved when a small area of the image is lit (no more than 10-20% of the screen) and only for a short period of time.

For example, you can display stars in the night sky with very high brightness (up to 1300 nits) or a flash of light. And this is enough for HDR, but during normal use such brightness is physically unattainable.
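
As a deliberately simplified model of this idea (not how any real display controller actually works, and with made-up numbers), imagine the panel having a fixed light budget that is spread over whatever fraction of the screen is lit:

```python
# Toy model: a fixed luminance budget shared across the lit portion of an
# OLED panel. All values are invented for illustration; real panels use
# far more sophisticated, non-linear power management.

FULL_SCREEN_NITS = 800.0   # hypothetical brightness with the whole screen white
PANEL_PEAK_NITS = 1300.0   # hypothetical per-pixel hardware limit

def achievable_brightness(window_fraction):
    """Brightness available when only `window_fraction` of the pixels are lit."""
    budget = FULL_SCREEN_NITS / window_fraction   # same power, fewer lit pixels
    return min(budget, PANEL_PEAK_NITS)           # capped by the hardware limit

for fraction in (1.0, 0.5, 0.1):
    print(f"{fraction:5.0%} of the screen lit -> {achievable_brightness(fraction):.0f} nits")

# 100% of the screen lit ->  800 nits
#  50% of the screen lit -> 1300 nits (already at the cap)
#  10% of the screen lit -> 1300 nits
```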

Therefore, when choosing a smartphone, if you want to enjoy HDR content, you should definitely pay attention to the peak brightness in HDR mode. Many manufacturers provide such data about their smartphones.

The third requirement of HDR: when mastering HDR content, brightness values from 1000 nits (HDR10) to 4000 nits (HDR10+ and Dolby Vision) are used, and Dolby Vision supports peak brightness up to 10,000 nits.

Speaking of brightness, it’s also worth pointing out a very important difference between regular video and HDR content.

What about the brightness of HDR video!? Or how to watch HDR content correctly

The fact is that in ordinary video the signal is interpreted in relative values. For example, the brightness of an explosion should be displayed at 90% of the maximum screen brightness, and the brightness of stars in the night sky should be displayed at 10% of the maximum.

In principle, at one time there were no problems with this, since 100 nits was taken as the maximum value and the manufacturer roughly understood how the picture would look on any screen.

But then technology moved far ahead, and that same 10% of maximum brightness on a 600-nit screen no longer looked at all as intended.

As an analogy, we can use the example of transport. Will you experience the same sensation when driving at 90% of maximum speed in a bus as in a sports car? Of course not, since the maximum speed of a sports car and a bus are completely different. The same goes for video brightness.

HDR uses a completely different concept. Here the content creator does not work with relative values but specifies brightness directly in nits. For example, instead of setting a candle's brightness at 1% of maximum, the creator specifies 1 nit. And no matter how bright your smartphone's screen is, the scene with the burning candle will always be shown at 1 nit.

If we take 10-bit color (that is, 1024 brightness values per channel), then in HDR the code 100 would not mean 10% of maximum brightness but a specific value, say 0.3 nits. The code 300 would already mean about 9 nits, 500 about 82 nits, and so on up to the last value, where 1023 means the maximum brightness of 1000 nits (or 4000 nits, depending on the HDR standard).

You may have noticed that the code values are distributed across brightness non-linearly. Logically, if the value 1023 means maximum brightness (1000 nits), then the value 300 should correspond to about 300 nits, yet in reality it corresponds to only about 9 nits. Indeed, about half of all 10-bit code values are used to encode the first 100 nits of brightness, and the other half covers everything above that. This is due to a peculiarity of our vision: we distinguish shades far better near the lower limit of perception. Therefore, for better image detail, more than half of all code values are spent on the first 10% of brightness (from 0 to 100 nits).
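
The non-linear mapping described above is the PQ ("perceptual quantizer") curve, standardized as SMPTE ST 2084 and used by HDR10, HDR10+ and Dolby Vision. Note that the PQ container itself is defined up to 10,000 nits; the 1000 or 4000 nit figures above refer to the peak of the mastering display. Here is a minimal sketch of the decoding side (code value to nits):

```python
# PQ (SMPTE ST 2084) EOTF: converts a 10-bit code value into absolute
# brightness in nits. The constants are the ones published in the standard.

M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_code_to_nits(code, bits=10):
    e = code / (2 ** bits - 1)               # normalize the code value to 0..1
    p = e ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000 * y                         # PQ tops out at 10,000 nits

for code in (100, 500, 767, 1023):
    print(f"code {code:4d} -> {pq_code_to_nits(code):9.1f} nits")

# Approximate output:
#   code  100 ->    ~0.3 nits  (the 0.3 nits mentioned above)
#   code  500 ->    ~82  nits
#   code  767 ->   ~981  nits  (about 75% of the codes cover 0-1000 nits)
#   code 1023 ->  10000  nits  (the absolute ceiling of the curve)
```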

And to make sure the viewer does not ruin the intended picture, many smartphones switch to maximum brightness when playing HDR content and lock manual adjustment: the usual brightness slider in the notification shade stops responding.

Therefore, for the most realistic image, it is advisable to turn off auto-brightness and set the screen brightness to 100% before watching a video.

And now another important point. Returning to our fictitious scene with a candle that glows at 1 nit, would you be able to see anything on the screen if you were watching the video in a brightly lit room?

Look at two squares of the same color:


Squares on the left and right are the same color

It seems to us that the square on the left is brighter and lighter than the one on the right. In reality they are exactly the same color; only the background behind them differs. If you remove the gradient and leave the background plain white, the difference disappears:

Or here's an even better example, where cells A and B are the same color:

The same thing happens with video. If you look at the screen at maximum brightness in complete darkness, this light will simply “burn out” your eyes. But if we look at the same screen with the same brightness, but in daylight (or better yet, in the bright sun), then we will not feel any discomfort, moreover, we will want to make the screen even brighter.

So, as the creators intend, HDR video should be watched in near-total darkness. That is, the brightness during mastering is set on the assumption that there is no ambient light.

This is the reason why many users complain about the low brightness of HDR content.

So can my smartphone play HDR videos?

As we have already figured out, there is not a single smartphone on the market that:

  • Supports the Rec.2020 color gamut
  • Has a true 10-bit screen
  • Reaches a peak brightness of 4000 nits or more

Moreover, there are smartphones with HDR support whose peak brightness does not exceed 500-600 nits! And yet they all proudly claim support for HDR, HDR10+ or even Dolby Vision.

First of all, some smartphones are still capable of reproducing HDR content quite accurately, especially in the HDR10 format, which targets a maximum brightness of 1000 nits.

But what if the Dolby Vision video contains scenes with a brightness of 2000 nits or higher? We need to somehow fit into the smartphone brightness range of 1-1000 nits those details that are visible in the video at a brightness of 2000 nits. It's like trying to pour liquid from a 3-liter jar into a liter jar without losing a drop.

At first glance the task seems impossible, but mathematics comes to the rescue in the form of tone mapping, an algorithm that converts the brightness of the signal into something a particular screen can actually display.

In general terms it works as follows. Every HDR video necessarily carries metadata. This is similar to the date and location information embedded in your photos, except that HDR metadata contains far more useful information.

In particular, when the video is mastered, its maximum brightness and the average brightness of the frames are recorded in the metadata. When a smartphone opens a video, it first reads this metadata to understand whether its capabilities are sufficient to display the content. If, for example, the video was mastered with a maximum brightness of 1000 nits and is opened on an iPhone 11 Pro (with a peak HDR brightness of about 1300 nits), then no tone mapping is needed and we get a picture as close as possible to the original.

Of course, we still need to solve the problem with 10-bit and wide color gamut, but in terms of dynamic range there will be no problems.

If the phone encounters a video with a maximum brightness of 4000 nits, tone mapping is activated, which partially reduces the brightness and tries to restore some detail in the highlight areas. But, at the same time, the picture quality will be noticeably lower (loss of detail in the shadows and even partial color distortion).
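
The exact tone-mapping curve is up to each manufacturer and is not published, but a toy version of the idea (roll off only the highlights that the panel cannot reach) might look like the sketch below; the knee position and all the numbers are invented for illustration:

```python
# Toy tone mapping: compress only the highlights that exceed the display's
# peak brightness. Real devices use far more elaborate, perceptually tuned
# curves; this merely illustrates the principle.

DISPLAY_PEAK = 1000.0      # nits the panel can actually show
KNEE = 0.75                # below 75% of the panel's peak, pass brightness through

def tone_map(nits, content_max):
    if content_max <= DISPLAY_PEAK:
        return nits                                   # nothing to compress
    knee_point = KNEE * DISPLAY_PEAK
    if nits <= knee_point:
        return nits                                   # shadows and midtones untouched
    # Squeeze everything between the knee and the content's maximum into the
    # remaining headroom (knee_point .. DISPLAY_PEAK) so highlights keep structure.
    t = (nits - knee_point) / (content_max - knee_point)
    return knee_point + t * (DISPLAY_PEAK - knee_point)

# A 4000-nit flash on a 1000-nit screen is dimmed but not clipped to flat white:
for scene_nits in (500, 1500, 4000):
    print(f"{scene_nits:4d} nits in the file -> {tone_map(scene_nits, 4000.0):6.1f} nits on screen")

#  500 nits in the file ->  500.0 nits on screen
# 1500 nits in the file ->  807.7 nits on screen
# 4000 nits in the file -> 1000.0 nits on screen
```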

But let's face it: it would be foolish to tone map an entire film just because one scene contains a very bright 4000-nit flash of light. Ruining the quality of the whole movie over a single scene would be extremely unwise. Yet that is exactly what happens when the metadata is static, i.e. when it only reports a single film-wide figure.

This is why dynamic metadata was invented. It describes not the maximum brightness of the whole film, but the maximum brightness of each specific scene or even each frame! For example, if a video mastered at a maximum of 4000 nits shows a scene in a dark room lit by a candle, the metadata reports that from that moment on the maximum brightness does not exceed 2-5 nits. Accordingly, no tone mapping is needed and more shadow detail can be displayed.
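
Continuing the toy sketch above, dynamic metadata simply lets the player make the "do I need to tone map?" decision per scene instead of once per film (all scene values here are invented):

```python
# Static metadata forces every scene to be judged against the film-wide
# maximum; dynamic metadata gives each scene its own maximum, so dark
# scenes are left untouched. All numbers are invented for illustration.

DISPLAY_PEAK = 1000.0      # nits the panel can show
FILM_MAX = 4000.0          # what static metadata reports for the whole film

scenes = [
    ("candle-lit room", 4),        # per-scene maximum brightness, nits
    ("daylight street", 800),
    ("flash of light", 4000),
]

for name, scene_max in scenes:
    static_decision = FILM_MAX > DISPLAY_PEAK      # always True here
    dynamic_decision = scene_max > DISPLAY_PEAK    # only True for the flash
    print(f"{name:16s} tone map? static: {static_decision}, dynamic: {dynamic_decision}")

# With static metadata even the candle scene gets compressed; with dynamic
# metadata only the 4000-nit flash does, so shadow detail survives.
```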

However, unfortunately, not all HDR standards support dynamic metadata.

What is the difference between HDR, HDR10, HDR10+ and Dolby Vision?

Before you look at the table listing the “characteristics” of each standard, it is important to know the following. If you see a device that supports HDR (without additional numbers), this means that we are talking about HDR10. That is, any smartphone that supports HDR is a smartphone that supports HDR10.

The same goes for any video: if it does not explicitly state that it supports HDR10+ or Dolby Vision, it was created in the HDR10 format. In other words, HDR10 is the baseline set of recommendations that applies unless stated otherwise. So what really needs comparing is HDR10, HDR10+ and Dolby Vision.

In fact, there are more standards, but they are not found on smartphones.

HDR10+ is a standard from Samsung, developed jointly with Panasonic and 20th Century Fox.

Dolby Vision is a more advanced standard from Dolby. Unlike HDR10 and HDR10+, this standard requires licensing, meaning companies that want to add support for it to their smartphones must pay licensing fees to Dolby. To date, only Apple smartphones support Dolby Vision.

HDR10+ was initially found only on Samsung flagships, starting with the Galaxy S10+. But now Xiaomi, OPPO, OnePlus, Realme and Vivo have joined this standard.

All other devices use HDR10. But since it's not even really a standard, but rather a set of recommendations, HDR10 videos may look different on many devices.

The main differences between HDR10, HDR10+ and Dolby Vision are listed in the table below:

Parameter                   HDR10            HDR10+           Dolby Vision
Color depth                 10 bits          10 bits          12 bits
Possible number of shades   1.07 billion     1.07 billion     68.7 billion
Maximum brightness          1000-4000 nits   1000-4000 nits   4000-10000 nits
Metadata                    Static           Dynamic          Dynamic
Price                       Free             Free             Paid

As you can see, the main advantage of HDR10+ over HDR10 is its support for dynamic metadata. That is, video with HDR10+ will look noticeably better on devices whose maximum brightness does not meet the stated requirements.

In addition, HDR video has an excellent reserve for the future: the better the screens that appear down the road, the more colorful and realistic today's films will look on them. Films are already being mastered with peak brightness of up to 4000 nits, 10-bit color and the widest color gamut, and such video will look completely different on future smartphones.

Where can you get content in HDR format?

Many streaming services support HDR video. That is, to evaluate HDR you can safely use one of the following services: ivi, Okko, Megogo or Netflix. Almost all of them provide a free trial period.

As for Dolby Vision, only Netflix supports this standard so far. By the way, we have already published a detailed review of this service.

PS

Don't forget to subscribe on Telegram to Deep-Review, the first popular-science site about mobile technology, so you don't miss the interesting materials we are currently preparing!

HDR TV nuances

For a television receiver, HDR is determined by two things:

  • What level of contrast is it capable of displaying?
  • What is the quantity and quality of reproduced colors?

Moreover, contrast is considered the most important factor. It reflects the difference in brightness between the lightest and darkest areas of each frame: the greater that difference, the higher the contrast. For reference, a night sky strewn with stars has a luminance of about 0.003 nits, while the Sun reaches roughly 1.6 billion nits. The TV's main task is to convey this enormous range convincingly, even though its own brightness spans only a few hundred nits.

For a TV to qualify as HDR, it must reach a certain peak brightness threshold. But that is not the only requirement: the black level matters just as much, since it determines how dark the shadowed areas of the picture can get.

Color is an equally important part of HDR. The TV must be able to process so-called "deep", ten-bit color. For comparison, regular Blu-ray uses eight-bit color, which can represent about sixteen million shades, whereas the ten-bit color of HDR TVs allows far more gradations, smoothing out the transitions between shades.

In fairness, though, there are some nuances here. To be fully HDR-compatible, a TV does not actually have to display every color in a ten-bit signal. It only needs to be able to process such a signal and reproduce a picture based on the accompanying information, known as metadata.

As for the P3 color gamut, a TV is considered HDR-compliant provided it can cover a wider portion of the color spectrum than a standard set, and the transitions between shades within that space will be noticeably smoother than on ordinary TVs.

Final part


HDR10, together with HLG, is predicted to see the widest adoption of all the HDR formats in television. Dolby Vision, which has lost a lot of ground, requires streaming content or discs mastered in that format.

If your TV is labeled HDR Dolby Vision, it needs Dolby Vision content to deliver the HDR effect. However, any TV with Dolby Vision can also handle the HDR10 standard. And with Active HDR processing, the picture the viewer sees on screen becomes more realistic.

Since almost all LG 4K models, including the most affordable ones, support HLG, Active HDR technology makes LG 4K TVs practically omnivorous, while implementing HDR at a high level of quality.

HDR10+ - Where can I find it?

Of course, HDR10+ is not just magic: TVs and 4K Blu-ray players must carry firmware that can handle it, and the content must be mastered in the format. So where does HDR10+ actually show up?

Samsung's 2021 TVs have HDR10+ capabilities, and all of its 2021 4K HDR TVs support the format. The expensive Oppo UDP-203 and UDP-205 4K players have received firmware updates for the format.

Panasonic initially committed its HDR TVs to HDR10+, but in 2021 Dolby Vision support was added to the Japanese brand's TV lineup as well. The flagship GZ2000 supports both HDR formats, becoming the first TV in the world to do so. The GZ1500, GZ950, GZ920, GX800 and GX700 TVs all have HDR10+ built in, and every TV from the GZ2000 down to the GX800 is Dolby Vision compatible.

Philips, like Panasonic, supports a combination of Dolby Vision and HDR10+ on all its TVs. This leaves LG and Sony in the Dolby Vision-only corner, with Samsung the only major brand holding the HDR10+-only front.

As for compatible 4K players, the Panasonic UB9000, UB450 and UB150 support HDR10+. Samsung has exited the 4K player market, so no new HDR10+ players will be coming from it.

Hybrid log gamma: broadcast standard

Broadcast standards are evolving differently than production standards, but that doesn't mean they'll stop at SDR forever. Hybrid Log-Gamma (HLG) is an open broadcast format developed by the BBC in the UK and NHK in Japan. It is a backwards compatible format that implements HDR video over broadcast. HLG specifically targets 1000 nits of peak brightness, just like HDR10.

Because broadcasts must accommodate such a wide range of devices with different capabilities, it is important to ensure that modern HDR broadcasts display correctly on older SDR displays. HLG achieves this by providing a signal that allows modern HDR displays to achieve greater dynamic range without closing the door on older technologies.
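
HLG achieves this compatibility through its transfer function: the lower part of the signal behaves like a conventional gamma curve that SDR sets already understand, while the upper part switches to a logarithmic segment for highlights. Here is a minimal sketch of the HLG OETF (scene light to signal) using the constants from the published specification (ARIB STD-B67 / ITU-R BT.2100):

```python
# HLG OETF (ARIB STD-B67 / ITU-R BT.2100): maps normalized scene light
# (0..1) to a video signal (0..1). The square-root segment keeps the lower
# half of the signal SDR-like; the log segment handles HDR highlights.
import math

A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(light):
    if light <= 1 / 12:
        return math.sqrt(3 * light)          # gamma-like part (SDR-compatible)
    return A * math.log(12 * light - B) + C  # logarithmic part (HDR highlights)

for light in (0.0, 1 / 12, 0.5, 1.0):
    print(f"scene light {light:.4f} -> signal {hlg_oetf(light):.4f}")

# scene light 0.0833 -> signal 0.5000 (the crossover between the two segments)
# scene light 1.0000 -> signal 1.0000
```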

Although the format was created for broadcasting, it is also supported by streaming services including YouTube and BBC iPlayer. Broadcasters already using HLG include Eutelsat, DirecTV and Sky UK.

Is HDR10+ good?

It's unlikely that Samsung would have bothered to work on HDR10+ if it didn't think that adding dynamic metadata could bring significant additional image quality benefits. But seeing is believing.

Luckily, I saw three separate HDR10+ versus regular HDR10 demos. The first Samsung demo had to be discounted because the core performance of the two displays was clearly too different. Fortunately, Panasonic ran a more convincing comparison in a darkened room, and Samsung later let me see a much more representative behind-the-scenes demo.

In both cases, the benefits of HDR10+'s additional metadata were clear. The main one is the amount of visible detail preserved in the brightest parts of the image: far fewer subtle tones are flattened into white than with classic HDR10.

This helps the image look more detailed, and the impact of that detail was further enhanced in both demos by a slight increase in the apparent brightness of the image's brightest highlights.

Samsung's live demo was conducted on a pair of its 2021 Q9F flagship models, and even on those capable sets the HDR10+ difference was clear. Although, as discussed earlier, HDR10+ is expected to have the biggest impact on relatively affordable screens.

I didn't see a significant impact on color reproduction during the Panasonic head-to-head, and on the Q9F the color situation was more of a mixed bag: some tones looked more saturated and punchy under the influence of HDR10+, while others actually looked more washed out.

However, this latter issue may be down to the Q9F's edge lighting system struggling to control its light locally enough to prevent some tones from being over-brightened. More modern TVs can be expected to handle this better.

What to watch

Over the past decade, a wealth of content has been produced with this technology in mind, so finding something to watch in HDR is no longer a problem.

It is important that the content you play is in its original form: re-encoding and additional compression are unacceptable. Stick to licensed copies and legal services that offer high-quality material to stream or buy.

Where is HDR support available:

  • Online cinemas - almost all services have long supported playback of HDR films and TV series, including Russian productions that use extended dynamic range technology.
  • Video games - HDR support can be seen in many video games, including Forza Horizon 4, Metro Exodus and Horizon Zero Dawn, and not only computers connected to a TV, but also previous generation consoles can demonstrate this technology.
  • Ultra HD Blu-Ray is an improved physical media format that natively supports HDR, so High Dynamic Range movies can be played from it as well, as long as you have the right player.

To watch HDR content, you don't need to buy new cables to connect consoles, players or PCs to your TV: existing High Speed HDMI cables handle the job and carry the image together with its metadata. The devices themselves, however, must support the HDMI 2.0a interface. If your device originally shipped with HDMI 2.0, don't worry: some manufacturers have updated it to 2.0a via firmware.
