
Reasons for Low HDR Mode Performance in TVs: Review

Innovative technologies are the main trend in the consumer electronics segment, and the hi-tech TV segment illustrates this well. At the same time, the flood of new technologies comes with inconsistent terminology, which often causes confusion. Even a simplified classification, however, helps to understand the nuances of these new technologies.

The list of main modern innovations includes Quantum Dot technology, OLED & Micro LED panels, Local Dimming with FALD (Full Array Local Dimming) or with innovative Mini LED backlight, and, of course, HDR technology.

Models with the first three technologies are more expensive, but they provide a significant improvement in image quality and raise no questions. In short, they work as follows.

Quantum Dot technology

This technology has dramatically improved the picture quality of LCD TVs. Companies have developed and use several versions of it, including Samsung QLED, Sony Triluminos, LG NanoCell, Hisense ULED, etc. It elegantly solves the problem of improving the quality of white light in the LED backlight of LCD TVs. As is known, only sunlight is perfectly white. In effect, the developers created an analogue of a small sun inside the TV, forming white from blue (blue LEDs behind the matrix with nanoparticles), red and green (nanoparticles on the matrix whose size corresponds to the wavelengths of red and green).

Quantum Dots technology

Improving the quality of white light significantly increased brightness, color gamut, and color accuracy.

Quantum Dots image quality

OLED & Micro LED panels

As is known, OLED technology uses self-emitting diodes and does not require a backlight. Since there is no scattered light from a backlight, OLED panels provide perfect blacks and therefore excellent contrast (the ratio of maximum white to black).

OLED vs LED LCD TV image quality

Unfortunately, they are not very bright, and their price depends heavily on the OLED panel size. For example, the 48-inch LG OLED CX with stunning picture quality costs less than $1,300 today. But OLED TVs with large screens are still expensive.

To avoid this problem, some companies are actively developing modular Micro LED panels. Like OLED models, they use self-emitting LEDs without a backlight. Such a design does not, in principle, limit the panel size and will allow developers to create screens of any shape in the future. At CES 2019, Samsung even showed 13-inch modules for such panels.

Samsung Micro LED module

This technology dominates today in terms of image quality, but it is prohibitively expensive. For example, at CES 2021 Samsung unveiled the latest 110-inch class MicroLED panel, which will cost around $156,000.

Samsung Micro LED panel CES 2021

Local Dimming based on FALD (Full Array Local Dimming) & innovative Mini LED backlight

This technology controls the brightness of the backlight in different areas of the frame depending on the content, increasing the image contrast.

Local Dimming technology

Innovative Mini LED backlight technology uses miniature LEDs whose small size has allowed developers to increase their number to tens of thousands. They are combined into thousands of individually controlled local dimming zones, providing dimming control for each zone. Mini LED technology can probably be positioned as the latest generation of FALD local dimming.
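
As a rough illustration of how local dimming works, here is a minimal sketch (a simplified toy algorithm, not any manufacturer's actual processing): the frame is divided into zones, and each zone's backlight is driven only as high as its brightest pixel requires.

```python
import numpy as np

def zone_backlight_levels(luma, zones_y, zones_x, floor=0.02):
    """Toy local-dimming model: one backlight level per zone.

    luma             -- 2D array of pixel luminance, normalized to 0..1
    zones_y, zones_x -- number of dimming zones vertically / horizontally
    floor            -- minimum level, to avoid crushing shadow detail
    """
    h, w = luma.shape
    levels = np.zeros((zones_y, zones_x))
    for zy in range(zones_y):
        for zx in range(zones_x):
            # pixels covered by this zone
            block = luma[zy * h // zones_y:(zy + 1) * h // zones_y,
                         zx * w // zones_x:(zx + 1) * w // zones_x]
            # drive the zone just bright enough for its brightest pixel
            levels[zy, zx] = max(block.max(), floor)
    return levels

# A dark frame with a single bright highlight: only a few zones light up fully.
frame = np.full((2160, 3840), 0.05)
frame[1000:1100, 2000:2100] = 1.0
print(zone_backlight_levels(frame, zones_y=12, zones_x=16))
```

The more zones there are (thousands with Mini LED instead of dozens with early FALD), the less a bright highlight raises the backlight around it, which is exactly where the contrast gain comes from.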

Mini-LED technology

At CES 2021, industry leaders unveiled Samsung Neo QLED, LG QNED and TCL OD-Zero TVs with Mini LED backlight. In March, TCL announced the fantastic 85-inch TCL X12 8K Mini LED Starlight with 96,000 LEDs. Its estimated price will be just over $15,000. In comparison, the legendary TCL 6-Series Roku TV (2018) used only 96 local dimming zones, providing great contrast.

Samsung Neo QLED TVs and LG QNED TVs are available today.

HDR (High Dynamic Range) technology

HDR uses 10-bit encoding instead of 8-bit and displays the image taking its metadata (static or dynamic, depending on the format) into account.

Of course, it’s fundamentally different from HDR in photography, which is based on processing a series of shots of the same scene taken with different shutter speeds. In that case, the digital processing algorithm selects the shot with the optimal exposure for each fragment of the frame, assembling them into something like a Frankenstein frame.

HDR photo

Unlike 8-bit color coding, 10-bit coding expands the number of displayable shades per pixel from 16.7 million to 1.07 billion (the extended Rec. 2020 color space).
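
These figures follow directly from the arithmetic of bit depth: each of the three color channels has 2^8 = 256 levels at 8 bits and 2^10 = 1,024 levels at 10 bits, and the number of shades per pixel is the product over the three channels.

```python
# Number of distinct shades per pixel for a given per-channel bit depth
def shades(bits_per_channel):
    levels = 2 ** bits_per_channel   # levels per R, G or B channel
    return levels ** 3               # all R/G/B combinations

print(f"{shades(8):,}")    # 16,777,216    (~16.7 million)
print(f"{shades(10):,}")   # 1,073,741,824 (~1.07 billion)
```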

10-bit vs 8-bit

As a result, the image quality is dramatically improved.

HDR image quality

Depending on the format, content can contain static or dynamic metadata with information about light level, peak and average frame brightness, colors, etc.
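
As a minimal sketch (the field names below are illustrative, not taken from any particular SDK), HDR10-style static metadata essentially boils down to a few numbers describing the mastering display and the content's light levels:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative container for HDR10-style static metadata
    (mastering display description plus content light levels)."""
    max_cll: int            # Maximum Content Light Level, nits (brightest pixel in the title)
    max_fall: int           # Maximum Frame-Average Light Level, nits
    mastering_peak: float   # peak luminance of the mastering display, nits
    mastering_black: float  # minimum luminance of the mastering display, nits

# Plausible values for content graded on a 1,000-nit monitor (illustrative numbers)
meta = HDR10StaticMetadata(max_cll=1000, max_fall=400,
                           mastering_peak=1000.0, mastering_black=0.005)
print(meta)
```

Static metadata like this applies to the whole title; dynamic formats carry comparable information per scene or per frame, so the TV can adapt its processing as the content changes.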

Main HDR formats:

– HDR10 – 10-bit signal with static metadata;

– HDR10+ – 10-bit signal with dynamic metadata;

– HLG (Hybrid Log-Gamma) – broadcast-oriented format developed by the BBC and NHK;

– Dolby Vision – a premium HDR version with 12-bit color depth (the TV must have a special chip).

After experiencing HDR, many viewers psychologically cannot return to SDR, whose quality now seems too dull and faded to them. As a result, over the past few years the number of HDR fans has grown like an avalanche. In theory, owners of HDR TVs should simply be enjoying a great picture. But the internet is filled with questions like “Should I turn on HDR on my TV?”, “Should I have HDR on or off?”, “How do I turn off HDR?”, “Is HDR better than HD?”, “Is HDR a gimmick?”, “Does HDR really make a difference?”, etc. Of course, this situation is not entirely normal and requires analysis.

“HDR Ready”

As with 4K resolution, huge popularity has played a cruel joke on this new and, of course, excellent technology.

Several years ago, some manufacturers and sellers of TVs and projectors began, not quite correctly, to use the terms “4K compatible” or “4K Ready” in the specs of even very budget models. Of course, many consumers interpreted this as the model's ability to display 4K resolution, but in reality it only meant the ability to accept a 4K signal. Having received such a signal, the device downscales it to the panel's real native resolution. In fact, “4K Ready” only slightly expands the range of available content. But consumers quickly sorted out this situation and began to pay attention to “native resolution”, which really characterizes the panel's capabilities.

Of course, in general companies behave correctly and do not resort to such manipulations. For example, Epson projectors use e-shift technology, which slightly but genuinely increases the perceived resolution by displaying each frame twice with a half-pixel diagonal shift.

e-Shift

Today, a similar situation exists with HDR technology. The specs of even budget models often list “HDR: HDR10, HLG, Dolby Vision, etc.” But, of course, an inexpensive and not very bright 8-bit panel without WCG (Wide Color Gamut) support simply cannot reproduce the HDR range. In fact, these TVs only have a chip that decodes HDR content but plays it back in SDR quality, disappointing user expectations. Without such a chip, the TV would display HDR content as stripes or a black screen.
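
What such a TV actually does is roughly equivalent to tone mapping: squeezing the wide HDR luminance range into whatever the panel can physically show. The sketch below is a deliberately simplified illustration (a basic Reinhard-style curve, not any manufacturer's real processing) of why highlights lose their punch on a dim panel.

```python
def tone_map_to_panel(nits, panel_peak=300.0):
    """Map an HDR pixel luminance (in nits) to a panel with limited peak brightness.

    Reinhard-style curve: highlights are compressed far more than midtones,
    which is why HDR content looks flat on a dim, nominally 'HDR-compatible' TV.
    """
    normalized = nits / panel_peak
    return panel_peak * normalized / (1.0 + normalized)

for hdr_nits in (100, 500, 1000, 4000):
    print(f"{hdr_nits:>5} nits in HDR -> {tone_map_to_panel(hdr_nits):5.0f} nits on a 300-nit panel")
```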

Real HDR requirements

This technology provides full quality only when two requirements are met:

– a TV with a bright and quite expensive 10-bit panel, WCG (Wide Color Gamut) support, and HDMI 2.1 (High-Definition Multimedia Interface);

– HDR content.

To some extent, Wide Color Gamut (WCG) technology can be positioned as a predecessor of HDR; it is a method of expanding the range of reproducible colors.

WCG

In fact, it complements HDR. Put simply, HDR improves the dynamic range of the picture (brighter brights and darker darks), while WCG improves the quality of colors – redder reds, bluer blues, greener greens, etc. In other words, HDR improves the image quantitatively, while WCG does so qualitatively. Together they provide the striking HDR effect.

HDR+WCG

Of course, 10-bit panels provide the best HDR quality, but they are expensive. Therefore, many companies use more affordable 8-bit + FRC panels with somewhat lower HDR quality. Frame Rate Control (FRC) is a method of increasing the effective color depth of TFT LCD displays. FRC rapidly cycles a pixel between neighboring shades on successive frames so that the eye perceives an intermediate shade.
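
A minimal sketch of the idea (purely illustrative, not a panel-driver implementation): since one 8-bit step equals four 10-bit steps, alternating two neighboring 8-bit levels over a few frames makes the time-averaged output land on an intermediate 10-bit value.

```python
def frc_sequence(target_10bit, frames=4):
    """Approximate a 10-bit level with a short sequence of 8-bit frames."""
    low = target_10bit // 4                             # nearest 8-bit level below the target
    high_frames = round((target_10bit % 4) / 4 * frames)
    return [low + 1] * high_frames + [low] * (frames - high_frames)

seq = frc_sequence(514)           # a 10-bit level between 8-bit levels 128 and 129
print(seq)                        # [129, 129, 128, 128]
print(sum(seq) / len(seq) * 4)    # time-averaged value comes back to ~514
```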

8-bit + FRC panel

Of course, the data volume of HDR content is radically higher than that of SDR. Accordingly, its transmission to the TV requires very wide bandwidth. For example, the transfer of dynamic metadata requires a model with HDMI 2.1 support. For comparison, the bandwidth of HDMI 1.4 is 10.2 Gbps, HDMI 2.0 – 18 Gbps, HDMI 2.1 – 48 Gbps. Unfortunately, modern budget and many mid-budget models do not support it.
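
A rough back-of-the-envelope estimate (ignoring blanking intervals and link-encoding overhead) shows why an uncompressed HDR signal quickly outgrows the older HDMI versions:

```python
def uncompressed_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Approximate uncompressed video data rate in Gbps (no blanking or encoding overhead)."""
    bits_per_frame = width * height * channels * bits_per_channel
    return bits_per_frame * fps / 1e9

print(uncompressed_rate_gbps(3840, 2160, 60, 8))    # ~11.9 Gbps: 4K60 8-bit SDR
print(uncompressed_rate_gbps(3840, 2160, 60, 10))   # ~14.9 Gbps: 4K60 10-bit HDR, close to HDMI 2.0's 18 Gbps
print(uncompressed_rate_gbps(3840, 2160, 120, 10))  # ~29.9 Gbps: 4K120 10-bit HDR already needs HDMI 2.1
```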

Conclusion

Full playback of HDR content requires:

– high peak brightness (about 1,000 nits);

– a 10-bit or at least 8-bit + FRC panel with WCG support;

– HDMI 2.1 support (HDMI 2.1 input).

Unfortunately, manufacturers do not always indicate panel type, WCG support, and peak brightness in their specs. To avoid disappointment, it’s better to choose an HDR TV based on an objective assessment of its HDR performance in reviews on well-known sites (CNET, Digital Trends, etc.) that test many popular models.

Of course, visual assessment without instruments is also possible, but it will only be objective in side-by-side mode, which requires two identical models. Otherwise, a TV can simply display content well thanks to QD technology, local dimming, etc. Even worse, HDR mode can actually degrade playback quality, for example due to insufficient brightness. In that case, it is better to disable it in the settings.

This video showcases the HDR performance of the Samsung 4K QLED HDR 1500.
