High Dynamic Range (HDR) provides better contrast, greater brightness and a wider color palette than traditional display technology. According to many experts, HDR was the first significant leap in image quality since the transition from SD to HDTV.
The development and implementation of this technology was a logical continuation of the rapid evolution of television. Strong competition and the spread of digital technologies stimulated technical progress in TVs. As a consequence, companies created a new generation of matrices, including OLED, along with high-performance video processors and new technologies such as FALD (full-array local dimming) backlighting. These new TVs processed video signals at a fundamentally new level, providing a revolutionary improvement in image quality. Unfortunately, the traditional 8-bit color standard limited these new possibilities.
Of course, this situation could not last long. In 2012, Dolby Labs offered an elegant solution to this problem, increasing the color depth of the image to 12 bits. This value directly determines the number of displayed colors and shades, so increasing the number of bits significantly expands the color gamut.
Accordingly, Dolby Vision became the first HDR format. But this situation did not suit many market leaders, and they began to develop their own alternative standards. Such fragmentation was unlikely to have prospects due to the complete absence of unification. Therefore, LG, Samsung, Sharp, Sony and Vizio joined forces and created an open, royalty-free HDR standard (HDR10). In April 2016, the UHD Alliance industry group, which includes LG, Samsung, Sony, Panasonic, Dolby and others, announced the adoption of Ultra HD Premium certification for UHD Blu-ray players.
Operation principle
The diagram illustrates the essence of this technology, which expands the color palette to increase image saturation. As a result, image quality in this format is radically superior to the traditional one.
Image quality mainly depends on contrast (the ratio of the brightest and darkest points on the screen) and color rendering (color palette, or color gamut). Of course, matrix resolution, viewing angles, frame rate, sound quality, refresh rate, etc. also affect the perception of video content. But the realism of perception mainly depends on contrast, which creates a sense of depth and clarity, and on color gamut, which provides color saturation.
As is known, contrast characterizes the ratio of peak brightness to black level (minimum brightness), both measured in nits (1 nit = 1 candela per square meter). For example, a TV with a maximum brightness of 400 nits and a black level of 0.4 nits provides a contrast of 400 / 0.4 = 1000:1. Today, OLED matrices provide maximum contrast due to the lack of backlight and, accordingly, a nearly zero black level. This value significantly affects the visualization of content.
Budget TVs usually provide a contrast of around 500:1, while expensive models reach 1000:1 and higher.
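The contrast calculation above is straightforward to sketch in code. This is only an illustration of the formula; the function name `contrast_ratio` is ours, not part of any standard:

```python
# Contrast ratio = peak brightness / black level, both in nits.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Return the contrast ratio of a display as a single number X, meaning X:1."""
    return peak_nits / black_nits

# The article's example: a 400-nit peak with a 0.4-nit black level.
print(contrast_ratio(400, 0.4))  # ≈ 1000, i.e. a 1000:1 contrast ratio
```

Note why OLED wins here: as the black level approaches zero, the ratio grows without bound, which is why self-emitting panels are often described as having "infinite" contrast.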
As is known, 10-bit color reproduces more than a billion shades. Such a wide range virtually eliminates visible banding between adjacent shades, increasing the realism of the scene. Modern HDR technology uses the 10-bit HDR10 standard with the DCI-P3 color space, or 12-bit Dolby Vision with Rec.2020.
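The shade counts follow directly from the bit depth: each of the three channels (R, G, B) holds 2^bits levels, and the total color count is that value cubed. A minimal sketch of the arithmetic:

```python
# Total displayable colors for a given bit depth per channel:
# levels per channel = 2**bits, and colors = levels**3 (R, G, B).
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(color_count(8))   # 16,777,216 (~16.7 million, traditional SDR)
print(color_count(10))  # 1,073,741,824 (over a billion, HDR10)
print(color_count(12))  # 68,719,476,736 (~68.7 billion, Dolby Vision)
```

So each extra bit per channel multiplies the total palette by 8, which is why the jump from 8-bit to 10-bit color makes gradient banding so much less visible.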
HDR content and its broadcast
Any HDR video content contains metadata, which the TV needs to adjust the picture during signal processing. If these values are set once during mastering for the entire program, the metadata is called static. But static metadata cannot preserve the original colors of each individual scene. Dynamic metadata solves this problem.
In fact, content creators set the necessary brightness and contrast for each frame, then generate a set of metadata and merge it with the content for frame-by-frame transmission. Thus, each frame has its own metadata, which provides the most accurate color rendition.
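The idea of per-frame metadata can be modeled with a simple sketch. This is a deliberately simplified, hypothetical structure: real dynamic-metadata formats (Dolby Vision, HDR10+) carry many more fields, and `FrameMetadata` and `tone_map_factor` are illustrative names of our own, not part of any specification:

```python
from dataclasses import dataclass

@dataclass
class FrameMetadata:
    """Hypothetical per-frame brightness statistics carried with the video."""
    max_luminance_nits: float  # brightest point in this frame
    min_luminance_nits: float  # darkest point in this frame
    avg_luminance_nits: float  # frame-average light level

def tone_map_factor(meta: FrameMetadata, display_peak_nits: float) -> float:
    """Scale factor for fitting this frame's brightness into the display's range."""
    return min(1.0, display_peak_nits / meta.max_luminance_nits)

# A frame mastered at 4000 nits, shown on a 1000-nit TV:
frame = FrameMetadata(max_luminance_nits=4000,
                      min_luminance_nits=0.01,
                      avg_luminance_nits=120)
print(tone_map_factor(frame, display_peak_nits=1000))  # 0.25
```

The point of dynamic metadata is that this adjustment can differ from frame to frame: a dark scene and a bright scene each get their own mapping instead of one compromise setting for the whole film.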
Of course, creating such video content is very expensive, and broadcasting it requires a communication channel with very high bandwidth.
The newer HDR10+ standard also transfers dynamic metadata, reducing the gap in color accuracy compared to Dolby Vision.
Matrix
Today, the AV industry uses OLED and LED LCD displays. For more than a decade, experts have debated their advantages and disadvantages in "OLED vs LED LCD" mode. As is known, OLED matrices use self-emitting diodes without an additional LED backlight. As a result, OLED TVs have minimal thickness, excellent viewing angles, and a nearly zero black level. However, they are expensive and limited in peak brightness. LED LCDs are much cheaper and provide high peak brightness, while FALD backlighting effectively reduces the black level through local dimming of individual screen segments. Since almost all modern TVs provide high image quality thanks to additional efficient technologies, thickness and price significantly affect the final consumer choice.
On the other hand, LG has advanced significantly in OLED technology, introducing several relatively inexpensive 4K OLED models in 2018, including the LG OLED B8 PLA. This trend can significantly reduce the price difference.
Thus, both OLED and LED LCD displays provide high contrast, but the OLED matrix achieves it through a minimal black level, while the LED LCD screen achieves it through high peak brightness.
The UHD Alliance elegantly accommodated both technologies with two alternative criteria, either of which qualifies a TV for UHD Premium certification:
– peak brightness above 1000 nits and black level below 0.05 nits;
– peak brightness above 540 nits and black level below 0.0005 nits.
Thus, LED LCD TVs can qualify for Ultra HD Premium status under the first criterion, and OLED models under the second.
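The two certification criteria reduce to a simple either/or check. A minimal sketch, using the thresholds listed above (the function name is ours, and the sample numbers are illustrative, not measurements of real models):

```python
# UHD Premium certification accepts either of two brightness/black-level
# criteria, per the UHD Alliance thresholds described above.
def qualifies_uhd_premium(peak_nits: float, black_nits: float) -> bool:
    lcd_path = peak_nits > 1000 and black_nits < 0.05      # typical LED LCD route
    oled_path = peak_nits > 540 and black_nits < 0.0005    # typical OLED route
    return lcd_path or oled_path

print(qualifies_uhd_premium(1100, 0.04))    # True  (bright FALD LED LCD)
print(qualifies_uhd_premium(700, 0.0001))   # True  (dimmer OLED, deep black)
print(qualifies_uhd_premium(400, 0.4))      # False (budget SDR-class panel)
```

Note that both routes demand a high contrast ratio; they simply let a display reach it from opposite ends, either by raising peak brightness or by lowering the black level.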
Conclusion
In general, modern LED LCD TVs provide high peak brightness with a traditional black level, while OLED TVs reproduce a less bright picture with deeper blacks. The difference in perception mainly depends on the content and the viewer's personal preferences.
The technology roadmap for UHD will help you navigate the current situation in this segment.
Today, HDR support is one of the important criteria when choosing a TV.
This video compares SDR and 4K HDR (HLG) image quality.