HDMI 2.1a is on its way. Here’s what you need to know

Most of the A/V industry — and most of the folks who buy A/V products — are still coming to grips with how the various features of HDMI 2.1 affect TVs, soundbars, and a whole slew of other devices. But you can’t halt progress, and the next version of the HDMI standard — known as HDMI 2.1a — has already started its slow roll toward certification and eventual adoption. What does this new specification include, how will it affect you, and what will you need if you want to take advantage of it? We’ve got everything you need to know.

A small upgrade for a small group of people

The first thing you should know about HDMI 2.1a is that you probably don’t need it. As a minor update to the HDMI 2.1 specification, it only introduces one new feature: Source-Based Tone Mapping (SBTM).

We’ll get into what SBTM is and how it works in a moment, but here’s the TL;DR: Unless you are a gamer looking for the ultimate in HDR performance, or a PC user who routinely works with a combination of SDR and HDR content, you can safely ignore HDMI 2.1a for a long time to come. You don’t need to buy new equipment and you don’t need to worry about compatibility with the new version.

What is Source-Based Tone Mapping (SBTM)?

Before we get into source-based tone mapping, let's quickly describe tone mapping itself. Tone mapping is a process that takes place inside a TV (and a small number of computer monitors). That process examines the incoming video signal and determines how it should be presented within the limits of the display's capabilities. The display is mainly interested in hue, saturation, and luminance — the three main building blocks of color, contrast, and brightness.

Quite often, the source material will possess values for hue, saturation, and luminance that exceed what the TV can display — this happens most often with high dynamic range (HDR) content.

So the display has to make some decisions about how to translate those “out of bounds” values into something it can reproduce. It’s rarely perfect — compromises must be made — but when it’s well-executed, you get to see a version of the movie or TV show that looks as close to the original material as possible. That’s tone mapping in a nutshell.
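
To make that idea concrete, here's a toy tone-mapping curve in Python. It uses the well-known extended Reinhard operator as a stand-in, and it only handles luminance; a real TV uses proprietary, scene-adaptive processing that also remaps hue and saturation. The nit values are illustrative assumptions, not figures from the HDMI spec.

    def tone_map(luminance_nits, content_peak_nits=4000.0, display_peak_nits=800.0):
        # Compress an "out of bounds" HDR luminance value (in nits) into the
        # display's range using the extended Reinhard curve. Shadows stay
        # nearly linear; highlights are squeezed so the content's peak lands
        # exactly on the panel's peak.
        l = luminance_nits / display_peak_nits
        l_white = content_peak_nits / display_peak_nits
        mapped = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
        return mapped * display_peak_nits

    # A 1,000-nit highlight on an 800-nit panel comes out around 470 nits
    # instead of being clipped flat at 800.
    print(round(tone_map(1000.0)))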

Right now, a source device like a streaming media player, Blu-ray player, or a gaming console is able to determine if a connected display can handle HDR signals or not, but it makes no other adjustments to the signal it sends.

Source-based tone mapping changes this by letting a display communicate its hue, saturation, and luminance capabilities back to the source device, which in turn lets the source device do the tone mapping before the signal ever reaches the connected display.
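
In rough terms, the change looks like the sketch below. Every name and value here is invented for illustration; the real capability exchange happens at the HDMI signaling level, and an actual source would apply a proper tone-mapping curve rather than the naive clip used here for brevity.

    # Invented names throughout; SBTM's real handshake rides on HDMI
    # signaling, not a software API like this one.
    DISPLAY_CAPS = {"peak_nits": 800, "min_nits": 0.01, "gamut": "DCI-P3"}

    def legacy_pipeline(frame_nits):
        # Pre-SBTM: the source only asks "can you do HDR at all?" and then
        # sends the untouched signal. The TV tone maps it blind.
        return {"signal": frame_nits, "tone_mapped_by": "display"}

    def sbtm_pipeline(frame_nits):
        # With SBTM: the display reports its limits upstream, and the source
        # fits the signal to them before transmission.
        peak = DISPLAY_CAPS["peak_nits"]
        return {"signal": [min(v, peak) for v in frame_nits],
                "tone_mapped_by": "source"}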

Why would you want a source device to do tone mapping?

Under normal viewing — say when watching a movie or a TV show — the TV is perfectly capable of evaluating the incoming signal and doing the necessary tone mapping because this content neatly fits into one of two boxes: standard dynamic range (SDR) or HDR. Either way, the TV knows how to perform the tone mapping.

But there are some situations where the TV is asked to display two kinds of content at once. A classic example is the home screen of a streaming app, like Netflix, Amazon Prime Video, or Disney+. While the graphics (buttons, menus, etc.) might be presented in SDR, there could be video thumbnails or other elements on the screen that are in HDR.

That can create a conundrum for the TV. If all of the on-screen elements are initially SDR, the display will choose a corresponding set of hue, saturation, and luminance values to make them look as good as possible. But if, as you scroll through your options, an HDR element shows up — perhaps as an animated thumbnail for a show — the display is forced to adjust these values again as it tries to preserve the look of the HDR components. As it does this, the SDR elements can become compromised and end up looking dull and lifeless.

The same can happen in reverse.

The thing is, the display doesn't actually know that it's dealing with little HDR windows inside of an SDR interface. It lacks the contextual awareness that would let it perform its tone mapping in a way that's optimized for both kinds of content.

In theory, the source device — in this case the streaming media player — understands why there's this mix of SDR and HDR, and it even understands which of these elements should take precedence when the tone mapping is performed. It can also avoid the mistake of constantly changing the tone mapping as the mix of HDR and SDR elements changes over time. Armed with knowledge of a display's capabilities (via SBTM), it can ensure that the image the display shows you has the right balance for all of the on-screen elements.
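
As a hedged illustration of that advantage, the sketch below tags each on-screen element as SDR or HDR and compresses only the out-of-range HDR one, which is exactly the per-element context a TV lacks. The element list and nit values are made up for this example; 203 nits is a commonly used reference level for SDR graphics inside an HDR composition.

    # Invented example: a streaming-app home screen with mixed elements.
    # Only the source knows which elements are SDR and which are HDR, so it
    # can leave the SDR graphics alone and compress just the HDR thumbnail.
    DISPLAY_PEAK_NITS = 800  # reported by the display via SBTM

    screen = [
        {"name": "menu button",   "type": "SDR", "peak_nits": 203},
        {"name": "poster art",    "type": "SDR", "peak_nits": 203},
        {"name": "HDR thumbnail", "type": "HDR", "peak_nits": 1500},
    ]

    for element in screen:
        if element["type"] == "HDR" and element["peak_nits"] > DISPLAY_PEAK_NITS:
            element["peak_nits"] = DISPLAY_PEAK_NITS  # compress only this one
        print(element["name"], "->", element["peak_nits"], "nits")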

Do I really want an external device doing tone mapping?

As we explained above, under ideal circumstances, a source device should be able to perform better tone mapping than a display, because it understands the visual context and it understands a display’s capabilities.

But that doesn’t necessarily mean your source device will actually do a better job at tone mapping than your display. Tone mapping can be a computationally demanding process and given that different source devices possess different levels of computing horsepower, it’s not unreasonable to think that some of these devices will do a better job than others. An Apple TV 4K (2021) has way more power than a Roku Streaming Stick 4K, for example, and a PlayStation 5 has way more power than an Apple TV 4K.

By the same token, some TVs have extraordinary image processing capabilities (flagship models from LG, Sony, and Samsung are all superb in this regard). So your decision about whether or not SBTM is something you actually want may depend on the equipment you own and that equipment’s strengths and weaknesses.

The rise of HDR computing

SBTM may also help when it comes to mixed dynamic range content on computers. If you connect your computer to an HDR-capable display via HDMI, and your computer's graphics card can handle it, you can see HDR content. Just like in the Netflix example above, there could be times when you'll have HDR content in one window and SDR content in another, along with all of the computer's user interface elements, which are graphics generated by the operating system.

In such a complex environment, an HDR display will once again need to make some decisions about tone mapping, and as in the Netflix situation, those decisions could negatively affect the viewing experience. SBTM gives the computer the ability to override the display's tone mapping while making sure that you get the full benefit of your HDR display's capabilities.

Unfortunately, much of this is still only theoretical. The vast majority of computing is still done in SDR. Laptops that have built-in HDR-capable screens won’t need SBTM, and if your external monitor isn’t HDR-capable, there won’t be a lot of benefit to SBTM. Perhaps most importantly, your computer and display will both need to support HDMI 2.1a (and be connected via HDMI) to realize these benefits — and there’s no indication that the companies that make these devices are about to jump on board.

Gamers are the real winners

Netflix may not agree, but I don’t think too many folks will be fussed if the auto-play videos that show up while they’re browsing for something to watch don’t look their best.

Gamers, on the other hand, have a legitimate reason to be concerned about how their displays perform tone mapping, because when it's done poorly, it can have a detrimental effect on their gaming experience.

In some games, a lack of shadow detail can cause you to miss an on-screen opponent lurking in a corner. In a racing game, a lack of detail in a bright region of the screen might camouflage an upcoming bend in the road or the presence of another car. Both of these examples can and do happen when a display has to figure out which parts of an HDR image matter most.

In fact, game developers spend a lot of time determining the appropriate tone mapping for each scene to ensure that brightness levels enhance their intent instead of detracting from it. And when a TV’s built-in tone mapping decides to alter the presentation based on its own understanding of a scene, it can undo a developer’s work.

It’s one thing when this happens in an HDR game that you’re playing in standalone mode, but it’s a very different issue when playing against someone else. If your rival’s TV happens to do a better job of tone mapping than yours, it could give them a competitive edge.

This problem — of accurately and consistently tone mapping HDR games on a wide variety of TVs — was considered a big enough deal that in 2018, Microsoft (Xbox), Sony (PlayStation), LG, and Vizio formed the HDR Gaming Interest Group (HGIG) in an attempt to give game developers a standardized way to do source-based tone mapping. By sending display characteristics back to the console or PC via HDMI, a game could be tone mapped consistently for all TVs and monitors.

HGIG eventually counted dozens of key companies among its members, including Activision, Ubisoft, and Google (just to name a few). But adoption of the HGIG’s recommendations has been spotty and, as of today, there’s still no industry-wide set of standards for companies to follow.

Things look somewhat bleak for the HGIG initiative, but it may not matter. If console, PC, and TV makers adopt HDMI 2.1a (which is far from a guarantee right now), SBTM will check most of the boxes that HGIG had envisioned.

Wait, what if I watch Netflix (or some other streaming service) using an app on my smart TV?

Good question, you’re obviously getting the hang of this whole SBTM thing. In the event that you only use the built-in apps on your smart TV, source-based tone mapping doesn’t affect you at all.

Native streaming apps are developed by third parties specifically for a given TV and operating system (e.g., a 2022 LG OLED TV running webOS), so the tone mapping process is already as optimized as it can get. In fact, if all of the content we viewed on our TVs came from native apps, we wouldn't need SBTM at all.

Is HDMI 2.1a available now?

No. As of February 2022, the specification is still making its way through the certification process and no manufacturers have declared support for SBTM. The HDMI Forum, which develops new HDMI standards, plans to release HDMI 2.1a by the end of March 2022. In theory, any products that are firmware-upgradeable to the new specification could see those updates arrive any time after that date, though the necessary testing could take a while. We probably won't see new products touting SBTM as a feature until 2023.

What will I need for HDMI 2.1a?

Whether you want to take advantage of SBTM for a better media or gaming experience (or both), all of your devices will need to be upgraded to HDMI 2.1a. This includes your gaming console, streaming media device or other external sources, your TV or monitor, and any devices like A/V receivers or soundbars that may sit in between your TV and your source in the HDMI chain.

Does that mean I have to buy new products?

Not necessarily. It's expected that most source devices and displays sold in the last few years will be firmware-upgradeable to support HDMI 2.1a, but we have no way of knowing exactly which companies and which products will do so. There's also no guarantee that just because your TV is physically capable of working with HDMI 2.1a, your manufacturer will choose to issue an update.

It’s also highly unlikely that you’ll need to replace your existing HDMI cables. SBTM uses Extended Display Identification Data (EDID) signaling to communicate a display’s capabilities to a connected source device, and EDID has been part of the HDMI specification since version 1.0.

How will I know if a new product supports HDMI 2.1a?

The HDMI Licensing Administrator, which oversees the way that HDMI-capable products are marketed, has a pretty clear rule that products like TVs and set-top boxes aren't allowed to simply say that they support HDMI 2.1. Instead, they're obliged to list the specific HDMI 2.1 features they support, e.g., HDMI eARC, variable refresh rate (VRR), 4K @ 120Hz, etc. The same will be true for HDMI 2.1a — you'll need to look for some kind of mention of source-based tone mapping in the product description or specifications.