The Sad, Misleading, and Embarrassing State of HDR in PC Gaming

HDR is ridiculously bad for PC gaming in most cases, and we all know that’s true.

You might not realize it, though, if you only looked at how gaming monitors are advertised. After all, on paper, HDR is technically supported by your monitor, your games, and your graphics card. Heck, even Windows supports relatively bug-free HDR these days.

So who is to blame then? Well, when I dug deep to find the answer, I found three main culprits that explain our current situation. And even with some light at the end of the tunnel, this multi-faceted problem isn’t about to go away on its own.

The game problem


I should start by explaining why HDR is a problem specific to PC games. It’s a wildly variable experience depending on what screen you have and what game you’re playing, which makes this whole HDR mess even more confusing on PC. The main reason for this is static metadata.

There are three main HDR standards: HDR10, HDR10+, and Dolby Vision. The latter two support dynamic metadata, which means they can adjust how content is displayed based on what the monitor is capable of and what scene (even what individual frame) is currently on screen. HDR10, on the other hand, only carries static metadata that is set once for the entire piece of content.
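To make the difference concrete, here is a minimal sketch (in Python) of the kind of information each approach carries. The field names loosely mirror real HDR10 static metadata such as MaxCLL and MaxFALL, but the structures and numbers are illustrative assumptions rather than any actual API or spec.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticMetadata:
    """HDR10-style metadata: one set of values for the entire piece of content."""
    max_cll: int         # Maximum Content Light Level (nits) across the whole game
    max_fall: int        # Maximum Frame-Average Light Level (nits)
    mastering_peak: int  # Peak brightness of the mastering display (nits)

@dataclass
class SceneMetadata:
    """Dynamic-style (HDR10+/Dolby Vision) metadata: tone-mapping hints per scene."""
    scene_peak: int      # Brightest highlight in this particular scene (nits)
    scene_average: int   # Average light level of this scene (nits)

# Static metadata is written once and never changes, so a dim 400-nit monitor
# has to guess how to squeeze a 4,000-nit master into its own range for every scene.
hdr10 = StaticMetadata(max_cll=4000, max_fall=400, mastering_peak=4000)

# Dynamic metadata travels with the content scene by scene, so the display can
# tone-map a dark cave and a sunlit skybox differently.
dynamic: List[SceneMetadata] = [
    SceneMetadata(scene_peak=120, scene_average=30),    # dark interior
    SceneMetadata(scene_peak=3500, scene_average=250),  # bright exterior
]
```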

Dynamic metadata is a big reason why HDR on console is so much better than HDR on PC.

Only a select few monitors support Dolby Vision, like Apple’s Pro Display XDR, and none of them are gaming monitors. There are a few HDR10+ monitors, but they’re exclusively Samsung’s most expensive displays. The vast majority of monitors are stuck with static metadata. TVs and consoles, however, widely support Dolby Vision, which is a big part of why console HDR is so much better than PC HDR.

As Alexander Mejia, a former game developer and product manager for Dolby Vision Gaming, points out, static metadata creates a big problem for game developers: “There are more and more HDR televisions, monitors, and laptops on the market, but if you grab a couple from your local big box retailer, your game is going to look drastically different on each one… How do you know that the look you set in your studio will be the same one the player sees?”

HDR comparison in Devil May Cry 5.
Jacob Roach / Digital Trends

On my Samsung Odyssey G7, for example, Tiny Tina’s Wonderlands looks dark and unnatural with HDR on, but Devil May Cry 5 looks naturally dynamic. Search user reports on these two games and you’ll find everything from “the best HDR game ever” to “downright terrible image quality.”

It doesn’t help that HDR is usually an afterthought for game developers. Mejia writes that developers “should always provide a standard dynamic range version of your game – and making a separate version for HDR means twice the mastering, testing, and quality assurance. Good luck getting approval for that.”

There are many examples of developer apathy toward HDR. The recently released Elden Ring, for example, shows terrible flickering in complex scenes with HDR and motion blur enabled. Disable HDR and the problem goes away (even with motion blur still on). And in Destiny 2, HDR calibration was broken for four years: HDTVTest found that the calibration slider didn’t map brightness correctly back in 2018, and the issue wasn’t fixed until February 2022 with the release of The Witch Queen expansion.

Games are one source of HDR problems on PC, but they’re a downstream problem: one that stems from a gaming monitor market that seems frozen in time.

The monitor problem

The Alienware QD-OLED monitor in front of a window.

Even with the many Windows bugs that HDR has caused over the past few years, monitors are the main source of HDR issues. Anyone familiar with display technology can list the issues without a second thought, and that’s the point: after years of HDR monitors flooding the market, screens are mostly in the same place as they were when HDR first landed on Windows.

Conventional wisdom says that good HDR requires at least 1,000 nits of peak brightness, which is only partially true. Brighter screens help, but only because they can produce higher levels of contrast. For example, the Samsung Odyssey Neo G9 is capable of twice the brightness of the Alienware 34 QD-OLED, but the Alienware display offers much better HDR thanks to its vastly higher contrast ratio.
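The arithmetic behind that comparison is simple: contrast is peak brightness divided by black level, so a dimmer panel with near-perfect blacks can still come out far ahead. Here’s a quick sketch in Python; the luminance figures are rough, illustrative assumptions rather than measured specs for either monitor.

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Contrast is simply peak luminance divided by black level."""
    return peak_nits / black_nits

# Rough, illustrative figures only (not measured specs for either monitor).
# A bright LCD with local dimming still leaks some light into black areas;
# an OLED's blacks are effectively zero, so a tiny value is used here just
# to keep the division finite.
bright_lcd = contrast_ratio(peak_nits=2000, black_nits=0.2)    # roughly 10,000:1
qd_oled = contrast_ratio(peak_nits=1000, black_nits=0.005)     # roughly 200,000:1

print(f"Bright LCD: {bright_lcd:,.0f}:1")
print(f"QD-OLED:    {qd_oled:,.0f}:1")
# Half the peak brightness, yet vastly more contrast, which is what
# actually makes HDR highlights pop against dark scenes.
```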

There are three things a display needs to get good HDR performance:

  1. High contrast ratio (10,000:1 or better)
  2. Dynamic HDR metadata
  3. Extended color gamut (above 100% sRGB)

TVs like the LG C2 OLED are so desirable for console gaming because OLED panels offer massive contrast (1,000,000:1 or more). Most LED monitors top out at 3000:1, which isn’t good enough for solid HDR. Instead, monitors use local dimming — independently controlling light on certain sections of the screen — to boost contrast.
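As a rough illustration of why the zone count matters, here’s a toy model of local dimming in Python: split the frame into a grid of zones and drive each zone’s backlight from the brightest pixel inside it. This is a simplified sketch of the concept, not any manufacturer’s actual algorithm, and the grid sizes are only illustrative. With just a handful of zones, a single bright highlight forces a large chunk of the screen to light up, washing out nearby shadows.

```python
import numpy as np

def zone_backlight(frame: np.ndarray, zones_x: int, zones_y: int) -> np.ndarray:
    """Toy local dimming: each zone's backlight is set by the brightest pixel it contains.

    frame: 2D array of pixel luminance (0.0 = black, 1.0 = full white).
    Returns a (zones_y, zones_x) array of per-zone backlight levels.
    """
    h, w = frame.shape
    levels = np.zeros((zones_y, zones_x))
    for zy in range(zones_y):
        for zx in range(zones_x):
            zone = frame[zy * h // zones_y:(zy + 1) * h // zones_y,
                         zx * w // zones_x:(zx + 1) * w // zones_x]
            levels[zy, zx] = zone.max()
    return levels

# A mostly black 1080p frame with one small, bright highlight near the top left.
frame = np.zeros((1080, 1920))
frame[50:60, 50:60] = 1.0

coarse = zone_backlight(frame, zones_x=4, zones_y=2)   # 8 zones, like the Odyssey G7
fine = zone_backlight(frame, zones_x=48, zones_y=24)   # 1,152 zones (illustrative grid)

print((coarse > 0).sum(), "of", coarse.size, "zones lit")  # 1 of 8: an eighth of the screen glows
print((fine > 0).sum(), "of", fine.size, "zones lit")      # 1 of 1152: only a tiny patch glows
```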

A colorful image on the LG C2 OLED screen.
Dan Baker/Digital Trends

Even high-end gaming monitors (above $800) don’t come with enough zones, though. The LG 27GP950-B only has 16, while the Samsung Odyssey G7 has an embarrassing eight. For a really high contrast ratio you need a lot more zones, like the Asus ROG Swift PG32UQX with over 1,000 local dimming zones – a monitor that costs more than building a new computer.

The vast majority of HDR monitors don’t even meet the bare minimum. On Newegg, for example, 502 of the 671 HDR gaming monitors currently available only carry VESA’s DisplayHDR 400 certification, which doesn’t require local dimming, an extended color gamut, or dynamic metadata.

An example of local dimming on a Vizio TV.

Paying extra for a premium experience is nothing new, but HDR has been stuck in that spot for four years now. Instead of premium features becoming mainstream, the market has been flooded with monitors that can advertise HDR without offering any of the features that make HDR work in the first place. And the sub-$1,000 monitors that do tick those boxes typically cut corners to get there, with too few local dimming zones and shoddy color coverage.

There are exceptions, like the Asus ROG Swift PG27UQ, which provide an excellent HDR gaming experience. But the fact remains that the vast majority of monitors available today aren’t too different from monitors available four years ago, at least in terms of HDR.

The light at the end of the tunnel

The ultra-wide curved QD-OLED monitor.

The HDR experience on PC has been mostly static for four years, but that’s changing thanks to a fancy new display technology: QD-OLED. As the Alienware 34 QD-OLED shows, it’s the panel technology that will truly drive HDR forward in PC games. And, in good news for gamers, you won’t have to spend more than $2,500 to get it.

MSI just announced its first QD-OLED monitor with identical specs to Alienware, and I suspect it will use the exact same panel. If so, we should see a wave of 21:9 QD-OLED monitors by early next year.

We’re also seeing more OLED monitors, like the recently announced 48-inch LG 48GQ900. These are TVs marketed as gaming monitors, sure, but display makers are clearly in tune with gamer demand for OLED panels. Hopefully we’ll see some that are the size of a proper monitor.

There are other display technologies that offer better HDR performance, such as mini LED. But QD-OLED is the seismic change that will hopefully finally make HDR a reality for PC gaming.
