In 2016 HDR technology started appearing in our television sets, even if we didn’t quite understand what it was or why we needed it. My old telly worked just fine before HDR came along! In 2017 more and more PC monitors are starting to include the technology, and it will no doubt become equally ubiquitous. So now’s a good time to get your head round this new acronym and see if it’s worth the hype.
What does HDR mean?
HDR stands for high dynamic range, which Wikipedia describes as a technique used in imaging to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging techniques. In other words, you see a greater difference between the bright parts of an image and the dark parts. The goal is to create more realistic-looking images in games and movies by allowing greater levels of detail.
So, for example, on an SDR (standard dynamic range) screen, the detail in darker scenes can be lost as subtle grey tones fade into black. Similarly, brighter parts of the image may be lost in a white background. HDR aims to solve this problem: bright parts of the image can be really bright, dark parts can be really dark, and the detail is visible in both.
The overall effect is a more dynamic image that seems more “real” to us, the viewers.
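If you like to think in numbers, here’s a minimal, purely illustrative sketch of that highlight clipping in Python. The nit figures are assumptions picked for illustration (roughly 100 nits for a typical SDR monitor versus 1,000 nits for an HDR one), not measurements of any real display.

```python
# Purely illustrative: how a wider luminance range preserves highlight detail.
# The nit values below are assumptions for illustration, not measurements.

scene_nits = [0.02, 0.05, 0.5, 5, 80, 150, 400, 900]  # hypothetical scene luminances

SDR_PEAK = 100    # assumed peak brightness of a typical SDR monitor (nits)
HDR_PEAK = 1000   # assumed peak brightness of an HDR monitor (nits)

def clip_to_display(values, peak):
    """Clamp scene luminance to what the display can actually show."""
    return [min(v, peak) for v in values]

print("SDR:", clip_to_display(scene_nits, SDR_PEAK))
print("HDR:", clip_to_display(scene_nits, HDR_PEAK))

# On the SDR display, 150, 400 and 900 nits all collapse to 100 -> the
# highlight detail is gone. On the HDR display they stay distinct.
```

The same logic applies at the dark end of the scale: the more range the display has, the less detail gets crushed into a single shade.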
HDR and PC desktop monitors
It’s early days for HDR in the PC monitor market. Delivering the content is not so much of a problem, as that can be handled by existing graphics cards and driver updates. And whilst a standard for a true HDR experience has emerged in the television market, it looks like, as is often the case, AMD and Nvidia will have their own approaches under their FreeSync and G-Sync brands.
If you are looking at adopting HDR then you’ll have to check carefully to see what is on offer, and we’ll be sure to take a look at HDR in our upcoming monitor overviews.