Does anyone who owns this monitor know the vertical height of the stand? I’m looking into picking one up but (due to my desk’s ‘upper level’) I need the stand to fit under ~14.5". Cheers in advance.
I wouldn’t recommend the 100 brightness setting anyway. I’ve observed that it can lose color detail. For example, a character was wearing a coat with fur around the neck. At the default 90 setting, I could see the edges of the fur gradually turning slightly yellow-ish. At 100 brightness, that gradient was gone; it was just white.

I would describe the 100 brightness mode on this monitor as a gimmick, just so Asus can claim “1300” nits on the specs. It’s basically the analog of the “1ms” response time on LCD monitors: adding an overdrive setting that degrades image quality, but lets them put “1ms” on the box :P
Agreed. It’s the main reason I returned this thing. I haven’t heard the same report on the PG27AQDP or the LG variants, though. Glad Rtings mentions it now (no other reviewer does, even though it’s obvious).
But if I understand correctly, with the brightness set to 100, it’s actually over-brightening the small highlights?
Just read it. Doesn’t that mean the monitor has bad EOTF tracking? The game asks the display to show a specific value in nits, say 750 nits, not just “whatever maximum brightness the monitor can display”. If the monitor displays it brighter than that, it’s technically inaccurate. Isn’t that why most monitors clip at their actual maximum brightness?
The game might want to output 10000 nits if it was developed on a 10k-nit HDR reference display. Obviously we don’t want that sent to this panel as-is, because everything above what the monitor can show would clip and we’d lose all that HDR detail. The game needs to find out what the clipping point is, so that it knows how to map its internal range of 0-10000 nits to 0-750. If the game would normally try to output 10k, it maps that to 750 instead. If it tries to output 8000, it maps that to something like 670.
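To put rough numbers on that, here’s a toy sketch. The knee placement is made up and real games use fancier roll-off curves, but it reproduces the figures above:

```python
def map_to_display(nits, clip_point=750.0, src_peak=10000.0):
    """Remap the game's internal 0-10000 nit range into 0-clip_point."""
    knee = 0.5 * clip_point  # below ~375 nits, pass values through untouched
    if nits <= knee:
        return nits
    # Linearly compress the remaining [knee, src_peak] range into [knee, clip_point]
    t = (nits - knee) / (src_peak - knee)
    return knee + t * (clip_point - knee)

print(map_to_display(10000))  # 750.0  (the brightest the game ever asks for)
print(map_to_display(8000))   # ~672   ("something like 670")
```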
It’s not about accuracy. It’s about converting the dynamic range the game was developed in to the dynamic range of the display you’re using. There are different ways to do that. For games, letting the game itself do this conversion is best, which is accomplished by enabling an “HGIG” mode on the display (presumably, that’s the “console” mode on this monitor). Movies can do this mapping differently, with HDR metadata programmed by the video player; I suppose that’s what the “cinema” mode is for. Lastly, you can have the display itself do the mapping, which is the “gaming” mode.
If you let the display do the conversion (“tone mapping”), then in order to actually reach maximum brightness you’d need an input that’s quite high (like 2000 nits). I don’t know what the required input brightness is in “gaming” mode, but in HGIG mode (“console”), you only need an input that matches the clipping point to get the highest output brightness. And that’s why you run the calibration to find that clipping point: it’s the only thing the game needs to know.
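Here’s a sketch of the difference. The HGIG behavior (hard clip, no tone mapping) is the standard one; the “gaming” mode curve and its constant are just invented for illustration:

```python
import math

def hgig_mode(nits_in, clip_point=750.0):
    # HGIG / "console": the display applies no tone mapping, just a hard clip.
    # The game already compressed its range, so an input equal to the
    # clipping point comes out at full peak brightness.
    return min(nits_in, clip_point)

def display_tonemap_mode(nits_in, peak=750.0, tau=435.0):
    # "Gaming" mode sketch: the display applies its own roll-off, so the
    # output only approaches peak brightness asymptotically.
    return peak * (1.0 - math.exp(-nits_in / tau))

print(hgig_mode(750))              # 750.0, full brightness right at the clip point
print(display_tonemap_mode(750))   # ~616, the same input falls short of peak
print(display_tonemap_mode(2000))  # ~742, needs ~2000 nits in to get close
```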
Monitors Unboxed showed that at a 2% window this monitor does indeed overbrighten highlights. I suspect that’s by design, due to the ~10% window size used by the Windows calibration and by most HDR apps with their own calibration slider.
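If that’s what’s happening, the mismatch is easy to picture. The 750 and 1300 figures are the ones from this thread; the scaling behavior is just my guess at what the firmware does:

```python
calibrated_clip = 750.0  # clip point found at the ~10% calibration window
peak_2pct = 1300.0       # what the panel can push on a 2% window

# A game calibrated via the slider maps its brightest highlights to 750 nits.
# If the monitor then stretches small-window highlights toward its 2%-window
# peak instead of honoring that clip point, a 750-nit request lands way above:
requested = 750.0
displayed = requested * (peak_2pct / calibrated_clip)
print(f"requested {requested:.0f} nits, displayed ~{displayed:.0f} nits")  # ~1300
```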
I’m not sure if Rtings used the HDR calibration app or just the default HDR metadata (800 MaxCLL, 260 MaxFALL), but they tested the EOTF tracking at 100 brightness and it looks pretty good. 90 brightness is actually the default setting on this monitor, which is indeed 750 nits at a 2% APL. I feel like that could have something to do with why the white clipping point is 750 nits even at 100 brightness.
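For anyone wondering what “EOTF tracking” actually measures: the HDR signal encodes absolute brightness via the PQ curve (SMPTE ST 2084), and good tracking means the measured output follows that curve up to the clipping point and then flattens. The PQ math itself is standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Convert a normalized PQ signal (0..1) to the luminance it requests, in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# On this panel, good tracking should follow these targets until ~750 nits, then clip:
for s in (0.50, 0.60, 0.70, 0.75):
    print(f"PQ {s:.2f} -> {pq_to_nits(s):6.1f} nits")
# PQ 0.50 ->   92.2 nits
# PQ 0.60 ->  244.0 nits
# PQ 0.70 ->  620.8 nits
# PQ 0.75 ->  983.4 nits  (above the 750-nit clip point)
```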