Fairly off topic, but I’m at a crossroads between convenience and improved video quality with the C1. The below applies to both ARC and a direct AVR connection, so I’d like your opinions:
Apple and Netflix are doing something that messes with the audio stream.
I have a Fiio BTR3K connected to USB1.
With Dolby Audio Processing ON, I get audio from HDMI or USB, whichever output I choose.
But ONLY Netflix and Apple will play Atmos; everything else falls back to 5.1. This applies to both the headphones and the AVR.
Conversely, if I turn Dolby Audio Processing OFF, which is how it should be set, I get Atmos from every other app except Netflix and Apple (non-Atmos content from those two is fine).
If I play Atmos from either of them, the Shield goes coo-coo with the headphones/DAC: there’s no volume, and if I try to adjust it, the Shield tells me to use the original remote (there is none) and the internal volume control stops working for that content.
It seems to me Netflix and Apple are packaging the Atmos stream in some non-standard way that the Shield doesn’t like, but I don’t know how to verify that (a rough way to check is sketched after this post).
Tested both directly through the AVR (Onkyo NR6100; I’m sure it’s the same on my Denon, since it’s not an AVR issue) and via eARC: Shield -> LG C1 -> AVR.
If you’re wondering why I need the BTR3K USB DAC/amp: I don’t want to constantly (every night) plug and unplug my cans from the AVR, partly for convenience and partly for fear of wearing out the socket.
And no, I can’t use Zone 2/B, since I have neither the money nor the room for yet another receiver.
This is my bedroom setup, so it’s not amazing by any stretch. If nobody knows what’s going on, I have two options, and I’d like your opinions on them:
1) Keep using the Shield Pro and set up a macro to toggle Dolby Audio Processing on/off each time I switch between streaming services.
2) Accept that I need to plug and unplug the headphones, and either:
A) Keep using the outdated Shield and retain passthrough (though I don’t know how much that matters just for streaming services), or
B) Get a 2024 Onn Pro (yes, it’s inferior hardware-wise), settle for Atmos over DD+ (which I think all streaming services use anyway), and reap the benefits of what is IMO a snappier interface, plus AV1 video support.
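On the verification question above: one rough way to see what stream the Shield is actually outputting is to enable debugging on it and inspect the Android audio services over adb while each app plays Atmos content. This is only a sketch under assumptions: `adb shell dumpsys audio` and `dumpsys media.audio_flinger` are standard Android dumps, but exactly which format strings a given build prints (e.g. whether Atmos over DD+ shows up as E_AC3_JOC) varies, so the keywords below are guesses to be adjusted against whatever your dump actually contains.

```python
# Rough sketch: dump the Shield's audio services over adb while content is playing
# and filter for Dolby-related format strings. Assumes adb is installed, debugging
# is enabled on the Shield, and that these dumpsys services expose the active
# output format on this Android TV build (not guaranteed on every version).
import subprocess

KEYWORDS = ("AC3", "EAC3", "E_AC3", "JOC", "TRUEHD", "ATMOS", "DOLBY", "PCM")  # assumed markers

def dump(service: str) -> str:
    """Return dumpsys output for one service, or '' if the call fails."""
    try:
        result = subprocess.run(
            ["adb", "shell", "dumpsys", service],
            capture_output=True, text=True, timeout=30,
        )
        return result.stdout
    except (OSError, subprocess.TimeoutExpired):
        return ""

for service in ("audio", "media.audio_flinger"):
    print(f"=== dumpsys {service} ===")
    for line in dump(service).splitlines():
        if any(key in line.upper() for key in KEYWORDS):
            print(line.strip())
```

If the Netflix/Apple sessions report a different output format than the other apps (for example PCM where everything else shows a Dolby bitstream), that would at least support the theory that the Shield is decoding or re-packing those two rather than passing them through.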
Update: We uploaded the latest brightness measurements and uniformity photos for the Accelerated Longevity Test.
I’m curious whether your test unit is suffering from an absurd number of dead pixels and dead spots (around the border). This seems to be a huge problem on this model; my 55" unit with 10,000 hrs on it has probably hundreds of dead pixels and a few dead spots that are noticeable from viewing distance. It also doesn’t appear to be limited to this model and occurs on Sony TVs as well.
If I had known this was possible, I might have explored other TVs, and perhaps your OLED reviews should bring some attention to this problem. It would be great if LG could be persuaded to fix it, since it’s a manufacturing defect.
Hi there,
I took a look at all the OLEDs we have in the office, including our C1, and most of the older models do have some amount of dead pixels around the edge of the display. On most of them, this is occurring around the top and side edges. This is curious, and we’re not sure why it’s happening, but it is something we’re aware of and want to keep an eye on. So far, last year’s LG and Sony OLEDs seem to be fine, and the QD-OLEDs also look fine. But the previous-gen X, 1, and 2 series LGs and the H and J series Sonys we have in the office do have dead pixels or stuck/altered subpixels.
We want to look into this more, so thank you for your input.
I was wondering if you could re-test the Reduce Input Delay modes on the latest firmware, because the refresh behavior I’m seeing now is swapped: Standard mode now shows 120Hz for a 60Hz signal and Boost shows 60Hz for a 60Hz signal, as seen in a slow-motion video recording. The gamma changes, however, are not swapped, so the more accurate blacks are still there as expected in Boost mode (which I would really like for 120Hz signals, since those force the less accurate gamma in either Reduce Input Delay mode). You might also find that input lag is lower in Standard mode now, if you confirm that the 120Hz behavior is active for Standard.
Thanks for taking the time to share your thoughts with us. While we generally do our best to look into things for users, our focus right now is providing timely reviews of the 2024 models. So, unfortunately, this isn’t something we’d be able to look into any time soon. Sorry about that!
Edit: I retested with the Time Sleuth (which only supports up to 1080p 60Hz) and nothing changed input-lag-wise: still 9/13/16 ms (top/middle/bottom) in Standard mode and 2/10/17 ms in Boost mode. Maybe the swap only happens at 4K, or my slow-motion recording is fooling me. I’ll try to figure this one out and report back later.
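For what it’s worth, the spread between those top/middle/bottom numbers is itself a rough clue about the panel’s refresh behavior, since a lag tester like the Time Sleuth sees processing delay plus however long the scanout takes to reach each measurement point. A quick back-of-the-envelope check (plain arithmetic on the numbers quoted above, not a new measurement):

```python
# Compare the measured top-to-bottom lag spread against the scanout time of a
# full panel refresh at 60Hz vs 120Hz. Measurements are the ones quoted above.
def scanout_ms(refresh_hz: float) -> float:
    """Time for one full top-to-bottom panel refresh, in milliseconds."""
    return 1000.0 / refresh_hz

measurements = {          # mode: (top, bottom) lag in ms
    "Standard (9/13/16)": (9, 16),
    "Boost (2/10/17)":    (2, 17),
}

print(f"Full-frame scanout @  60Hz: {scanout_ms(60):.1f} ms")   # ~16.7 ms
print(f"Full-frame scanout @ 120Hz: {scanout_ms(120):.1f} ms")  # ~8.3 ms

for mode, (top, bottom) in measurements.items():
    print(f"{mode}: top-to-bottom spread = {bottom - top} ms")
```

Taken at face value, Standard’s ~7 ms spread sits closer to a 120Hz scanout and Boost’s ~15 ms spread closer to a 60Hz one, which would line up with the swapped behavior seen in the slow-motion recording, though sensor placement and the TV’s processing make this only a sanity check, not proof.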
Update: We updated the text in the PS5 Compatibility section of this review after confirming 1440p works, and added text to the Xbox Series X|S Compatibility section.