Two years ago I bought the “Samsung Galaxy Buds+” after reading the highly favorable RTINGS review. When I started testing them, I couldn’t believe the poor sound quality. The sound signature was balanced, but the drum cymbals sounded compressed and very unpleasant. I found out the problem was the default Bluetooth audio codec, SBC. On Android phones (other than Samsung, since Samsung phones use a higher-quality codec with their own buds), I could manually change the codec from SBC to AAC, which substantially improved the sound quality and fixed the problem with the cymbals. On a Windows PC, however, I was stuck with SBC. That horrible compression in the high frequencies is especially noticeable with music that carries a lot of information — in my case, genres like heavy metal.
This was the most disappointed I’ve been with RTINGS, because the codec is such a determining factor in sound quality, yet it isn’t taken into account in the reviews. I wrote the following post, and two users confirmed they had the same issue: https://www.rtings.com/discussions/Nj38vkqXv7MxBUj7/
Would it be possible to add a test for wireless headphones that checks for audible compression artifacts caused by the codec? For example, recording some noise (or busy music) played back through the headphones and looking for this kind of compression in the spectrogram. This could also be useful for wireless Bluetooth speakers.
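To illustrate the idea, here is a rough sketch (not RTINGS methodology) of how such a check could work: play a known wide-band stimulus through the headphones, record it, and compare high-frequency energy against the reference. Since I don’t have a real recording here, the codec’s high-frequency loss is simulated with a low-pass filter as a stand-in; the function names and the 16 kHz threshold are my own assumptions.

```python
import numpy as np
from scipy import signal

fs = 48_000
rng = np.random.default_rng(0)
ref = rng.standard_normal(fs * 2)  # 2 s of white noise as the test stimulus

# Stand-in for a real earbud recording: simulate codec high-frequency
# loss with an 8th-order 15 kHz low-pass filter.
sos = signal.butter(8, 15_000, "low", fs=fs, output="sos")
rec = signal.sosfilt(sos, ref)

def hf_ratio_db(x, fs, f_lo=16_000):
    """Energy above f_lo relative to total energy, in dB."""
    f, _, Sxx = signal.spectrogram(x, fs, nperseg=1024)
    return 10 * np.log10(Sxx[f >= f_lo].mean() / Sxx.mean())

# A large drop between reference and recording flags missing highs —
# the kind of loss SBC causes with cymbals.
loss = hf_ratio_db(ref, fs) - hf_ratio_db(rec, fs)
print(f"High-frequency energy drop: {loss:.1f} dB")
```

A real test would of course use a measurement rig rather than a simulated filter, but the same ratio comparison would reveal codec-induced high-frequency loss in the spectrogram.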
No, it doesn’t seem to cause any noticeable input lag. I had the B7 for a long time and looked at various workarounds for dark HDR; all dynamic contrast seems to do is mess with the gamma values, nothing more than that.
The “dynamic contrast low” setting in cinema mode is something different — that was a prototype of LG’s dynamic tone mapping feature that came in later models — but it only works in cinema mode and won’t do the same in game mode.
Most games should have HDR calibration settings at this point, where you can hopefully offset the problem.
The 8 series onwards changed how they measured HDR metadata and more recent models have HGIG alongside LG’s own dynamic tone mapping feature available in game mode.
It was possible to alter the HDR data for the way the 7 series was designed, but it was complicated: it involved an HD Fury device sitting between the console and the TV to inject the metadata, and you had to adjust it for each game. I forget the details beyond that, but at the time I thought it was more hassle than it was worth.
Do you recommend using the dynamic contrast option or changing the brightness settings in-game?