
R&D Snapshot
TV 2.0 Preview

Updated: Bright room reflection handling on 12 TVs

As TV manufacturers have been gearing up over the last few months to announce and release their 2025 lineups, we've been hard at work preparing our next test methodology update for TVs. This isn't just any old test bench update; we're here today to share a sneak peek behind the curtain at one of our biggest updates ever: TV 2.0.

TV 2.0 - Main Changes

TV 2.0 represents a significant change in our TV testing. We've changed nearly every aspect of our TV reviews in some way, from test coverage to the individual objective scoring splines. What you see here is just a sample of what's coming and still a work in progress, so some things could change between now and publication. We've changed much more than what's mentioned here, but these are the main changes; stay tuned for a full changelog when we publish TV 2.0 in a few weeks.

Structure

One of the first things you'll notice when you open a review on TV 2.0 is that we've completely changed the structure. Some of our old sections, like Picture Quality, were getting very, very long and were filled with tests that not everyone cared about. To remedy this, we've restructured the reviews to show the most important tests first. Instead of 6 main test categories on TV 1.11, TV 2.0 is now broken down into 12 separate sections. These sections more or less match the usages at the top of the review.

  • Brightness
  • Black Level
  • Color
  • Processing
  • Game Mode Responsiveness
  • Motion Handling
  • Reflections
  • Panel
  • Inputs
  • Design
  • Smart Features
  • Sound Quality

This structure is far from final and will likely change between now and the publication of the final test bench update. Let us know what you think and if you have any suggestions!

SDR Color Volume

Gamut rings on the LG C4
Gamut rings on the Samsung S95D

Gamut rings are a new way of looking at color volume in SDR by splitting it into rings of increasing lightness, from 0 to 100. This makes it easier to see differences between panel types that weren't very obvious in our previous tests. As you can see above, colors on the LG C4 are noticeably desaturated at high lightness levels. The benefits of the S95D, which doesn't rely on a white subpixel to boost brightness, are immediately apparent.
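
If you're curious how a "ring" can be derived from measurements, here's a simplified sketch of the idea: take colors measured from the panel in CIELAB, bin them by lightness, and estimate how much chromatic ground each band covers. The data, binning, and convex-hull approximation below are illustrative only, not the exact gamut-rings computation used in the reviews.

```python
# Simplified illustration of the gamut-rings idea: bin colors measured from the
# panel by lightness (L*) and estimate how much chromatic area (a*, b*) is
# covered inside each ring. Rough sketch with hypothetical inputs only.
import numpy as np
from scipy.spatial import ConvexHull

def ring_areas(lab_samples: np.ndarray, n_rings: int = 10) -> list[float]:
    """Approximate chromatic coverage (a*b* convex-hull area) per lightness ring."""
    edges = np.linspace(0, 100, n_rings + 1)           # L* boundaries of each ring
    areas = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        ring = lab_samples[(lab_samples[:, 0] >= lo) & (lab_samples[:, 0] < hi)]
        if len(ring) < 3:                               # need 3+ points to form a hull
            areas.append(0.0)
            continue
        hull = ConvexHull(ring[:, 1:3])                 # hull over the (a*, b*) plane
        areas.append(hull.volume)                       # 2D "volume" is the area
    return areas

# Hypothetical usage: a display that washes out near white shows shrinking areas
# in the top rings, exactly where a white subpixel dilutes saturation.
# areas = ring_areas(measured_lab)   # measured_lab: rows of [L*, a*, b*]
```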

HDR Calibration

Pre- and post-calibration HDR accuracy on the Samsung S95D

We can't talk about HDR without talking about creative intent, but until now, we've only been able to talk about the range of brightness and colors that a TV could display. Besides our PQ EOTF test, we couldn't really talk about how well a TV tracks creative intent. TV 2.0 takes one small step closer to answering those questions, as we now measure the white balance dE, color dE, and the overall color temperature in HDR10. We do this twice, once with the TV in its most accurate pre-calibration settings and again after calibrating it.
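
If you're unfamiliar with dE values, the underlying idea is simple: each measured patch is compared against its target, and the "distance" between them is the dE. The sketch below uses the basic CIE76 formula and made-up numbers purely for illustration; it isn't the exact color difference formula or data from our test bench.

```python
# A minimal sketch of a color accuracy (dE) measurement: compare each measured
# patch against its target and report the color difference. CIE76 formula and
# hypothetical numbers, for illustration only.
import numpy as np

def delta_e_76(measured_lab: np.ndarray, target_lab: np.ndarray) -> np.ndarray:
    """CIE76 delta E: Euclidean distance between measured and target CIELAB values."""
    return np.linalg.norm(measured_lab - target_lab, axis=-1)

# Hypothetical grayscale sweep: targets are neutral (a* = b* = 0) at each level.
targets  = np.array([[20.0, 0.0, 0.0], [50.0, 0.0, 0.0], [80.0, 0.0, 0.0]])
measured = np.array([[20.5, 1.2, -0.8], [49.0, 0.6, 2.1], [81.0, -0.4, 1.5]])

de = delta_e_76(measured, targets)
print(f"white balance dE per patch: {de.round(2)}, average: {de.mean():.2f}")
```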

Reflections

Total reflected light on the Samsung S95D
Direct reflections on the Samsung S95D

Last year's Samsung S95D OLED, with its matte coating and QD-OLED panel, exposed some limitations in our reflection handling tests. There are pros and cons to any TV coating, but beyond measuring the pure intensity of reflections, we didn't have a way to quantify how ambient light impacts a TV's picture quality. One of our goals with this test bench was to expand our reflection handling tests to better capture the nuances between these different coatings so you can make an informed decision. We've redone our total and direct reflections tests to make them easier to run and more representative, and we've also added two new tests.

The Ambient Black Level Raise test measures how much a TV's black levels rise when you're in a bright room. This became more of an issue in recent years with the release of QD-OLED panels. Since these TVs lack a polarizer, they have a noticeable purple tint when used in a bright room, and blacks aren't as deep. This new test lets you quickly see how the panel technology impacts contrast when watching TV in a bright room.
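
Conceptually, the measurement boils down to comparing the light coming off a black patch with and without controlled ambient lighting, and seeing how much the effective contrast suffers. The sketch below is a simplified illustration with hypothetical numbers, not our actual procedure.

```python
# Simplified illustration of an ambient black level comparison: reflected room
# light adds to whatever the panel emits for black, eating into contrast.
# All nit values are hypothetical examples, not measurements from our lab.
def black_level_raise(black_dark_nits: float, black_ambient_nits: float) -> float:
    """How much the black level rises under ambient light, in cd/m² (nits)."""
    return black_ambient_nits - black_dark_nits

def effective_contrast(white_nits: float, black_ambient_nits: float) -> float:
    """Contrast ratio once the raised black level is taken into account."""
    return white_nits / black_ambient_nits

# Hypothetical OLEDs: one with a polarizer, one without.
for name, black_ambient in [("with polarizer", 0.04), ("no polarizer", 0.30)]:
    raised = black_level_raise(0.0005, black_ambient)
    contrast = effective_contrast(300.0, black_ambient)
    print(f"{name}: black raised by {raised:.3f} nits, contrast {contrast:.0f}:1")
```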

The Ambient Color Saturation test is very similar to the black level raise test mentioned above. Instead of looking at black levels, though, it looks at a TV's perceived color volume as a function of ambient lighting. This shows you how bright and vibrant colors will be in a bright room. Unlike the black level raise test, which shows the change in black levels relative to a dark room, the color saturation test shows the absolute values in both dark and bright rooms.
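
To see why ambient light washes out colors in the first place, think of reflected room light as a near-white "flare" added on top of whatever the panel emits. The sketch below adds a hypothetical flare term to a color in XYZ and shows how its chroma drops; it's a simplified illustration, not the perceived color volume calculation we actually run.

```python
# Rough illustration of ambient desaturation: reflected room light adds a
# near-white flare on top of the panel's output, pushing every color toward
# white. Hypothetical numbers only; not the actual test procedure.
import numpy as np

D65_WHITE = np.array([95.047, 100.0, 108.883])  # reference white (XYZ, Y = 100)

def xyz_to_lab(xyz: np.ndarray, white: np.ndarray = D65_WHITE) -> np.ndarray:
    """Standard CIE XYZ -> CIELAB conversion."""
    t = xyz / white
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def chroma(lab: np.ndarray) -> float:
    """C*: distance from the neutral axis in the a*b* plane."""
    return float(np.hypot(lab[1], lab[2]))

# Hypothetical saturated red emitted by the panel (absolute XYZ, nit-scaled),
# plus roughly 20 nits of D65-colored reflected flare.
red = np.array([30.0, 15.0, 2.0])
flare = D65_WHITE / 100.0 * 20.0

print(f"chroma in a dark room:    {chroma(xyz_to_lab(red)):.1f}")
print(f"chroma with 20 nit flare: {chroma(xyz_to_lab(red + flare)):.1f}")
```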

Samsung S95D
Ambient black level raise on the Samsung S95D
Ambient color saturation on the Samsung S95D
LG C4
Ambient black level raise on the LG C4
Ambient color saturation on the LG C4

Taking the above examples of the LG C4 and the Samsung S95D, you can immediately see the benefits of each model. The C4 retains its black levels better in a bright room, so it maintains more of those deep, inky blacks OLEDs are known for. The S95D, on the other hand, maintains bright colors better, so if you prefer bright, saturated colors and usually watch TV in a bright room, it's the better choice.

Panel Tech

Spectral power distribution graphs for a selection of TVs

We've been collecting this data for years as part of the process needed to calibrate our equipment before each test, and by popular demand, we're now including the spectral power distribution (SPD) charts for all TVs under the Panel Technology section. Most people don't need to pay attention to this, but a TV's SPD tells us a lot about how it produces light. More precise peaks on each primary lead to better color separation and (usually) a wider color gamut. We can also see which TVs are using certain technologies, like KSF phosphors or quantum dots.
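
As a simple example of what you can pull out of an SPD, the sketch below finds the blue, green, and red emission peaks in a curve and estimates how narrow they are. The SPD here is synthetic and the analysis is intentionally basic; it's only meant to illustrate the kind of information these charts carry.

```python
# Locate the primary emission peaks in an SPD and estimate their width (FWHM).
# Narrower, well-separated primaries generally mean better color separation.
# The SPD below is synthetic Gaussian data, not a real measurement.
import numpy as np
from scipy.signal import find_peaks, peak_widths

wavelengths = np.arange(380, 781)  # nm, 1 nm spacing

def emission(center, width, amplitude):
    return amplitude * np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Synthetic quantum-dot-like SPD: narrow blue, green, and red primaries.
spd = emission(450, 10, 1.0) + emission(530, 14, 0.8) + emission(630, 12, 0.9)

peaks, _ = find_peaks(spd, height=0.1)
widths, _, _, _ = peak_widths(spd, peaks, rel_height=0.5)  # width in samples (~nm)

for p, w in zip(peaks, widths):
    print(f"peak at {wavelengths[p]} nm, FWHM ≈ {w:.0f} nm")
```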

Response Time/CAD

Our response time testing has remained relatively unchanged for the last 7 years. It was fine when people mainly used TVs for watching shows/movies or playing casual games on older consoles, but with the rise of high refresh rate TVs, it simply isn't good enough anymore. With more and more people looking to use their TVs as they would a high-end gaming monitor, we needed a better test. The solution to this was fairly easy, and by popular request, we ported a portion of the response time and cumulative absolute deviation tests we developed for Monitor 2.0.

Response Time - Calibrated Mode
Cumulative absolute deviation in Game Mode @ 60Hz
Cumulative absolute deviation in Game Mode @ 120Hz
Cumulative absolute deviation in Game Mode at the max refresh rate (144Hz)

With this update, our response time test now uses the new pursuit photo, which makes it easier to spot things like overshoot, as well as specific color response times, compared to our old photo. Like our previous test, we still run the response time test in the most accurate settings, as it's intended to show how motion is handled when watching movies, shows, or sports.

We've also added three new tests. Instead of measuring the response time itself, these tests measure the cumulative absolute deviation, or in other words, the total area shaded in yellow on the charts above. This new way of measuring transitions takes into account how severe overshoot is, so if you have two TVs that take the same amount of time to transition between two shades, but one overshoots the target before falling back, it'll score worse in this test. Unlike the response time test, the CAD tests are done in Game Mode, at 60Hz, 120Hz, and the max refresh rate of the TV.
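
For anyone curious what that shaded area actually represents, here's a bare-bones sketch of a CAD-style calculation: integrate the absolute difference between the measured transition and an ideal instantaneous step. The transitions below are synthetic curves and the implementation is simplified, so treat it as the general idea rather than our exact test.

```python
# Bare-bones cumulative absolute deviation (CAD): the area between a measured
# luminance transition and an ideal instantaneous step to the target level.
# Both synthetic transitions below first reach the target at the same time,
# but the second overshoots and drifts back, so it accumulates more area.
import numpy as np

def cumulative_absolute_deviation(t_ms: np.ndarray, measured: np.ndarray,
                                  start: float, target: float) -> float:
    """Trapezoidal integral of |measured - ideal step|, in %·ms."""
    ideal = np.full_like(measured, float(target))
    ideal[0] = start                        # the ideal response jumps instantly after t=0
    err = np.abs(measured - ideal)
    return float(np.sum((err[:-1] + err[1:]) / 2 * np.diff(t_ms)))

t = np.linspace(0, 20, 201)                 # 20 ms capture window, 0.1 ms steps
ramp = 20 + 20 * t                          # shared 20% -> 80% ramp (hypothetical data)
clean = np.clip(ramp, 20, 80)               # settles right at the 80% target
over = np.where(t > 3.75,                   # overshoots to ~95%, then decays back to 80%
                80 + 15 * np.exp(-(t - 3.75) / 2),
                np.clip(ramp, 20, 95))

print(f"CAD, clean transition:        {cumulative_absolute_deviation(t, clean, 20, 80):.0f}")
print(f"CAD, overshooting transition: {cumulative_absolute_deviation(t, over, 20, 80):.0f}")
```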

Input Lag & Supported Resolutions

Input lag box on the Samsung S95D on version 1.11
Input lag box on the Samsung S95D after TV 2.0

On TV 2.0, we decided to simplify the input lag test by removing formats that don't matter as much anymore, like 1440p. TVs were never designed for 1440p inputs, and with the rise of HDMI 2.1, PC gamers looking to render their games at 1440p for a higher frame rate can simply have their graphics card upscale the image to 4k anyway. This change isn't limited to input lag; we've also removed the 1440p tests from the VRR and Supported Resolutions sections of the review.

Beyond the test coverage itself, we've also made significant changes to our input lag scoring. Our scoring curve hasn't changed much in the last few years, but TVs have gotten a lot better. If input lag is important to you, it's difficult to make a buying decision when the best and worst TVs all score almost the same.
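
As a purely hypothetical illustration of the problem, here's what happens when a scoring curve is too flat over the range where modern TVs actually land, and how a steeper curve separates them. Neither curve below is our real scoring spline; both are made-up examples.

```python
# Hypothetical scoring curves mapping input lag (ms) to a 0-10 score. With a
# flat curve, everything under ~20 ms lands between 9 and 10, so a 5 ms TV and
# a 12 ms TV look nearly identical; a steeper curve spreads them out.
import numpy as np

lag_ms = [2, 5, 8, 12, 20, 40]
old_x, old_y = [0, 20, 50, 100], [10.0, 9.0, 5.0, 0.0]          # generous, flat near the top
new_x, new_y = [0, 5, 15, 30, 100], [10.0, 9.0, 6.5, 3.0, 0.0]  # harsher, more separation

for lag in lag_ms:
    old = np.interp(lag, old_x, old_y)
    new = np.interp(lag, new_x, new_y)
    print(f"{lag:>3} ms -> old-style score {old:.1f}, tighter score {new:.1f}")
```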

We also adjusted the 144Hz input lag measurements to instead measure the input lag at the maximum refresh rate a TV supports, for both 1080p and 4k. With more and more TVs supporting 165Hz and even higher refresh rates, this ensures we're showing you the best input lag a given TV can provide.

Scoring Changes

Usage scores under TV 1.11
Updated usages under TV 2.0

Our usage scores have also been completely revamped. We've removed a few outdated usages and simplified others. We've also added new Performance Usages to TV reviews. These usage scores look at specific aspects of a TV's performance, so if you care about Brightness, for example, you just have to look at the Brightness score to understand how well the TV performs in that aspect. We're also much harsher on some of our scoring, ensuring that the scores accurately represent how a TV really performs and making it easier to choose one TV over another.
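
In case you're wondering how a performance usage rolls several tests into one number, the general idea is a weighted average of the relevant test results. The weights, test names, and scores below are invented for illustration and aren't our actual scoring formula.

```python
# A minimal sketch of a usage score as a weighted average of test results.
# The test names, weights, and scores are hypothetical examples only.
def usage_score(test_scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of test scores (0-10), normalized by the total weight."""
    total_weight = sum(weights.values())
    return sum(test_scores[name] * w for name, w in weights.items()) / total_weight

# Hypothetical "Brightness" performance usage built from three tests.
scores  = {"hdr_peak_brightness": 8.4, "sdr_peak_brightness": 7.9, "abl": 6.5}
weights = {"hdr_peak_brightness": 0.5, "sdr_peak_brightness": 0.3, "abl": 0.2}
print(f"Brightness usage score: {usage_score(scores, weights):.1f}")
```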

Mixed Usage normal distribution before and after TV 2.0

The mixed usage score, for example, has shifted significantly, with an average decrease of 0.7. Low-end TVs that nobody should buy are now scoring deep in the red, making it clear to anyone, whether you're a hobbyist or not, that you should avoid them.

Model | Mixed Usage Score 1.11 | Mixed Usage Score 2.0 | Difference
LG UT7570 | 6.5 | 4.6 | -1.9
Samsung Q60D | 7.1 | 5.8 | -1.3
Roku Select Series | 7.1 | 5.4 | -1.7
Sony BRAVIA 3 | 7.2 | 5.7 | -1.5
Hisense CanvasTV QLED 2024 | 7.3 | 5.9 | -1.4
Panasonic W95A | 7.9 | 7.5 | -0.4
Samsung The Frame 2024 QLED | 7.9 | 6.4 | -1.5
TCL QM7/QM751G QLED | 8.1 | 7.8 | -0.3
Sony X90L/X90CL | 8.1 | 7.6 | -0.5
LG QNED90T | 8.1 | 7.3 | -0.8
Hisense U7N [U7, U75N] | 8.2 | 7.5 | -0.7
Sony BRAVIA 7 QLED | 8.4 | 8.2 | -0.2
Samsung QN90D/QN90DD QLED | 8.4 | 8.1 | -0.3
Hisense U9N | 8.5 | 8.4 | -0.1
Hisense U8/U8N | 8.5 | 8.3 | -0.2
TCL QM8/QM851G QLED | 8.5 | 8.2 | -0.3
Hisense 75U8N | 8.5 | 7.9 | -0.6
Sony BRAVIA 9 QLED | 8.8 | 8.4 | -0.4
Panasonic Z85A OLED | 8.8 | 8.1 | -0.7
LG B4 OLED | 8.9 | 8.1 | -0.8
Sony BRAVIA 8 OLED | 8.9 | 7.9 | -1.0
Samsung S95D OLED | 9.0 | 8.6 | -0.4
Samsung S90D/S90DD OLED (QD-OLED) | 9.0 | 8.5 | -0.5
LG C4 OLED | 9.0 | 8.3 | -0.7
Panasonic Z95A OLED | 9.1 | 8.4 | -0.7
Sony A95L OLED | 9.2 | 8.7 | -0.5
LG G4 OLED | 9.2 | 8.6 | -0.6

Release Plan

Over the next few weeks, we'll wrap up the initial phase of our TV 2.0 launch plan, which includes retesting and rewriting 22 TV reviews from the last year or so. We're aiming to publish those reviews and the new methodology in a few weeks, on or around March 26th. We'll have another batch of 10 TVs updated shortly after launch, and we've already started testing the first 2025 models on 2.0, including the Amazon Fire TV Omni Mini-LED Series and the TCL QM6K. Going forward, we'll test all new TVs directly on the 2.0 test bench.

In addition to the updated reviews, we've been hard at work updating dozens of test articles, so full details of how we execute most of the new and updated tests will be available at launch.

Conclusion

TV 2.0 is the culmination of thousands of hours of research and development, and it wouldn't have been possible without the thousands of comments we've received. So, what do you think of the changes we've announced so far? This is just the start, as we have other changes planned over the course of the year and into next year. If you think we've missed something, let us know in the forums!

Comments


R&D Snapshot: TV 2.0 Preview: Main Discussion

What do you think of our article? Let us know below.


Want to learn more? Check out our complete list of articles and tests on the R&D page.

  1.

    Hi WatcherTV, Thanks for taking the time to reach out with your suggestion! Unlike for monitors or specialty displays, USB-C has unfortunately not been adopted by TV manufacturers for conventional televisions. Should it be implemented, we will be sure to add it to our reviews. Don’t hesitate should you have any other suggestions or feedback for us. Cheers

    Thanks. We could show TV manufacturers that not having USB C support is a downside these days by missing points. It was good to have one digital standard for transmitting data, including video and music. Let’s be proactive and show producers that they should also move in this direction.

    I would like to have one type of cable and be able to connect different devices to each other. Optimist :)

  2.

    Hi. USB C is currently the most popular standard in IT. Including power supply (PD - power delivery), video signal transmission (DP - display port alt mode), or audio (USB Audio). Maybe you can add information whether the TV supports the DP standard and can work as a monitor via USB C or connect a SSD/HDD drive via USB C. Thanks.

    Hi WatcherTV,

    Thanks for taking the time to reach out with your suggestion!

    Unlike for monitors or specialty displays, USB-C has unfortunately not been adopted by TV manufacturers for conventional televisions. Should it be implemented, we will be sure to add it to our reviews.

    Don’t hesitate should you have any other suggestions or feedback for us.

    Cheers

  3.

    When is the next test bench release expected?

    We’re hoping to have the next test bench update completed by the end of the year.

  4.

    You’re correct that those tests aren’t currently included in mixed usage. We’ve already begun work on our next test bench, and we’ll be reevaluating what goes into mixed usage. Thanks for the feedback!

    When is the next test bench release expected?

  5.

    Hi. USB C is currently the most popular standard in IT. Including power supply (PD - power delivery), video signal transmission (DP - display port alt mode), or audio (USB Audio). Maybe you can add information whether the TV supports the DP standard and can work as a monitor via USB C or connect a SSD/HDD drive via USB C. Thanks.

  6.

    Hi folks! I just wanted to mention that we’re working on a new YouTube video and need your help to make it successful! We’re going to be compiling the questions/feedback we get on our new TV Test Bench Update using our discord channel.

    Simply put, we need your help to gather enough questions to make a meaningful video! So if there’s something you want to ask our display experts, or if you want to see your comment featured in our upcoming video later this month, then now’s your chance!

    Send us your questions to our discord channel ASAP!

    Thank you!!!

  7.

    Is the 4K @ 120Hz test done in Game Mode? The other resolutions and refresh rates have a score for both in and out of game mode, but not 4K @ 120Hz. Is it assumed only gamers will be using such a high refresh rate and thus it is assumed Game Mode is turned on, or is input lag just lower at high refresh rates?

    Yes, the 4k @ 120Hz test is done in Game Mode. All of the input lag tests that don’t say “Outside of Game Mode” or “With Interpolation” are done in Game Mode. Thanks.

  8.

    I find there are not Viewing Angle and Gray Uniformity in the Mixed Usage, is that right? Such as Frequency Response and 24p judder are not included in the Mixed Usage. Will these test items be added to the Mixed Usage in the future?

    You’re correct that those tests aren’t currently included in mixed usage. We’ve already begun work on our next test bench, and we’ll be reevaluating what goes into mixed usage. Thanks for the feedback!