Hello! Might be a bold request, but would it be possible to also include Game Mode color accuracy in the methodology? You tested the accuracy in the Professional mode, but there is no true Professional equivalent mode inside Game Mode. Also, since PC HDR is still a hot mess and only a few games have proper HDR implementation, it would be greatly appreciated if you could provide a bit more detail about SDR performance specifically in Game Mode, as the SDR brightness tests seem to focus mainly on Professional mode currently. Thank you very much for your amazing work and dedication!
Hi aeoj,
Thanks for taking the time to write to us with your suggestion!
Totally agree—PC HDR can definitely be a mess right now, and we’re hoping that changes sooner rather than later. Your suggestion to test color accuracy in Game Mode is a great one. It would indeed be valuable to understand how it compares to a reference picture mode for video content.
We’ve added your suggestions to our internal tracker, which we review when planning updates to our test bench. That way, we can weigh it alongside other potential improvements and prioritize accordingly. We are looking to expand our game mode coverage going forward, though we are currently focusing our efforts on improving our image processing and motion tests as part of our next test bench currently in development.
If you’ve got any other feedback or ideas, feel free to send them our way!
Cheers
https://www.rtings.com/monitor/reviews/asus/rog-strix-oled-xg27acdng
Total: 1.0%, Indirect: 1.4%, Calculated Direct: 0.4%
1.4 + 0.4 = 1.0?
More importantly:
- Is there a point in “indirect reflections” in the first place? If the screen reflects all incident light along its surface, then you’d show users a very bad number (100%), but the screen will actually be very good (it would appear black).
- Direct measurement of direct reflection should be much better than indirect measurement (integrating sphere… subtracting…). Maybe simply take a picture of some area light, and of the same area light reflected from the screen (all in raw, of course), and then take the ratio of luminances? (See the sketch below.)
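A rough sketch of the ratio idea, assuming the two raw shots have already been decoded into linear luminance maps (the file names and sampled region below are just placeholders):

```python
import numpy as np

# Sketch only: assumes both raw photos were decoded into linear luminance maps (cd/m^2).
# File names and region coordinates are placeholders.
direct = np.load("area_light_direct.npy")        # photo of the area light itself
reflected = np.load("area_light_reflected.npy")  # photo of its reflection on the screen

# Average over the same patch of the light source in both frames.
region = (slice(200, 400), slice(300, 500))

direct_reflectance = reflected[region].mean() / direct[region].mean()
print(f"Estimated direct reflectance: {direct_reflectance:.2%}")
```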
Hi lunar,
Great catch, there is a mistake in the review! I have just flagged it for rework by the testing team.
When we talk about indirect reflections, we mean reflections with the specular component excluded. This includes things such as diffusion and light scatter which can cause black level raise.
We’re currently working on implementing our new reflections tests from our TV reviews to our monitor reviews. The focus will be on dedicated direct reflection, ambient black level raise and total reflections analysis.
Thanks again and don’t hesitate if you have any other feedback or suggestions for us!
Cheers
Hi Edt,
Thanks for taking the time to write to us with your feedback!
It’s something we will look into when we get to improving our viewing angle test. Thank you for mentioning exactly where you noticed it in The Rings of Power; that will help us investigate. :)
Did you notice the issue on both the U8N and U8QG in the same scene?
Cheers
So, where are the Rec.2020 gamut coverage measurements in uv and xy terms? I can’t find them with this new testing method.
Hi Ballbuddy4,
Thanks for reaching out with your question!
We no longer evaluate BT.2020 gamut coverage using the 2D chromaticity diagram in either xy or uv coordinates. This method has become somewhat obsolete with the advent of display technologies that produce colors using more than just discrete RGB channels. This includes WOLED TVs with their white subpixels, but also projectors using DLPs with color wheels that incorporate non-RGB subcomponents.
To provide a more accurate assessment of BT.2020 coverage, we now use more advanced methods:
For SDR content, we use gamut rings, which are a 2D representation of the 3D CIELAB color space. Our measurements are taken at a standardized luminance of 100 cd/m².
For HDR content, we rely on our HDR Color Volume test, which evaluates BT.2020 coverage in the ITP color space, using a reference luminance signal of 10,000 cd/m².
By combining results from both SDR and HDR evaluations, we’re able to give a more complete and accurate picture of a display’s ability to reproduce BT.2020 colors.
Let us know if you have any other questions!
Cheers
Am I wrong, or did you forget to add HDR color accuracy (at least for the flagship models) under 50% luminance? I suggest 15% and 30% would be enough (dE ITP, of course). This is important if you want to be taken seriously.
Hi RtingsUser114164,
Thanks a lot for reaching out and sharing your thoughts!
You’re absolutely right to highlight the importance of measuring HDR color accuracy at lower luminance levels. Some of our current color patches do fall under the 50% luminance mark, but we agree that this alone isn’t sufficient—especially given how challenging accurate color reproduction can be at low brightness levels.
Going forward, we are looking to expand our testing to specifically assess low-luminance color accuracy on TVs. We really appreciate your feedback as our new tests and updates to existing ones are based on community feedback. The more interest we see from the community, the more likely we are to prioritize and develop specific tests.
Please don’t hesitate to reach out again if you have more suggestions!
Cheers
Hi Sillyrabbit,
Thanks for reaching out with your suggestion!
I have added it to our tracking list for review when we get to planning our next test bench update. In the meantime, please don’t hesitate should you have any additional suggestions or feedback for us.
Cheers
Hi UsamaHaunter,
Thank you for taking the time to write to us with your detailed feedback and suggestion.
We’ve noticed this on OLEDs as well. Dark shades of gray tend to brighten as the refresh rate drops. It’s something we are looking to investigate further soon, as we want to understand the mechanisms at play. While we think part of the reason might have to do with drift in the thin-film transistor layer driving the OLED pixels, we’re wondering if part of it may be intentional in the monitor’s firmware in order to reduce luminance stability issues as the FPS goes from a high figure (e.g., 240) to a lower one such as 60 FPS.
The fact that your LG CX and BX fare significantly better at lower refresh rates suggests this issue might be more prominent in monitor-specific implementations (e.g., LG 27GR95QE), which could be tied to tuning choices made for high refresh rate performance.
We may need to evaluate the maximum gamma shift of OLED displays separately from VRR Flicker to provide some insights on how bad the phenomenon can be on some models.
Let us know should you have any additional feedback or suggestions for us!
Cheers
Hi Deepo,
Thank you for taking the time to reach out with your suggestion, and we’re glad to hear you enjoyed our R&D article on eARC and soundbars!
Your suggestion to include eARC latency measurements in our TV test bench is a great one, especially considering how impactful this can be for gaming, as you’ve pointed out. It’s definitely not ideal when latency pushes things past the threshold of what feels natural.
Right now, we’re working on the next iteration of our TV test bench update, which is focused on motion and image processing. That said, I’ve added your suggestion to our internal tracking list for future test bench considerations. eARC latency is an area we agree could use more coverage, especially from the TV side of the equation.
If you put together that S90D vs. C1 comparison video, we’d love to check it out!
Feel free to reach out anytime with more ideas or feedback—we really value community input like this.
Cheers
Love your reviews! Not to be too pedantic, but would you consider adding more real scene tests? Seeing how heavily they’re weighted in your brightness scores, wouldn’t it make sense to have a more varied set of scenes to test? Your current ‘lamp in the left upper corner’ test may also skew negatively for panels that employ Convex Power Control (CPC), which dims the edges/corners of the display to help prevent burn-in. Regards!
Hi rinburevolotion,
Thanks for reaching out with your feedback—and for the kind words!
You’re not being pedantic at all; it’s details like this that help us improve. We’ve been considering potential updates to our monitor brightness testing methodology to better reflect real-world performance, and that could include revisiting the real scene content we use, especially given how much weight it carries in our scoring.
You also raise a great point about Convex Power Control and how our current scene might affect displays that dim the edges and corners. It’s something we’ll keep in mind when we get to investigating how to improve our brightness testing.
Right now, we’re focused on expanding our updated TV reflections testing to monitors, but I’ve added your suggestion to our internal tracker for when we plan our next test bench updates.
Thanks again, and feel free to share any other suggestions you might have!
Cheers
Rtings, I like the Gamut Ring measurement idea and commend the update. Despite the explanation, I don’t quite understand why you are measuring DCI-P3 and Rec.2020 for SDR color volume. It seems like the BT.601 and BT.709 color spaces would make more sense for SDR, and most readers might confuse the Gamut Ring as representing the HDR color space.
Admittedly, it’s possible one could be watching 4K SDR content with wide color, but that seems less frequent than SD or HD content. I don’t have a sense of how many SDR video games would use wide color – can you provide some context and examples? It also makes sense to cap the brightness at 100 nits (though it does seem unlikely that a user will do the same), as some HDR TVs do not maximize the SDR brightness at the same level as HDR. But for a TV that offers local dimming, is that 100 nits measured on a small window, which represents a peak brightness, or on a full-field white? Thanks,
Hi Amazon_Fan,
Thanks for taking the time to reach out and share your thoughts on our new SDR Color Volume test—we really appreciate your detailed feedback and questions!
You’re absolutely right that BT.601 and BT.709 are the standard color gamuts for SDR content. However, one of the key reasons we opted to use DCI-P3 and BT.2020 for our new SDR Color Volume test is that modern TVs already cover BT.709 so completely—often 100%—that measuring within that gamut offers little to no differentiation between models, even lower-end ones. In contrast, using larger color gamuts like P3 and BT.2020 allows us to highlight differences in color rendering capabilities between displays in SDR picture modes.
It’s true that traditional SDR content rarely exceeds BT.709. But there are still relevant scenarios where a TV might display colors outside of BT.709 in SDR picture modes. For instance, some users choose to set their TV to a wider gamut manually out of personal preference; a notable example is gaming in SDR to get more saturated colors (e.g., setting the TV to P3 color when using a Nintendo Switch, which outputs BT.709). In regular video content, it’s possible that some pixels on screen may fall beyond the limits of the BT.709 color space, depending on how it was encoded. This is something we also often see in HDR, where a lot of content mastered in P3 will have pixels falling into BT.2020 territory. We’ve updated the test’s tooltip to make this nuance clearer—thanks for bringing it to our attention!
The brightness cap at 100 cd/m² is intentional since it’s the reference diffuse white point for SDR content. More importantly, however, it’s a luminance output any TV can easily reach, which is crucial to ensure the comparability of our gamut rings results. At different diffuse white luminance points, results lose their comparability, as the measured coverage of the color spaces will differ. We measure it with a test window with constant APL to ensure brightness stability across TVs, especially OLED models.
At a broader level, the motivation behind adding SDR Color Volume isn’t about emphasizing the presence of wide color in SDR content itself, but rather addressing a gap in how we assess color volume. Our current HDR Color Volume test uses the ICtCp (ITP) color space, which is excellent for HDR but not applicable to SDR gamma transfer functions. This means our HDR Color Volume results only reflect performance in HDR picture modes. Gamut Rings, however, are specifically suited for SDR gamma, making them ideal for evaluating SDR modes regardless of the color gamut of the content.
We’re also closely following industry efforts to expand Gamut Ring methodologies for HDR applications. There’s promising work underway, but no finalized standard yet. Until then, combining SDR and HDR Color Volume results provides the most complete picture of a TV’s overall color output capability, across all picture modes and content types.
I hope this gives you clearer insight into why we designed the test the way we did, and we’re grateful for your thoughtful engagement with it.
Cheers
Hi. USB-C is currently the most popular standard in IT, covering power delivery (PD), video signal transmission (DisplayPort Alt Mode), and audio (USB Audio). Maybe you could add information on whether a TV supports the DP standard and can work as a monitor via USB-C, or connect an SSD/HDD drive via USB-C. Thanks.
Hi WatcherTV,
Thanks for taking the time to reach out with your suggestion!
Unlike for monitors or specialty displays, USB-C has unfortunately not been adopted by TV manufacturers for conventional televisions. Should it be implemented, we will be sure to add it to our reviews.
Don’t hesitate should you have any other suggestions or feedback for us.
Cheers
With your next test bench setup you seriously need to include your VRR-induced gamma flicker test. G-SYNC and FreeSync are extremely important to gamers, and knowing whether the TV is going to flash uncontrollably is pretty darn important. You already test it for PC monitors. For example, the S90D is recommended as a gamer TV, but with PCs the gamma flicker is way worse than on every other OLED I’ve tried, making it a poor choice for any PC gamer despite the high scores and nice panel.
Hi TVViewer,
Totally agree—proper G-SYNC and FreeSync behavior is crucial for gamers, and VRR-induced gamma flicker can be a real deal-breaker. I’ve personally run into VRR flickering on my OLED too, so I know exactly where you’re coming from.
We’re definitely planning to add a dedicated VRR Flicker test to our TV reviews, similar to what we already do for monitors. Before we do so, however, we want to improve the current VRR Flicker testing methodology in order to provide the best and most accurate results available. Our next TV test bench is focused on cinematic motion and image processing, but VRR flicker testing will very likely be a top priority for the one just after that—I’m personally very invested in making this happen sooner rather than later. :)
Really appreciate you bringing this up. Don’t hesitate to reach out if you’ve got more feedback or ideas—we’re all ears.
Cheers
Hi,
Thanks for taking the time to reach out with your suggestion!
I completely understand where you’re coming from—I’ve experienced this myself on different TVs, and it can definitely impact the user experience. It would be an interesting metric to track. One challenge might be the variability in startup time even on the same unit, but it’s absolutely something worth exploring. I’ll add your suggestion to our list for consideration during our next test bench update planning.
Cheers
Hi Deepo,
Thanks for taking the time to reach out with your suggestion!
We are considering adding VRR flicker to our TV reviews in the future :) It will not be part of the upcoming TBU 2.1 update, which is going to be focused on Motion and Processing, but we are actively looking at adding the test to our TV reviews for a subsequent test bench update.
Don’t hesitate should you have any other feedback or suggestions for us.
Cheers
Rtings Team! Thanks a lot for your test bench 2.0. But ‘Viewing Angle’ and ‘Gray Uniformity’ do not seem to be reflected in the Mixed Usage score. This is also an important factor when watching TV. Would you please include these again in the Mixed Usage score?
Hi Johnie2,
Thanks for reaching out with your suggestions!
We are currently working on our next test bench update for TVs. As part of it, we will conduct an assessment of our current usage scores and performance usages, which may feature some adjustments at that time.
Cheers
Could you consider having a color volume test for 4000 nits in addition to 10,000? While I understand the benefit of having something available for future proofing, it doesn’t seem too practical considering nearly all existing content isn’t mastered for it. To me, it seems equivalent to having a max speed test on a car, while living in a world where race tracks don’t exist.
Hi pittsportsfan,
Thanks for taking the time to reach out!
When we send the 10 000 cd/m² HDR signal to the TV, we are essentially asking it to give us the best color and luminance performance that it can provide. However, our scoring does take into account that TVs can’t reach these brightness levels.
We currently also include the 1,000 cd/m² color volume since a lot of HDR content is still limited to this luminance level. If we see a trend of content moving towards 4,000 cd/m² instead of the more common 1,000 cd/m², it’s likely that we will replace it in our reviews as well.
Don’t hesitate should you have any other suggestions or feedback for us.
Looks great! A suggestion, although I don’t know if it applies to newer TVs anymore. Pairing them with an external box (e.g., Apple TV 4K) and a soundbar seems like it can introduce lip sync issues, vs. just using TV-based apps. As we seem to rely more and more on soundbars, could it make sense to include a test that measures whether the TV introduces any weird delay when an external source is used with them? (I appreciate it might change from source to source and soundbar to soundbar, but I’m also not sure. It could be really useful if it can be generalised and measured.)
Hi Pikman,
Thanks for reaching out!
The soundbar, rather than the TV, is almost always the cause of audio sync issues. If you haven’t already, I would recommend having a read of our R&D article about sync issues between a TV and a soundbar: Your Soundbar’s eARC Connection May Not Give You the Best Experience. It’s something we measure as part of our soundbar testing. If a soundbar offers an HDMI passthrough, it is generally the best way to reduce AV-sync errors.
Hi leongza,
Thanks for taking the time to reach out with your suggestions!
Your timing is great—we’re currently in the process of planning our next update to the projector test bench. Gaming and motion performance, including input latency, are high on our list of areas to evaluate further. You’re spot on about it being worthwhile to test and investigate across different resolutions, and with pixel shifting disabled when a projector allows for it.
Testing for ghosting artifacts is another great idea. While it may be less relevant for DLP projectors, as you mentioned, it could definitely help highlight differences in motion performance among LCD models, especially for gaming use.
As for RBE artifacts, this is also something on our radar. While detecting their presence is relatively straightforward, we’re actively looking into how we can measure them in a more objective and consistent way—ideally across multiple settings like resolution and refresh rate.
Don’t hesitate should you have any other suggestions for us!
Cheers
I am glad to see you using the gamut rings visualization to accurately show the display’s color capability and its gamut coverage relative to a standard reference gamut. This analysis harmonizes with the color evaluation method recommended by the IEC and CIE. I also think that it’s wonderful that you added the spectral power distributions of TVs. That is really informative.

However, the new Reflections section is disappointing. The reflected images of small and large light sources may be qualitatively informative, but the measured values are misleading. For example, the reflected intensity profile given in your Direct Reflection example of a display with a matte surface does accurately represent the true physics behind the scatter. The peak reflection intensity that you report is a combination of Lambertian, Haze, and specular reflections (see https://doi.org/10.1002/msid.1099). Its value will change with the luminance of the light source, the distance from the light source to the display, and the size of the light source. Therefore, unless the users reproduce your exact geometry and spectra, they will not get your results.

A better approach would be to measure the individual reflection coefficients for the specular, haze, and Lambertian scatter components. This would allow the users to anticipate how the display would perform for a wide variety of illumination conditions. One way to separately measure the three reflection components is to use an annular light source (see https://doi.org/10.1002/sdtp.15529). An alternative approach is to measure the reflected point spread function (https://doi.org/10.1002/sdtp.17707). If that is too involved, then I think the next best thing that you could provide would be the hemispherical diffuse reflectance with specular included (http://keltekresearch.com/NIST/Publications/reflection/Ambient_Contrast_NISTIR6738.pdf). This is a useful and reproducible reflection measurement. It indicates how reflective the display is to uniform diffuse illumination, which is commonly present in both indoor and outdoor applications.
Hi Dr. Penczek,
Thank you very much for your thoughtful and detailed feedback on our new reflections testing suite.
Regarding our new reflections testing, we fully agree with your assessment from a theoretical and metrological standpoint. In fact, during the development of these tests, we closely reviewed the methodologies you referenced, and also the ones involving variable aperture light sources to separate the Specular, Haze and Lambertian components of reflections. These approaches are unquestionably more rigorous and comprehensive in terms of capturing the reflection behavior of displays.
However, we made a deliberate decision to prioritize a methodology that is easier for our audience to understand and relate to real-world usage. While this comes at the cost of less generalizable or replicable absolute values, our goal was to offer a comparative and perceptually intuitive measure of reflections across different displays—anchored in a controlled, standardized test setup. Internally, we validated the consistency of our rankings with side-by-side perceptual impressions to ensure the results aligned with the viewing experience.
We recognize that this user-centric framing may not satisfy the needs of researchers or engineers looking for reference-grade reflection metrics. Your suggestions are well taken and offer us some perspective for the future of our reflections testing. Our goal would ultimately be to provide more physically meaningful results while ensuring they are still accessible to a broader audience, as this is the core of our work.
We’re grateful for your insights and hope to continue this conversation as we improve our protocols. Community input—especially from experts like yourself—is invaluable to striking the right balance between technical fidelity and practical relevance.
Best regards
Hello, Well, not necessarily, since temperature does not depend on the level of green. The TCL Q750G (model C745 in the EU) (https://www.rtings.com/tv/reviews/tcl/q7-q750g-qled#test_142) is a very good example. The temperature is 6500 K, and it’s pretty well calibrated from the outset (at least, the 3 RGB levels are more or less constant), but the green level is recessed, so its delta E is mediocre. This is normal, since the distance between green and the other 2 primary colors is large. The delta E decreases considerably by reducing the distance between green and the other 2 colors (blue, red), just by using the offsets (2 IRE points).
You’re correct, I should have said D65 white point target, instead of 6500 K color temperature target! :)
Thank you
I think the idea of adding new ambient black raise and color saturation tests in bright room conditions is a very good approach. However, there is something I would like you to review additionally regarding the evaluation of ambient black raise. It seems to evaluate the black raise due to indirect reflection in a bright room. But as you can see in the image at the top of this article, when the ambient illumination increases, not only indirect reflections but also mirror-like reflections become very noticeable. For example, this is the case for the TV on the far left in the image at the top. I think that is a very annoying point when we watch TV in the daytime.
In order to test this mirror-like characteristic, I suggest measuring the black level by placing a white background behind the measuring equipment. Then, you can measure the effect of both indirect and direct reflection together. Anyway, thank you always for researching objective and practical evaluation methods.
Hi DaveTW,
Thanks for taking the time to write to us with your feedback!
You’re absolutely correct about direct, mirror-like reflections having a very negative impact on black level raise. For this new test bench, we decided to evaluate Ambient Black Level Raise due to indirect reflections separately from the impact direct reflections can have. As a matter of fact, we have a section of our new TV Reflection Handling R&D Article dedicated to the relationship between Direct Reflection handling and Ambient Black Level Raise.
What we generally see is that TVs that are the worst at handling direct reflections are often the best for Ambient Black Level Raise performance, and vice-versa. If you look at the scores of both our new tests: Direct Reflections & Ambient Black Level Raise, you should get a good idea of how a TV would behave in a home environment.
Nonetheless, it would be interesting to evaluate the impact direct reflections have on black level raise separately from our direct reflections test. I have added this suggestion to our tracking list for review when we plan work on our next TV test bench update :)
Don’t hesitate should you have any other suggestions or feedback for us!
Cheers
Great to see Rtings addressing raised blacks in ambient light, but can you also add a measure of the effective contrast ratio of the TV when in ambient light? This would allow an approximate comparison with a standard measure (contrast ratio) that most people are already familiar with across past and future TV reviews. Can you also please standardize your Upscaling test settings? For example, the settings used for the LG B3 were Sharpness 16 and Super Resolution High, but for the C3 were Sharpness 16 and Super Resolution Off, and were different again for the G3 with Sharpness 11 and Super Resolution High. I suggest just keeping settings on the Filmmaker Mode defaults for all TVs (so for LG, Sharpness 10 and Super Resolution Off) to make for fair comparisons across models. Can you also add back the 720p and 1080p upscaling tests? These are more common resolutions that most users will be viewing these days via e.g. streaming, YouTube, local video files and Blu-rays.
Hi b0b,
Thanks for taking the time to reach out with your suggestions!
While we currently don’t measure effective contrast as part of a dedicated test, you can calculate it by dividing our peak brightness results by our Ambient Black Level Raise results for a specified ambient illumination value. This will yield a good approximation of what kind of effective contrast ratio you can expect in home environments.
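As a rough illustration of that calculation (the numbers below are made up rather than taken from any specific review):

```python
# Made-up figures for illustration; substitute the values from the review you're reading.
peak_brightness_nits = 800.0        # e.g., SDR Real Scene Peak Brightness
ambient_black_level_nits = 0.45     # Ambient Black Level Raise at your room's illuminance

effective_contrast = peak_brightness_nits / ambient_black_level_nits
print(f"Approximate effective contrast: {round(effective_contrast):,}:1")
```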
Our Upscaling: Sharpness Processing test currently has our testing team subjectively evaluate and determine the best upscaling/sharpness settings for tested TVs, which is why the settings aren’t the same on all TVs. However, there is also value in standardizing the settings to see how TVs fare against one another (perhaps using the out-of-the-box settings!).
I added your suggestions to our tracking list for review and prioritization when we start working on our next TV test bench update. :)
Don’t hesitate should you have any other suggestions or feedback for us!
I really like the idea of adding the “Ambient Black Level Raise” and “Ambient Color Saturation” tests because I often notice that ambient light can degrade picture quality. The results you get in a dark room sometimes don’t match my real-world experience. I think it’s essential to measure additional attributes like ambient resolution, ambient gamma curve, ambient color accuracy etc to better understand how a TV performs under lighting conditions. Just as you measure black level and color saturation under ambient condition, why not extend these tests to include how ambient light affects other picture quality attributes? I believe your ultimate goal in adding these tests is to quantify the impact of light on overall picture quality.
Hey Justin03,
Thanks for taking the time to share your thoughts with us!
You’re absolutely right—ambient lighting can have a noticeable impact on picture quality, and what you see in a dark test environment doesn’t always reflect real-world conditions. That’s exactly why we introduced the Ambient Black Level Raise and Ambient Color Saturation tests—to start quantifying those differences.
We agree that other tests, including bright room color accuracy, are also worth exploring. For example, while most TVs include a basic light sensor, some higher-end models go a step further with a color temperature sensor that dynamically adjusts image tone based on ambient lighting—a feature we don’t currently evaluate but recognize as valuable to the user experience.
The Ambient Black Level Raise & Ambient Color Saturation tests are just the start of us testing TVs in bright environments, and your suggestions help guide where we go next. I’ve added your ideas to our internal tracking list for consideration as we continue to expand our testing methodology.
If you think of anything else or want to share more feedback, we’d love to hear it!
Cheers
With recent advancements in laser illumination dimming, including the EBL (enhanced black level) feature of this projector, I suggest that Rtings figure out a way to quantify the impact to “real scene” contrast. If you’re not familiar with the function, the illumination level is optimized based on the content. So for a very dark scene, the illumination is adjusted to a lower level, which lowers the black level and improves the contrast. It makes a huge positive impact if designed well. This is a concept that’s been around for a long time, but with lamps/color wheels it was more difficult, and LED illumination has low contrast issues to begin with. Laser has the advantage of starting with higher contrast and also more precise and fast control of the illumination level.
Hi jrich05,
That’s a great suggestion. Features like this play a big role in enhancing the projector viewing experience. Testing for this would be beneficial as it could help set apart different projector technologies from one another. We are planning work on our projector test bench this year, so I have added your suggestion to our tracking list for review and investigation as we develop the next iteration of our projector test bench.
Don’t hesitate should you have any other suggestions or feedback for us!
Cheers
Wow! Big changes. I hope that while you’re evaluating how rtings will deal with ‘Motion Handling’ in the future, you will also consider adding 25p content to the judder test. This would tell us how a particular TV will handle television/series content from practically the entire globe outside of North America. It could be as simple as evaluating how a TV handles watching the BBC’s Black Mirror episodes (which are filmed at 25p) under the same conditions that you subject 24p content to. Please, consider it
Hi Orkestar,
Thanks so much for reaching out with your suggestion!
We’re actually looking into that :) The timing is great, as we’re currently evaluating how we approach motion handling, and it’s worth investigating whether a TV’s ability to handle 24p content without judder also applies to 25p. As you mentioned, shows like Black Mirror—shot in 25p and widely available on streaming platforms—highlight how relevant this frame rate is globally, even in regions like North America where 24p is more common.
I’ve added your suggestion to our tracking list for the next round of updates to our TV test bench, where we’ll review and prioritize it alongside other potential additions and improvements.
Let us know if you have any more feedback or ideas—we really appreciate them!
Cheers
Great update! Look forward to seeing the final version. I would love to see some different color points in the Calman ColorChecker. The “Flesh Tone” targets are so small that it’s hard to really see them, and more than a couple of the points seem redundant. Maybe you could exclude those for some clarity and include Primary and Secondary color points in that chart instead. That way it still keeps some cross-referenceability with previous CIE ‘76 charts. And since that chart already shows the accuracy of the target vs. the measured color point, would you consider swapping the DeltaE ITP chart, which combines the xy/uv error with color luminance, for a chart that shows just the color luminance error?
Hi Amazon_Fan,
Thanks for reaching out with your feedback!
When developing our new HDR Color Accuracy tests, we prioritized skin tones because they’re particularly challenging to reproduce accurately. Viewers are highly sensitive to even subtle variations in skin tones, making any deviation in tint—whether too yellow, red, or green—immediately noticeable.
That said, you make a great point about the Flesh Tone targets being tightly packed and difficult to evaluate. As we refine this test, we see an opportunity to strategically reduce some of these targets and replace them with additional primary and secondary color points. This adjustment could improve clarity while helping cross-referencing with previous CIE ‘76 charts.
We’ve also noted your suggestion about replacing the DeltaE ITP chart with (or adding) a dedicated color luminance error chart, as it could provide more targeted insights. We’ll consider this as we plan the next iteration of our TV test bench.
Don’t hesitate should you have any other feedback or suggestions for us!
Cheers
Hi Kirkkautta,
Thanks for reaching out with your question!
In essence, White Balance dE measures the average error across all tested grayscale slides relative to the standard 6,500K color temperature target. The closer the dE is to zero, the more accurate the color temperature across the grayscale range.
For example, the LG 27GR95QE-B has the following results:
We feel it is a fairer metric for evaluating color accuracy, as White Balance dE accounts for the absolute value of errors. Average Color Temperature is useful but can hide some issues; for example, a screen could still have an average color temperature of 6,500 K if we tested only two grayscale patches that were both very far off target, such as 6,000 K and 7,000 K.
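To illustrate with made-up numbers why absolute errors don’t cancel out (real dE values are computed in a perceptual color space, not in kelvin, so this is only a conceptual sketch):

```python
patch_temperatures_k = [6000, 7000]   # two hypothetical grayscale patches
target_k = 6500

# Averaging the temperatures hides the problem: the errors cancel out.
average_temperature = sum(patch_temperatures_k) / len(patch_temperatures_k)
print(average_temperature)            # 6500.0 -> looks perfect

# Averaging the absolute deviations (the spirit of a White Balance dE figure) does not.
mean_abs_error = sum(abs(t - target_k) for t in patch_temperatures_k) / len(patch_temperatures_k)
print(mean_abs_error)                 # 500.0 -> clearly flags the issue
```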
I hope this answers your question. Don’t hesitate should you have other questions or feedback for us. :)
Cheers
Hi pib319,
Thanks for sharing this paper! It was a very interesting read.
We’re familiar with Florian Friedrich’s work on HDR, and we’ve recently been exploring how spatio-temporal factors impact a display’s performance metrics, including brightness and contrast. The approach outlined in the paper, using dynamic test sequences with spatio-temporal complexity, aligns well with some of our ongoing investigations.
One of the most compelling aspects of this research is how different display technologies respond to motion picture backgrounds compared to static test patterns. The variability in luminance, contrast, and black level behavior across display types reinforces the idea that traditional static measurements don’t always capture real-world performance.
We definitely see this as a logical next step for our testing methodology. Moving away from mostly static content should help make our results more representative of real-world usage while also minimizing the influence of manufacturers optimizing for specific test patterns. The latest Spears & Munsil UHD Benchmark disc contains dynamic patterns that we’re evaluating in this context.
For the last few months, we’ve been working on a major revision of our TV test bench, which is set to be published around mid-to-late March. Once that’s complete, we’ll be focusing on improving our brightness testing, potentially incorporating a brightness stability test. This research provides useful insights into how we might refine that approach.
If you have any specific suggestions for improving our current tests, we’d love to hear them!
Cheers
Hi tilengeni,
Thanks for taking the time to share your suggestion!
I completely agree. While developing our first projector test bench, we noticed speckle firsthand on some laser projectors, particularly when viewing up close to the projection screen.
We’d love to evaluate and quantify this effect objectively in our projector reviews. I’ve added your suggestion to our tracking list, which we reference when planning future test bench updates. This will help us prioritize it alongside other planned improvements and new tests.
If you have any other suggestions or feedback, don’t hesitate to reach out!
Cheers
Please consider a higher penalty for lack of variable refresh rate (VRR) in future methodologies. Current gen consoles like the Xbox Series X and the PS5 are at the point now where they’re not even able to run the top graphically demanding games on the market at a smooth 4K 60 FPS (much less 120 fps). As a result, newer graphically demanding games (such as Stalker 2) are almost unplayable at 4k 60 fps on TVs without VRR. I found this out because we got a Roku Select for our bedroom, and I tried to play Stalker 2 on it. It was a very choppy, unplayable experience.
Hi DreSand,
Thanks for sharing your thoughts!
We completely understand how the lack of VRR support can impact the gaming experience, especially with the increasing demands of newer titles. As you pointed out, modern consoles often struggle to maintain a smooth 4K 60 FPS output, which makes VRR an increasingly essential feature for gaming on TVs.
We’re currently reviewing our usage scores across all categories, including gaming, and re-evaluating the weight of VRR in our scoring system is definitely on our radar. Your feedback is valuable for this, and I’ve added your suggestion to our tracking list for consideration in our next test bench update.
Don’t hesitate should you have any other thoughts or suggestions for us!
Cheers
Thank you for the response. It just seemed odd that it measured very high brightness for the last two real tests and low for the first one but I guess it’s likely using a MaxDML tone curve like Samsung and TCL instead of a MaxCLL like LG and Panasonic. Have y'all considered using a mastering monitor like the Asus PA32UCXR to help evaluate displays?
Hi,
We have a Canon DP-V2411 reference monitor that we use for test development and HDR validation purposes, but it’s not something we currently use during testing, since we control the HDR metadata sent to TVs ourselves. We can then compare the results taken using our instruments to what should be displayed if a TV were able to display the entire HDR10 brightness range.
Thanks. Do you all have any way of quantifying how distracting ABL actually is on each TV? In the same way you have blooming tests, is there any way for me to see how bad it actually is in real content (including gaming, where it’s more likely to stay on something static)? I HATE the blooming on my TV, but I’m afraid that if I switch, ABL would just be trading one problem for another.
Hi Tvcurious,
That’s a great question! While we currently evaluate ABL to some extent through our brightness tests in both SDR and HDR, we don’t have a specific test that directly quantifies how distracting ABL might be in real-world content. Our ABL coefficient is calculated based on the difference in brightness between sustained window tests, which gives some insight into how aggressive ABL is on different TVs, but doesn’t offer a complete picture of how it can act.
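As a purely hypothetical sketch of that kind of window-based comparison (the window sizes, values, and formula below are illustrative and not our exact internal method):

```python
# Hypothetical sustained-window brightness measurements (cd/m^2).
sustained_brightness_nits = {
    "2%": 950.0,
    "10%": 920.0,
    "25%": 610.0,
    "50%": 380.0,
    "100%": 240.0,
}

peak = max(sustained_brightness_nits.values())
full_field = sustained_brightness_nits["100%"]

# One simple way to express how aggressive ABL is: the relative brightness drop
# between the brightest sustained window and a full-field white.
abl_drop = (peak - full_field) / peak
print(f"Sustained brightness drops by {abl_drop:.0%} from the brightest window to full screen.")
```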
However, we recognize the importance of understanding ABL’s impact in real-world scenarios. From our experience, ABL is most noticeable during scenes with dramatic shifts in average picture level (APL)—essentially, the overall brightness of the scene. For example, you might notice it during a hockey game when the broadcast transitions from a dark ad to the bright ice rink or when playing a video game and quickly switching from a dimly lit cave to a bright outdoor area. These are the types of situations where ABL might stand out, but they’re relatively rare in typical viewing.
In my personal experience with OLEDs, ABL isn’t usually noticeable unless you’re actively looking for it, except in extreme cases like the ones mentioned above.
For gaming, you might also encounter a related mechanism known as ASBL (Automatic Static Brightness Limiter), which dims static scenes to prevent burn-in. This is generally not noticeable unless you’re in a very bright static scene for a prolonged period (e.g., 2–3 minutes).
We’d love to expand our testing in the future to better quantify ABL’s impact on real-world content, so we can provide even more detailed information about how different TVs perform in this regard. Thanks again for your question—please let us know if there’s anything else you’d like to know!
Cheers
Dear RTINGS.com, Thank you for this incredible study. Please let me make a proposal. Heat is the most important factor leading to burn-in in OLED TVs, and the burn-in pattern is not homogeneous; it frequently impacts the lower third of the TV screen the most. Have you done an experiment with one of your new OLED TVs that includes an active cooling system (like in a PC: fans, liquid cooling) in order to verify whether or not an active cooling system might improve the lifetime of a proper, perfect OLED TV image? If not, could you please consider such an option in your ongoing study? The result of this experiment would indicate whether a regular user (who could pay 2,000 EUR for a perfect image on an OLED TV) could combat overheating in OLED TVs and extend the lifetime of the OLED TV image. Thank you
Hi tetshuo,
Thank you for your suggestion and for taking the time to share such an interesting idea with us!
You’re absolutely right that heat is a critical factor in OLED burn-in, and the idea of testing the impact of an active cooling system is intriguing. While we haven’t conducted such an experiment as part of our ongoing studies, we agree that it could provide valuable insights into how one could extend the longevity of OLED TVs and keep their performance more stable over time.
While we can’t make any promises, your suggestion is now on our radar, and we’ll definitely keep it in mind as we continue to refine and expand our testing. Exploring unconventional solutions like active cooling could open up new possibilities for our testing and help consumers make more informed decisions.
Thanks again for your thoughtful proposal! If you have any further ideas or feedback, feel free to reach out :)
Cheers,
Hello! I was thinking it would be relevant to know the noise level emitted by each projector for version 1.0 of your projector tests! :) Thanks!
Hi marseila,
Thank you very much for taking the time to write to us and share this excellent suggestion!
We’re actually planning to update our projector test bench this year, and adding a noise level test is an idea we’re seriously considering. We’ve often been surprised by how loud some models are during our tests!
Please don’t hesitate to share any other suggestions, comments, or questions with us. Your feedback is valuable to us. 😊
Best regards
Having a hard time finding any information on this… Does the Samsung S90D run short compensation cycles (pixel optimizer) every 4 hours of cumulative use? I am used to using LG products (C2 and C4). LG specifically states that it is run after 4 hours of cumulative use. Also, how do I know the S90D has completed the cycle? I have tried listening for a “click” like the LG’s but have not heard the “click”. Any information would be greatly appreciated. Edit: I know Rtings made a video on this in their burn-in videos about how this was a firmware issue; however, I believe this was on the 90/95C series? Thank you again.
Hi Slickster0387,
Thanks for taking the time to reach out! That’s a great question.
While we haven’t directly verified the Samsung S90D’s short compensation cycles as part of our ongoing tests, there’s an effective way to check if they’re functioning: look for the absence of temporary image retention (IR) on your TV.
From our testing experience, static content displayed for just a few hours can cause temporary IR. These short compensation cycles are specifically designed to address that. To check, you can display a dark gray slide on your TV—if no image retention artifacts are visible, it’s a good indicator that the cycles are running as intended.
If you’re concerned they might not be running automatically, you can also manually initiate a cycle via: Settings > General & Privacy > Panel Care > Pixel Refresh.
I hope this clears things up! Let us know if you have further questions or feedback—always happy to help!
Cheers
How many local dimming zones does the QN85D have in the 75-inch size? I’m from Chile (South America).
Hi Oshondajr_47,
Thank you for reaching out to us!
We’d love to assist you, but we’ve only tested the 65" model of the QN85D. Unfortunately, we can’t provide data about the number of local dimming zones on the 75" model since this requires manual testing.
If you have any other questions or feedback, please don’t hesitate to reach out—we’re happy to help!
Cheers
Hi Hamzah,
Thank you for taking the time to share your suggestions!
Evaluating the rainbow effect on DLP projectors is a fantastic idea, and assessing gaming performance, including input lag, is also something we’re keen to explore further.
We’re currently planning our development work for 2025, including updates to our projector test bench. I’ve added both of your suggestions to our tracking list so we can assess and prioritize them alongside other new tests and improvements in our pipeline.
Please feel free to share any additional feedback or ideas—we love to hear feedback from the community!
Cheers
Hi Rich_Mahogany
Thanks for sharing your thoughts with us!
That’s an excellent suggestion and definitely something we’ve considered. We’ve seen firsthand how firmware updates can impact a TV’s performance—sometimes with bug fixes or performance improvements, but occasionally with more noticeable changes, like variations in brightness for the same scene.
It would be ideal to track this over time; however, the main challenge is the labor involved. With our limited resources, we prioritize testing new products and updating popular models with the latest test bench procedures.
That said, we do retest TVs when firmware updates are expected—or reported by the community—to have a significant impact on performance. Expanding this retesting to more closely track other firmware updates as they come out is something we’ll continue to keep in mind as we refine our testing processes.
Thanks again for the suggestion, don’t hesitate should you have any other feedback or questions for us.
Cheers
As noted above with the power supply: for this “Portable” group, it would be nice if the sortable complete list of devices included a “battery (built-in or removable)” field, so we can see which ones are truly portable.
Hi woj027,
Thanks for taking the time to reach out with your suggestion!
It’s a great one, and we are looking to add this test to our upcoming projector test bench update. While the schedule isn’t set in stone yet, we do plan on working on improving our projector reviews in 2025 following the publication of our next TV test bench.
Don’t hesitate if you have any other suggestions or feedback for us!
Cheers
Hi RtingsUser9522080,
Thanks for taking the time to reach out with your question!
The next TV test bench is scheduled to be published around February/March 2025. We’re confident the new and updated tests will help you find the best TV for your needs! :)
Don’t hesitate should you have any other questions or feedback for us.
Cheers
Hi, thank you for the review! In TV reviews you share recommended settings for color accuracy; is this something you could do as well for projectors, please? Looking specifically for this projector :) Thanks!
Hi Abrases,
Thank you for taking the time to write to us with your suggestion!
It’s something that we are considering for a future test bench update. The challenging part of reporting calibration settings for projectors is that, beyond variance between individual units of the same projector, the settings we use calibrate the projectors for our specific viewing environment and, more importantly, our projection screen. We are considering testing on different screen types to see the impact this can have on calibration.
In the meantime, if you are looking for the most accurate colors out of the box, I would suggest giving either the “Office” or “Custom (warm)” picture modes a try, as they gave us the most accurate colors in our testing.
Don’t hesitate should you have any other suggestions or feedback for us!
Cheers
As a subscriber to RTINGS from the very beginning, I wanted to thank you very much for posting that Gradient video showing how this monitor crushes blacks under the Gradient heading and that this shortcoming cannot be mitigated by adjusting Gamma without causing posterization at the transition point. I think that this should set a precedent for future reviews where a small video such as the one you included is better than 10,000 words describing this effect. It is very clear from the video exactly what the issue is. One could say that OLED monitors may display perfect blacks, but they certainly don’t display perfect very dark grays, having the exact same black crush problem that plagues VA panels to this day.
Hi SharpEars,
Thanks for taking the time to reach out with your suggestion!
You’re quite on point about OLED displays having issues with displaying very dark grays. It currently seems to be a limitation of the thin-film transistor layer driving the OLED subpixels themselves.
I agree that adding a video in future reviews where this effect is noticeable is a good idea. Perhaps even better would be for us to develop a dedicated test for this in the future so it can be objectively scored. Is this something that you think would be beneficial going forward?
Cheers!
Thanks for your work on this issue. My TV is a glossy screen located in a small bright room with large windows (affecting daylight sports viewing) or lamp-lighted in the evening (affecting viewing of other shows). Moving stuff around to improve viewing is not an option. So I am thinking about replacing it with one of the matte screen TVs offered by Samsung, Hisense or TCL, and came across your comments on reflection handling. It’s counter-intuitive to me that a glossy screen can be better at reflection handling than a matte screen. Sure enough when I go into Best Buy, I can see reflections of the store’s lights and my own presence in ALL the glossy screens, but in NONE of the 3 brands’ matte screens, which were all represented in the store 10/25/24. Yet in the table that you provide, you identify many glossy screens as better at reflection handling than your highest-rated matte, the Samsung. My current TV works fine even though it’s old–the only issue it really has is the reflections on the glossy screen. I don’t want to replace it with say, the highly rated Samsung C4 OLED shown in your table, and then bring it home and wonder why I can still see the windows and lamp light reflected in the screen. Am I misunderstanding your rating and testing? If an old glossy screen is my only problem now, is that likely to continue with a new glossy screen? Would appreciate your thoughts.
Hi erict,
Thank you for reaching out with your question! I’m glad you asked—it’s an insightful one, and it makes sense that our results might seem counterintuitive at first glance.
Reflections on TVs can be broken down into three main types: specular reflections (mirror-like), haze reflections, and Lambertian reflections. This image captures the distinctions well: Specular, Haze & Lambertian Reflections
Here’s how each screen type handles these:
Matte Screens: Matte finishes are excellent at handling specular (mirror-like) reflections, so you won’t see clear reflections of yourself or nearby lights on the screen. This can be ideal if you can’t control lighting around your TV, as it minimizes these types of direct reflections. However, matte screens tend to struggle with haze and Lambertian reflections, which can result in an overall lightened or “raised” black level in bright environments, reducing contrast and slightly desaturating colors.
Glossy Screens: Glossy finishes typically perform very well at handling haze & Lambertian reflections, especially if they are combined with a polarizer such as is the case with LG OLEDs. This makes them excellent for overall reflection handling, though where they fall short compared to matte finishes is at handling mirror-like specular reflections, which will be readily visible unless the TV is able to fight them off with a high brightness output.
When choosing a screen finish, it comes down to what’s more distracting for you. If mirror-like reflections are the main concern, a matte screen could be a great choice. But if you’re primarily concerned about maintaining contrast and color in well-lit rooms, a glossy screen with good overall reflection management might be more suitable.
Finally, if your TV environment has a lot of ambient light, regardless of the finish, a bright TV can make a significant difference in daytime use. The extra brightness can help cut through the reflections and maintain a clear, vivid picture.
On a side note, we are currently in the midst of reworking our reflections testing for release in early 2025 :)
I hope this provides useful insights for your next TV purchase, don’t hesitate should you have any other questions or feedback for us!
Cheers
Small addition. Ideally, you’d like to measure the white level but also the black level, as the black level tends to increase at high frequencies. It could give this kind of curve on a TV with 10,000~100,000 zones (this is not a measurement but an estimate). https://i.ibb.co/28BmJCZ/IMG-00003.png Voilà.
Hi RTINGS007,
Thanks for the add-on to your suggestion, I’ve added it to the main note concerning the luminance frequency response for local dimming of LCD displays :)
Cheers!
Hi sknaumov,
Thanks for taking the time to reach out to us with your suggestion!
I think you’ll be glad to learn that we are currently working on developing tests for color & contrast performance in bright environments. It’s part of our upcoming TV test bench which we expect to publish in early 2025. It’s very likely that this will be ported to monitor reviews afterwards.
Stay tuned :)
Cheers!
Please consider highlighting any mounting issues in all your TV reviews. E.g., the QM8 85" has a 400x500 VESA pattern and few mounts will support that size - that weird size could easily add $100 to the cost of the TV. Does the back of the TV have cable mounts or places to attach zip ties to hide dangling cables? Is the power cable ‘bowtied’ and thus permanently kinked, or does it come neatly coiled in a bag? How high does the included stand go - will a center speaker fit underneath the TV?
Hi billium,
Thanks for taking the time to write to us with your suggestion. I definitely understand how unexpected mounting issues could be a let down.
We currently provide the stand height information in our “Stand” test, which for our 65" TCL QM8 was from 2.23 to 3.5 inches above the table. In our “Back” test, which provides the wall mount information for our tested TV size, the review text indicates when a TV has cable management options (e.g., cable slots, ties, etc.).
For other TV sizes, such as the QM8 85" with a 400 x 500 VESA pattern, we couldn’t verify it ourselves as we only purchase one size of each tested model, but it’s something that we could still flag in our reviews, so I’ve added this suggestion, along with the one about power cables, directly to the list that we’ll review when planning our next TV test bench update.
Don’t hesitate should you have any other suggestions or feedback for us!
Cheers
I want to know how individual scores are obtained, whether they have theoretical basis and physical significance, or if they are subjective ratings. For example, how is the brightness data for the SDR Real Scene Brightness category in SDR Brightness used to obtain the score for the Real Scene Peak Brightness category?
Hi hhhlala,
Thanks for reaching out with your question.
How individual scores are obtained depends on the test. For brightness tests, the results are all objective, based on the measurements which are then scored using a scoring spline. For SDR Real Scene Peak Brightness, we score the highest measurement we gather during our test run.
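As a purely illustrative sketch of what scoring a measurement against a curve looks like (the anchor points below are invented, and our actual scoring splines are internal):

```python
import numpy as np

# Invented anchor points mapping a brightness measurement (cd/m^2) onto a 0-10 score.
brightness_anchors = np.array([0, 100, 300, 600, 1000, 2000])
score_anchors = np.array([0.0, 3.0, 6.0, 8.0, 9.5, 10.0])

def brightness_to_score(measured_nits: float) -> float:
    """Piecewise-linear stand-in for a scoring spline; clamps outside the anchor range."""
    return float(np.interp(measured_nits, brightness_anchors, score_anchors))

print(brightness_to_score(750))  # a hypothetical real-scene peak brightness measurement
```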
Don’t hesitate should you have any other questions or if you have any feedback for us!
Cheers
Hi RTINGS007,
Thanks for taking the time to reach out with your suggestion!
It’s quite interesting, I’ve added it to our internal R&D tracking list so we can assess and prioritize it relative to other work in our pipeline when we get to working on our next TV test bench updates.
Don’t hesitate should you have any other suggestions or feedback for us.
Cheers!
You mentioned the effects of edge vs full-array lighting (number of LEDs used), brightness measured in NITs (is content dependent) and HDR (also content dependent) can overshadow the underlying technology’s power consumption. Can you incorporate those into the formula for your calculator so that we can have a reasonable chance of comparing the real power consumption of monitors from their specs?
Hi Swan17!
Thanks for taking the time to reach out with your suggestion.
It would be interesting to do so, but it may prove very tricky due to differences between specific models of monitors. Many enthusiasts have been asking us to test the power consumption of monitors as part of our review process, like we do for TVs. Is this something you’d like us to do going forward?
Cheers
Thanks for the reply! Regarding ANSI - sorry, terminology mishap there. You are not strictly measuring ANSI as in the spec, but you are measuring 50% ADL, which is what ANSI measures as well. My comment regarding the room/screen combo having a pretty big impact on the results stands.

I really have to disagree with your decision to measure off the screen and the purpose of that being to mimic “optimized theater environments.” As I mentioned in the previous post, there are multiple ways to optimize your setup further without much work. Using an ALR or grey screen would impact your ADL measurements, and many readers will be buying those screens. All of your UST measurements will be understating the performance if used with a lenticular or especially a fresnel screen. I’d expect Rtings to set up a lab-type environment and not be victim to the impacts of the environment. Even just measuring off the lens would remove a lot of the environment. There’s just no way to know if your room is a good match to your readers. It could be better, it could be worse. The way your room reflects light could be different for a UST than a long throw based on how each type disperses light into the room. The reason we as reviewers typically want to measure in the most optimal conditions is that is how you level the playing field and tell the reader what is absolutely possible. If a reader has a near perfect room with black velvet, then your 50% ADL measurements make it seem like there is very little difference between many projectors - which is just not true.

Back to the lab-type environment. It’s not hard to set this up; you can do it in a couple different ways. Either use a velvet-lined tube or a velvet tent. The tent may be the easiest for alignment with various types of projectors and could be made out of PVC and triple black velvet from a hobby store. You could also just get velvet for your whole room. Or, getting a velvet cover for the screen that can drop down would substantially cut down on light pollution in the room and get you near-ideal measurements.

FOFO contrast - if you measured FOFO on a projector that doesn’t automatically dim for a full black field, with and without a pixel in the corner(s), you’d find the measured contrast would be either identical or within a few tenths of a percent of each other. FOFO with a few pixels is FOFO. If you don’t want to call it FOFO then call it “0% ADL - 1px”. Doing this will speak to the performance of under 1% ADL.

What I believe would be awesome for you guys to do is to instruct readers on how a projector’s contrast is influenced by the environment and screen. Projection Dream did this in a fantastic study, but it is only known to enthusiasts in the community. You guys have a platform to show what a projector’s ideal contrast is, how it performs in a living room, with a grey screen, an ALR screen, a lenticular screen, etc. (for those projectors that work with such screens, of course). Projection Dream documented this with ADL graphs showing the impacts of the environment. If you made a video and an article educating the readers, then they could make much more informed decisions. If they have a white living room, they could see how a white screen would essentially make ADL contrast differences null. If they had a perfect black velvet room, they could see the inverse. There’s no one better than you guys to take a scientific approach to this and really help members of the community understand the relationship of room vs projector vs screen.
https://projectiondream.com/en/contrast-projector-environment/
Hi RtingsUser1708119,
While our current projector test bench is limited in scope, we are looking to expand it and improve our tests going forward, so we really appreciate you taking the time to share all your thoughts and insights with us about how we could improve our testing environment and contrast tests.
It’s given us a lot to consider going forward, and while we can’t make any guarantees about the timeline or scope of changes to our current projector test bench, we’ll be keeping all of this in mind when we get back to working on our projector test bench.
Cheers!