It appears the X90L actually came in first place for the new HDR pre-calibration accuracy test results. So even if I luck out and a financial miracle comes my way, this makes me question whether or not I should upgrade. How come even the best HDR pre-calibration and post-calibration accuracy have higher color dE than the best SDR pre-calibration and post-calibration accuracy?
Check out post #1905 of the AVS Forum Sony calibration thread ( https://www.avsforum.com/posts/63905930 ). While I am still left with questions, I appreciate the explanation of how certain picture modes are designed for certain use cases. I thought this might be of interest to X90L owners.
So if at some point I decide to try those settings, (1) for SDR, would I leave the two-point white balance and color management controls at their default values, and (2) for HDR, would I leave the ten-point white balance and color management controls at their default values?
On the calibration and settings page, I am just confirming that I am reading things correctly: (1) for HDR, you only provided two-point white balance settings, and (2) for SDR, you only provided ten-point white balance and color management settings.
It’s very possible your older TVs were oversaturating colors. The Costco variant should be identical to the normal model. In their most accurate picture modes, colors will look the same on those three TVs with content in Rec.709 SDR outside of small differences with pre-calibration accuracy.
I started out in music. As a musician, I’ve always respected the original intent of composers, arrangers, instrumentalists, vocalists, and recording engineers. I wish to approach the world of video the same way, even if it means I’ve been viewing sRGB/Rec.709 the wrong way for years and have to adapt. I think this experience is similar to when I got transparent, accurate headphones such as the Sennheiser HD 600 and HD 560S (2022 revision). At first, I thought their mids were a little strong, but I knew I couldn’t stand the rumbling of bass-heavy sounds which are supposed to be quiet, such as refrigerators, HVAC systems, etc., when they’re part of a movie. So I knew I was on the right path and just had to adapt. I appreciate straight, transparent, scientific answers. That’s all I’ve been looking for since I got this TV, while trying to determine whether the issue was the TV or me being used to inaccurate playback for many years, like I was with headphones and speakers.

For a $2,000 85-inch model, while the Hisense U8K had lower color dE, it was lacking in several other areas the X90L excels in. While the Sony X93L, Sony X95L, and TCL QM850G and QM851G are brighter and have more dimming zones, they didn’t excel in areas the X90L excels in. While the Sony Bravia 7 and Bravia 9 have improved upon the X90L, they’re out of my price range. While we don’t yet know what 2025 will bring, the X90L so far has checked as many boxes in important areas of performance for me as possible for the price, including pre-calibration color accuracy, PQ EOTF tracking, gradient handling, and sustained full-screen brightness, keeping the Sony 85-inch X90L in that sweet spot of performance, size, and price.

Even if 2025 improves in these areas for a similar price, I think it would be too late to sell mine used and get enough money for an upgrade, being that I bought it in November of 2023. I know if I had waited a few more years, I might have been able to go OLED or mini-LED, but I’ve been collecting a lot of 4K Blu-rays, didn’t have a TV compatible with HDR10 and Dolby Vision, and wanted to have some kind of experience with the formats. Maybe after I get my audio situation straightened out, I’ll save for an OLED or mini-LED and give my X90L away to a friend or family member, or move it to another room in my house if I have the space to do so.
Don’t apologize! I’ll do my best to answer a few of the concerns in your last couple of comments. The X90L is a very good TV, and it has no problem displaying the entirety of the Rec.709 color space used for SDR content. There are some very slight inaccuracies with certain colors, but they’re so minor that even keen-eyed observers likely wouldn’t notice the difference. You can see this in the pre-calibration section of our review. Every unit is a bit different in its pre-calibration results, but Sony is known for their accuracy, so you should have very similar results on your X90L. The settings you mentioned are correct if you want the most accurate image without a full-on calibration. If you want the most accurate colors possible, you’ll need to get the TV calibrated by a professional. This can be pretty expensive, but it’s definitely cheaper than buying the tools and learning to do it yourself.
Like you said, the X90L is capable of displaying a wide range of colors, but if you’re after an accurate image, you shouldn’t be forcing DVDs and Blu-rays to the TV’s native color gamut. SDR content isn’t mastered in the wider color space like HDR content is, so naturally colors won’t be as vibrant in SDR as they are in HDR. The whole point of modern TVs having wide color gamuts is for HDR content. Pretty much any decent TV will cover the entirety of the Rec.709 color space. Yes, the Live Color setting definitely oversaturates colors, but if you’d rather have a more colorful image, there’s no harm in using it.
I’ve never used a X800M2, so I can’t comment on that, but as long as you have the UB820’s Color Mode set to YCbCr (Auto) and the Deep Color Output set to Auto (12-bit priority), it should be outputting DVDs and Blu-Rays in 8-bit, HDR10 content in 10-bit, and some Dolby Vision content in 12-bit. I hope that helps clear some things up!
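To put those bit depths in perspective, here is a minimal sketch of generic video math (not anything specific to the UB820, X800M2, or X90L) showing how many code values each bit depth provides per channel and where the nominal limited-range black and white levels sit at each depth:

```python
# Illustrative only: code values per channel at 8/10/12-bit, plus the nominal
# limited-range ("video level") black and white codes. At 8-bit these are 16
# and 235; at higher depths they scale by 2^(bits - 8), following BT.709/BT.2100
# narrow-range conventions. Not a description of any player's internal processing.

def video_levels(bits: int):
    total = 2 ** bits            # number of code values per channel
    black = 16 << (bits - 8)     # limited-range black
    white = 235 << (bits - 8)    # limited-range nominal white
    return total, black, white

for bits in (8, 10, 12):
    total, black, white = video_levels(bits)
    print(f"{bits:>2}-bit: {total:>4} codes per channel, limited range {black}-{white}")

# 8-bit: 256 codes (16-235), 10-bit: 1024 codes (64-940), 12-bit: 4096 codes (256-3760)
```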
Thank you. Red, green, aqua, and cyan don’t look as good as I remember them looking on CRTs, plasmas, and 1080p LCDs with a native sRGB/Rec.709 color space, unless those sets were oversaturating these colors. When forcing my laptop to show these single colors as my desktop background, I have the same undersaturation issues with the laptop’s HDR output turned off, outputting RGB full-range, and the X90L in the Custom picture mode with the most accurate settings. Could the Costco variant be inferior in certain areas? Also, while I can’t afford a Samsung S90D, Samsung S95D, or Sony A95L at this time, being that they’re QD-OLEDs, will Rec.709 appear more saturated in their most accurate modes with their most accurate settings than on the X90L?
I apologize if I came across as offensive or annoying. I can’t justify spending hundreds of dollars on a colorimeter and measurement software I’d rarely use. If that’s the only way I can determine whether my X90L is correctly displaying red, green, blue, and cyan when playing sRGB/Rec.709 sources in the Custom picture mode with Color set to 50, Color Temperature set to Expert 1, Live Color set to Off, Color Space set to Auto, and everything in the Adv. Color Adjustment submenu at its default values, I’ll see if I can find a friend or close acquaintance who has such tools so I wouldn’t have to pay too much to have this checked out. If I can’t find anyone, I’ll just have to wait for God to somehow provide the answer for me in His own time and in His own way.
When we tested the X90L, we had HDR Mode, HDMI Video Range, and Color Space set to Auto. We always leave these types of settings on Auto during our testing, since we want the TV to choose the proper settings depending on the content. The only time we manually change the Color Space setting is when a TV doesn’t properly clamp to the correct color space of the content it detects, but that’s very rare. On your UB820, you’ll want to have the Color Mode set to YCbCr (Auto) and the Deep Color Output set to Auto (12-bit priority). I’m pretty sure these are the default settings, since I don’t remember changing them on my UB820. It’s best to leave these settings on Auto so you don’t have to make adjustments based on what you’re watching.
I personally wouldn’t recommend forcing SDR content into BT.2020, as doing so will give you inconsistent, inaccurate, and very oversaturated colors. The X90L doesn’t undersaturate colors in SDR. 4K Blu-rays are mastered in a wider color space with HDR, so colors tend to pop more and look more vibrant. If you’re regularly switching between DVDs/Blu-rays and 4K Blu-rays, you’re probably just noticing the difference between SDR and HDR.
If you’re after the most accurate image possible, you’ll want to leave settings like Live Color disabled. However, if you want more vibrant colors when watching SDR content, I’d recommend using the Live Color setting on Low or Medium instead of forcing Rec.709 content into BT.2020. I have an X90L in my living room and I’ll sometimes use the Live Color setting when I want more vibrant colors. I find that it works well without oversaturating colors so much that they look unnatural. I hope that helps.
I want the combination of accuracy and potential. My friend who is getting into photography likes the Live Color Medium setting the best, but a few people at AVS claim it oversaturates things. So I wanted to continue pursuing this until I got extra confirmation from experts such as more people at AVS, well-respected YouTubers, you guys, etc. I would never use Live Color for HDR10 and Dolby Vision sources because that would oversaturate things, being that they’re already taking advantage of the X90L’s color potential. Also, I strictly use my Panasonic DP-UB820P-K for 4K Blu-rays (1) so I don’t have to manually switch Dolby Vision on and off, and (2) for increasing the Dynamic Range Adjustment setting when playing HDR10 sources in brighter environments. I strictly use my Sony UBP-X800M2 for 1080p Blu-rays with video output resolution set to 1080p, SACDs with video output resolution set to 720p, and DVDs with video output resolution set to 480i/576i. Its YCbCr/RGB output mode is set to YCbCr (4:2:2) and Deep Color output is set to Off. Both my Panasonic and my Sony claim to be outputting 12-bit even with these settings.
Edited 1 month ago: Forgot to mention that Live Color should be Off for HDR10 and Dolby Vision sources.
I know the X90L is perfectly capable of high quality, saturated colors because I am able to achieve them when either forcing the X90L into BT.2020 via the Video Signal submenu or setting Live Color to Medium. I know Live Color is supposed to be off in order to match what the distributor of the source wants us to see. But what about mapping everything to the X90L’s native color space? When Rtings tested the X90L with SRGB/Rec.709 sources, was everything in the Video Signal submenu set to Auto, such as HDR Mode, HDMI Video Range, and Color Space? If so, when playing 1080p Blu-rays and SD DVDs, should my players’ HDMI modes be set to YCbCr (4:2:2) and Deep Color modes be set to Off? If so, then that would mean no SRGB/Rec.709 undersaturation issues exist with the X90L and I would be the one who needs to adjust to and accept how 1080p Blu-rays and SD DVDs appear compared to 4K UHD Blu-rays.
Edited 1 month ago: Added the part about color mapping to the X90L's native color space
When using the X90L as a monitor for my laptop, I use the Graphics picture mode. Being that the Graphics picture mode disables local dimming and leaves all the LED backlights constantly running, how much could this shorten the life of the LEDs or the power supply for the backlight system?
Just a thought regarding the X90L and Dolby Vision. Being that the X90L is limited to a peak brightness of 1,250 nits, is there any chance Dolby is preserving detail by mapping everything to lower luminance levels to avoid clipping? If so, I can accept this, and if I want 4K Blu-rays to be brighter, I should switch to HDR10 which will let me do my own tone mapping with my Panasonic 820 player and X90L’s HDR Tone Mapping options.
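For what it’s worth, the general idea behind “mapping to lower luminance to avoid clipping” can be illustrated with a toy roll-off curve. This is only a hand-written sketch, assuming a 4,000-nit master shown on a roughly 1,250-nit panel; it is not Sony’s or Dolby’s actual tone-mapping math, and the knee point is an arbitrary choice for the example.

```python
# Toy highlight roll-off: pass luminance through unchanged up to an arbitrary
# "knee", then compress everything between the knee and the mastering peak into
# the headroom left below the panel's peak. Real tone mapping (the X90L's HDR
# Tone Mapping modes, or Dolby Vision's dynamic metadata) is far more sophisticated.

def roll_off(nits, panel_peak=1250.0, master_peak=4000.0, knee=800.0):
    if nits <= knee:
        return nits                                  # shadows and midtones untouched
    x = (nits - knee) / (master_peak - knee)         # 0..1 position above the knee
    shoulder = 2 * x / (1 + x)                       # simple soft shoulder, reaches 1 at the master peak
    return knee + (panel_peak - knee) * shoulder     # master peak lands exactly at panel peak

for n in (100, 500, 1000, 2000, 4000):
    print(f"{n:>5} nits in the master -> {roll_off(n):6.0f} nits on the panel")
```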
It’s hard to say. Assuming both have 100% coverage of Rec. 709 and are displaying the colors accurately, they’d look the same.
With all enhancements disabled, Color set to 50, and everything in Video Signal set to Auto on the X90L, when playing the test screens on Sony 1080p Blu-rays and Sony 4K UHD Blu-rays, the single color-changing squares showing BT.2020 inside BT.2020 on the 4K UHD discs look better than the colors in the multi-colored pattern on the 1080p discs, and also better than single background colors generated from my laptop when sending sRGB.
What Philips is doing here is assuming that the colors in Rec.709 content are duller than what the content creator intended due to the limited range of colors in that space. As far as accuracy goes, such a feature would be less accurate than if the TV were set up to display only Rec.709 colors, but it would be more vibrant. The problem with features like this is that even if it’s “AI” powered, it’ll still get it wrong at least some of the time, resulting in oversaturated colors that are most definitely not what the original creator intended.
At the end of the day it comes down to personal preference. While we almost always disable features like that to ensure that content is as close as possible to the source, if you prefer a more vibrant image then by all means enable them. It’s your TV, you should set it up however you prefer.
Will Rec.709 look better on an older TV designed for Rec.709 than on a newer TV designed for BT.2020?
Doesn’t everything from 2:38 to 3:03 in this video ( https://www.youtube.com/watch?v=T_n1bomSkkk&pp=wgIGCgQQAhgB ) prove what I keep saying about SDR/Rec.709 looking better on older TVs, being that they’re designed to display SDR/Rec.709 signals, and that it could look as good on newer TVs with standards conversion, such as when setting Live Color to Medium only when playing SDR/Rec.709 sources? If Vincent thought Philips was wrong, he would have said this strays from how the source was mastered. In this case, by not applying any color remapping on newer TVs, wouldn’t you be experiencing something inferior to what the people who mastered the source want you to be experiencing?
In the X90L’s Custom picture mode and the A95L’s Professional picture mode, with Color set to 50, Live Color set to Off, and HDMI Color Space set to Auto, when playing SDR/BT.709, would red, green, and cyan still appear significantly more saturated on the A95L’s QD-OLED panel, or would QD-OLED’s stronger saturation only be noticed when inputting raw HDR/BT.2020?
While playing SDR/BT.709 sources in the Custom picture mode, I decided to experiment with combining Live Color with the various Color Space options. Live Color is not available when the Color Space is set to DCI. This suggests to me that Live Color is more sophisticated than raising color saturation and involves color remapping, which I was posting about a few days to a week ago.

For SDR/BT.709 sources, when Color Space is set to Auto and Live Color is Off, red, green, and cyan appear undersaturated. Blue has a touch of green in it to make it brighter. Setting the Color Space to Adobe RGB undersaturates red no matter how Live Color is set. When setting Live Color to High, red appears undersaturated no matter what Color Space you choose, and blue appears purple when Color Space is set to sRGB/BT.709. When Color Space is set to DCI, Live Color is not available; green (0/255/0) and cyan (0/255/255) are more saturated than when Color Space is set to Auto and Live Color is Off, but still not as saturated as the display is capable of when displaying raw BT.2020. When setting the Color Space to BT.2020 while playing SDR/sRGB/BT.709 sources, all colors are oversaturated even when Live Color is off, even when one color has an RGB value of 255 while the other two values are 128.

When playing SDR/sRGB/BT.709 sources, I get the best results in the Custom picture mode with Color Space set to Auto and Live Color set to Medium. So unless my source isn’t outputting something properly, that’s the only way I can get the results I’m looking for while playing SDR/sRGB/BT.709 sources. I haven’t yet experimented with the Live Color options when playing 4K UHD Blu-ray discs, because some of them utilize the full BT.2020 color gamut while others only utilize BT.709 inside BT.2020, and I’m concerned that Live Color might oversaturate certain moments during 4K UHD Blu-ray playback.
I know I’ve brought this up before. I just feel like something isn’t adding up. One thing I forgot to mention is that when playing a source recorded using the BT.2020 color space, the colors are as saturated as when playing BT.709 sources on pre-4K LCDs, plasmas, and CRTs. However, when playing sources recorded using the BT.709 color space on the X90L, the colors aren’t as saturated as when playing BT.709 sources on pre-4K displays, as though the X90L isn’t properly mapping BT.709 sources.

I don’t know how many of you are familiar with creating colors from RGB values, but here’s a description of the effect I’m getting. When my source is HDR10/BT.2020 and the X90L’s Video Signal > HDR Mode is set to either Auto or HDR10 and Video Signal > Color Space is set to either Auto or BT.2020, red looks like 255/0/0, green looks like 0/255/0, and cyan looks like 0/255/255. However, when my source is SDR/BT.709 and the X90L’s Video Signal > HDR Mode is set to Auto or Off and Video Signal > Color Space is set to Auto or BT.709, red looks like 255/128/128, green looks like 128/255/128, and cyan looks like 128/255/255. I have Color set to 50, Live Color Off, and everything in Adv. Color Adjustment set to its default values. Is SDR/BT.709 supposed to appear this way so it’s distinguishable from HDR10/BT.2020? I know the colors on my laptop’s internal display look like I’d expect when HDR is turned off.
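To put rough numbers on that feeling, below is a minimal sketch of the standard Rec.709-to-BT.2020 conversion (the 3x3 linear-light matrix from ITU-R BT.2087). It shows that a Rec.709 primary, re-expressed in BT.2020 coordinates, is no longer a pure primary at all, which is exactly why accurately displayed Rec.709 red, green, and cyan can never look as saturated as the same nominal values in a BT.2020 signal. The 255/128/128 description above is a perceptual analogy; the actual conversion happens on linear-light values, not gamma-encoded 8-bit codes.

```python
# Rec.709 -> BT.2020 conversion in linear light, using the standard 3x3 matrix
# from ITU-R BT.2087 (coefficients rounded to four decimals). A Rec.709 red of
# (1, 0, 0) lands well inside the BT.2020 gamut, far from BT.2020's own red primary.

M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_bt2020(rgb):
    return [sum(M_709_TO_2020[i][j] * rgb[j] for j in range(3)) for i in range(3)]

for name, rgb in [("red", (1, 0, 0)), ("green", (0, 1, 0)), ("cyan", (0, 1, 1))]:
    r, g, b = rec709_to_bt2020(rgb)
    print(f"Rec.709 {name:<5} -> BT.2020 ({r:.3f}, {g:.3f}, {b:.3f})")

# Rec.709 red   -> BT.2020 (0.627, 0.069, 0.016)
# Rec.709 green -> BT.2020 (0.329, 0.920, 0.088)
# Rec.709 cyan  -> BT.2020 (0.373, 0.931, 0.984)
```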
I am in the process of replacing my old Sony XBR75X940D TV. I have narrowed it down to two 85" Sony models: the Bravia 7 and the X90L. The price difference between the two is $1,600 ($4,300 vs. $2,700) in Toronto, Canada. My viewing room is in my basement with lighting control and a narrow viewing angle. My question: is the Bravia 7 worth $1,600 more in my case? I do understand the Bravia 7 is the better TV of the two, as I have read your reviews.
If it’s against Rtings forum policy for those who aren’t Rtings staff members to answer this guy’s question, please delete my reply. Get the best you can afford. The best-case scenario would be owning both an 85-inch Bravia 9 QLED for daytime use and extra peak brightness, and a 77-inch A95L QD-OLED for extra color richness and deep black levels for use in less ambient light. I know I can’t afford owning both. What are your budget and size limits? Do the best you can.

If you have a lot of ambient light, your best choices from Hisense are the U8N, U7N, and U6N. From Sony, the Bravia 9, Bravia 7, and X90L. From TCL, the QM8, QM7, and Q6. If you don’t have to deal with a lot of ambient light, but your budget is limited, your choices from Hisense are the U8N, U7N, and U6N. From Sony, the X90L, X85K, and Bravia 3. From TCL, the QM8, QM7, and Q6. If you don’t have to deal with a lot of ambient light and your budget isn’t too limited, OLED might be a better option for you. From LG, you have the G4, C4, and B4. From Samsung, the S95D and S90D. From Sony, the A95L, A80L, and Bravia 8.

As an AV enthusiast who prefers accuracy and sometimes watches movies with my drapes and windows open on mild days, but has a $2,000 limit, I went with the 85-inch Sony X90L. I know there is better out there, but in order to get started in the world of HDR and Dolby Vision, it falls within my sweet spot for performance, size, and price. I hope to make enough money in a few years to be able to sell my X90L locally and upgrade to either a QLED or QD-OLED. But I’m not in any rush like I was when I had my plasma, because the X90L gives me a satisfactory experience, at least for starting out with these formats, and will probably be my daily driver for at least a few years. So I’ll still be at X90L threads on a regular basis. Good luck!
I apologize for hijacking this thread like I have been. As an enthusiast, for the X90L’s price, I prefer giving up a little contrast for its accuracy in so many areas. If one of the similarly priced TVs for 2024 combined the X90L’s strengths with better brightness and better minimization of blooming, I would have sold mine to someone local and upgraded. But based on my budget and preferences and the X90L’s strengths, I still have no buyer’s remorse. While other similarly priced TVs for 2024 beat the X90L in peak brightness at smaller window sizes and minimization of blooming, there are other areas in which the X90L beats similarly priced TVs for 2024, such as pre-calibration color accuracy, PQ EOTF tracking, and possibly gradient handling. Being that Sony has carried the X90L over from last year, I think its benefits make it relevant in 2024. Would the reviewers consider adding similarly priced TVs for 2024 to the X90L’s “Compared to Other TVs” section, as well as adding the X90L to the same section of reviews for similarly priced TVs for 2024?
In the SDR Custom picture mode, how come Sony chose a default Gamma of -2 instead of 0? And in the SDR and HDR Custom picture modes, how come Sony chose a default Contrast of 90 instead of Max, yet for all the Dolby Vision picture modes, Sony chose a default Contrast of Max?
I bought the Samsung S90C for just slightly more than the Hisense/TCL (they are both $999 at Best Buy) and couldn’t be happier. It is the first big-screen TV (over 40") we have owned where my wife notices the difference in the picture (we got it to replace a 2019 TCL S535 QLED with a max brightness of around 350 nits). While I’m sure both TVs are exceptional values for the price, they don’t match the out-of-box accuracy of the Samsung, and it is plenty bright for us day or night; no direct window light hits the TV, and we have blinds on the window and a covered patio outside our patio door. We are sure we have an outstanding TV that will give us many years of viewing, and whenever it does give out, who knows where the state of the art will be in 5+ years (we have a warranty that covers it for 5 years, including “burn-in” if necessary).
I wish Samsung would acquire licenses for Dolby Vision and passing Dolby TrueHD and all the DTS audio formats. Not everyone needs the processing of the Sony A95L. If Samsung adds support for these things in their QD-OLEDs, I’ll bet you could get a 77-inch for $3,500 or less. Right?
Here’s the other thing to think about with all the brightness wars: if a movie or TV show is mastered at 600, 1,000, or even 1,500 nits (until very recently a lot were mastered for 600 nits), then the extra brightness doesn’t really do you much good, because the static HDR metadata or the continuous metadata (Dolby Vision) will have your TV rendering the video at the intended brightness. So, unless you play with the TV’s various adjustments, all those eye-searing mini-LEDs will not be melting your eyeballs.

Personally, I’ve been thrilled with my S90C, which tops out somewhere in the 1,500-nit range (depending on which peak brightness measurement you look at and how long it can sustain it). Having now experienced OLED-quality displays, I am sure I will find it difficult to go back to something like mini-LEDs UNLESS there is a wave of new content over the next few years mastered at higher brightness levels (say 2,000 nits and above). Right now, I think most content is being mastered in the 600-1,000 nit range, which caters to mainstream TVs that can reach those brightness levels. Good luck finding that perfect TV for your needs.

P.S. If you like the TV and it fits your and your family’s needs, stay off the forums, because all you’ll hear is everything that’s wrong with your TV. It amazes me how many bad things people find if they look hard enough. I am also appalled by how many people plan to abuse their TV in hopes of “breaking it” under the extended warranty plans in order to get a newer model for “free”.
I’m not trying to offend anyone. I’m speaking for the home theater enthusiasts who are looking for performance within this price range. After reading https://www.rtings.com/tv/tools/compare/hisense-u8-u8n-vs-tcl-qm8-qm851g-qled/50406/60898?usage=1&threshold=0.10 , the only advantages I see over my Sony X90L are higher brightness and deeper black levels. Color performance and PQ EOTF tracking are things I expect for $1,000 and up. The 2024 Hisense U8N and TCL QM8 are fine if all you care about are visibility during the day and the deepest black levels at night. But as a home theater enthusiast, I still have no buyer’s remorse about getting my Sony X90L. I don’t mind its blooming. When it blooms, black levels aren’t as bad as when local dimming is disabled. It is still bright enough for me to use in bright surroundings. I chose the X90L and think it is at least as relevant in 2024 as it was in 2023 because of its similar price. If you want the brightness and black levels of the 2024 Hisense U8N and TCL QM8 and the accuracy of the Sony X90L, then you’ll have to pay more for the Sony Bravia 7 or Bravia 9. This is where we stand in 2024. I think we had better out-of-the-box performance in 2023 with the Hisense U8K and Samsung S90C. But at least Sony decided to carry the X90L over, giving 2024 some hope for purists on a budget looking to upgrade.
While the Sony X90L can’t get as bright as the 2024 TVs, it measures better than the 2024 mini-LEDs when it comes to pre-calibration color accuracy and PQ EOTF tracking. Being that it is still being manufactured, I think it is relevant in 2024 because of its similar price to the Hisense U8N and the 2024 TCL QM851G. Unless the 2024 QM851G can equal the X90L’s PQ EOTF tracking and pre-calibration color accuracy, if you’re looking for something in that price range, don’t rule out the Sony X90L just yet. Although in the event the QM851G DOES combine the X90L’s accuracy with the peak brightness of mini-LED, then I might consider selling my X90L to someone locally and replacing it with the QM851G. But I’m not getting my hopes up until the numbers are published.
If money were no object for me, I’d have an 85-inch Bravia 9 QLED mini-LED and a 77-inch A95L QD-OLED on wheels. I’d roll one into my viewing area and the other off to the side depending on my lighting conditions and what I’m watching. That’s perfection, right? But to tell you the truth, how many of us can afford to do that? I know I can’t. But it’s still fun to dream, right? Or if you need color accuracy yet are willing to give up the small peak highlights of mini-LED and the pixel-level dimming of OLED, the X90L from 55 inches to 98 inches isn’t a bad alternative.

I’ve used everything, including crappy TN panels on laptops, CCFL-backlit LCD TVs, LED-backlit monitors, a plasma, CRTs, and an AMOLED panel on my current laptop. After seven months with my 85-inch X90CL, I’d say its performance is a mix of the LED-backlit LCDs, plasmas(?), CRTs, and AMOLEDs I have used. With Auto Local Dimming and Peak Luminance set to High, the elevated black levels caused by blooming are nowhere near as high as when disabling Auto Local Dimming.

Here are MY current settings. My main objective is to play Blu-rays as they are mastered while maintaining the X90CL’s potential. I am not a gamer. I only stream when audio and video performance aren’t important. Depending on your environment, configuration, preferences, etc., they might not work for you.
Picture Settings
Category - Setting - SDR&Rec.709/SRGB - HDR10&BT.2020 in Normal Environment - HDR10&BT.2020 in Excess Ambient Light - Dolby Vision in Normal Environment - Dolby Vision in Excess Ambient Light
Basic - Picture Mode - Custom - Custom - Custom - Dolby Vision Dark - Dolby Vision Bright or Vivid
Basic - Auto Picture Mode - Off - Off - Off - Off - Off
Ambient Optimization Pro - Light Sensor - Off - Off - Off - Off - Off
Ambient Optimization Pro - Auto Luminance Level - Off - Off - Off - Off - Off
Ambient Optimization Pro - Auto Tone Curve - Off - Off - Off - Off - Off
Brightness - Brightness - Max - Max - Max - Max - Max
Brightness - Contrast - 90 - 90 - 90 - Max - Max
Brightness - Gamma - -2 - 0 - 0 - 0 - 0
Brightness - HDR Tone Mapping - Unavailable - Gradation Preferred - Unavailable - Off - Brightness Preferred
Brightness - Black Level - 50 - 50 - 50 - 50 - 50
Brightness - Black Adjust - Off - Off - Off - Off - Off
Brightness - Adv. Contrast Enhancer - Off - Off - Off - Off - High
Brightness - Auto Local Dimming - High - High - High - High - High
Brightness - Peak Luminance - High - High - High - High - High
Color - Color - 50 - 50 - 50 - 50 - 50
Color - Hue - 0 - 0 - 0 - 0 - 0
Color - Color Temperature - Expert 1 - Expert 1 - Expert 1 - Expert 1 - Expert 1
Color - Live Color - Medium - Off - Off - Off - Off
Clarity - Sharpness - 50 - 50 - 50 - 50 - 50
Clarity - Reality Creation - Off - Off - Off - Off - Off
Clarity - Resolution - Min - Min - Min - Min - Min
Clarity - Random Noise Reduction - Off - Off - Off - Off - Off
Clarity - Digital Noise Reduction - Off - Off - Off - Off - Off
Clarity - Smooth Gradation - Off - Off - Off - Off - Off
Motion - Motion Flow - Off - Off - Off - Off - Off
Motion - Smoothness - Min - Min - Min - Min - Min
Motion - Clearness - Min - Min - Min - Min - Min
Motion - Cinemotion - Off - Off - Off - Off - Off
Video Signal - HDR Mode - Auto - Auto - Off - Unavailable - Unavailable
Video Signal - HDMI Video Range - Auto - Auto - Auto - Auto - Auto
Video Signal - Color Space - Auto - Auto - Auto - Unavailable - Unavailable
Adv. Color Adjustment Settings
Category - Setting - Option
Basic - R Gain - Max
Basic - G Gain - Max
Basic - B Gain - Max
Basic - R Bias - 0
Basic - G Bias - 0
Basic - B Bias - 0
Color Gamma Adjustment Points 1-10 - R Offset - 0
Color Gamma Adjustment Points 1-10 - G Offset - 0
Color Gamma Adjustment Points 1-10 - B Offset - 0
Per Color Adjustment (All Colors) - Hue - 0
Per Color Adjustment (All Colors) - Saturation - 0
Per Color Adjustment (All Colors) - Lightness - 0
Sound Settings
Setting - Normal Listening - Clarity
Sound Mode - Standard - Standard
Advanced Auto Volume - Off - Off
Balance - 0 - 0
TV Position - Tabletop Stand - Wall Mount
Sound Customization Settings
Setting - Option
Surround - On
Surround Effect - Max
Equalizer (All Bands) - 0
Voice Zoom - 0
Volume Level Settings
Setting - Option
Volume Offset - 0
Dolby Dynamic Range - Standard
Good luck.
Edited 8 months ago: Edited labels in my settings chart.
While I can’t afford a 77-inch A95L and am glad my 85-inch X90CL is still current, I would have liked to see Sony replace the A95L with a model using Samsung Display’s third-generation QD-OLED panel, and the X90L with a model with more granular dimming even though it’s not mini-LED. I would have sold my X90CL for $1,500 to a local buyer and replaced it. The Bravia 3 is more like the X80K and wouldn’t make a good replacement for the X90L, hence my previous sentence.
As I was turning Live Color off to prepare for watching a title on 4K UHD HDR10 BT.2020 Blu-ray, to make sure nothing got oversaturated, it dawned on me that maybe SDR Rec.709 is not supposed to look like HDR BT.2020 no matter how hard you try to get it to do so. Maybe reds and greens are supposed to be less saturated while blues have a sprinkle of green in them. In other words, a different color scheme.
I discovered that there are plenty of scenes well-suited for HDR10 and Dolby Vision calibration on the Warner Bros. December 2018 4K UHD Blu-ray release of 2001: A Space Odyssey ( https://www.gruv.com/product/2001_a_space_odyssey_4k_ultra_hd_blu_ray_uhd ), particularly the very beginning of chapter 21, the very beginning of chapter 25, and various sections throughout chapter 31. This disc reveals to me (1) the full potential of and differences between HDR10 and Dolby Vision, and (2) that there is nothing wrong with the X90L’s HDR10 and Dolby Vision performance.

With Auto Local Dimming off, I was able to analyze everything at the pixel level. I discovered that Dolby Vision Dark has the most distinction between luminance levels up and down the grayscale compared to Dolby Vision Bright and HDR10, and actually achieves higher peak brightness in highlights at the pixel level than Dolby Vision Bright when something is mastered that way. When setting Auto Local Dimming back to Max, I notice Dolby Vision Dark mildly dims the backlight while keeping everything at the pixel level as I described. While I wish the backlight wouldn’t dim in Dolby Vision Dark and would like Sony to fix this in a future firmware update, the peak brightness in highlights is so close to Dolby Vision Bright and HDR10 that I’d rather use Dolby Vision Dark to maintain that extra distinction between luminance levels up and down the grayscale.

Now that I understand what’s going on in the world of HDR, I don’t mind when things don’t always get bright enough for the X90L to perform at its full potential, because for the purpose of mood and effect, they’re not always supposed to. Although if I want to accommodate daytime viewing or any HDR10 and Dolby Vision content which doesn’t reach that same peak brightness in highlights by default, then in order to maintain the BT.2020 color space and peak brightness in highlights while raising the rest of the grayscale without clipping anything, when playing Dolby Vision I have to choose Dolby Vision Bright, set HDR Tone Mapping to Brightness Preferred, and set Adv. Contrast Enhancer to High, and when playing HDR10, set HDR Tone Mapping to either Off or Gradation Preferred and Adv. Contrast Enhancer to High. Setting Video Signal > HDR Mode to Off maintains the BT.2020 color space and raises the rest of the grayscale, but dims the peak brightness in highlights. Disabling my player’s HDR output maintains the peak brightness in highlights and raises the rest of the grayscale, but changes the color space to BT.709, kind of like playing a 1080p Blu-ray or SD DVD. Being that 4K UHD Blu-rays are mastered in the BT.2020 color space, I’d rather only enable Live Color when playing 1080p Blu-rays and SD DVDs.
Just curious. When in the Game and Graphics picture modes with all brightness, color, clarity, and motion enhancements disabled, what is the vibration that appears on images such as still text?
Here are my solutions for playing HDR10 and Dolby Vision. If an HDR10 source doesn’t reach the X90L’s peak brightness, I think I’m going to take the brightest scene in the movie and set HDR Tone Mapping to Brightness Preferred and Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in the brightest area of the screen without clipping. So far, the only HDR10 title which I don’t think has actual HDR10 metadata is Lionsgate’s December 2017 4K UHD Blu-ray edition of “Terminator 2: Judgment Day.”

But for HDR10 sources with 1,000 cd/m² metadata, an HDR Tone Mapping setting of Off without the Adv. Contrast Enhancer should reach the X90L’s peak brightness during the brightest scene, provide proper PQ EOTF tracking, and avoid clipping. For HDR10 metadata above 1,000 cd/m², an HDR Tone Mapping setting of Gradation Preferred without the Adv. Contrast Enhancer should reach the X90L’s peak brightness during the brightest scene, provide proper PQ EOTF tracking, and avoid clipping.

If a Dolby Vision source doesn’t reach the X90L’s peak brightness (none of my Dolby Vision Blu-rays do), I think I’m going to take the brightest scene in the movie, set Picture Mode to Dolby Vision Bright, HDR Tone Mapping to Brightness Preferred, and Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in a particular area without clipping. This should help me get Dolby Vision performance to at least match HDR10 and SDR. Is PQ EOTF tracking also a part of Dolby Vision performance? If so, I’ll use whichever Dolby Vision picture mode and HDR Tone Mapping setting remain faithful to the metadata, go to a movie’s brightest scene, and set the Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in the brightest area of the screen without clipping.
Edited 11 months ago: Added "should reach the X90L's peak brightness in the brightest scene,"
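For reference, the PQ EOTF that “PQ EOTF tracking” refers to is the SMPTE ST 2084 curve that both HDR10 and Dolby Vision are encoded with. Here is a minimal sketch of that curve using the published ST 2084 constants; it is generic HDR math, not anything specific to Sony’s tone mapping:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 0-1 signal value to absolute
# luminance in cd/m² (nits), up to the format's 10,000-nit ceiling. "PQ EOTF
# tracking" measures how closely a TV's light output follows this curve before
# it runs out of brightness and has to tone map.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Normalized PQ signal (0-1) -> luminance in nits."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.5, 0.75, 0.9, 1.0):
    print(f"PQ signal {s:.2f} -> {pq_eotf(s):7.1f} nits")

# Roughly: 0.50 ≈ 92 nits, 0.75 ≈ 977 nits, 0.90 ≈ 3,900 nits, 1.00 = 10,000 nits
```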
While the matte screen prevents the picture quality from reaching its full potential, resulting in the lower scores, the S95D might have the best-looking picture of any display with a matte screen. If Samsung would also sell the S95D in the same sizes with a glossy screen, that would allow buyers to choose between the matte screen for use in a bright room at the expense of picture quality, or the glossy screen to preserve picture quality with the possible side effect of reflections. But unless Samsung decides to do that, for now our choices are the S95D if you prefer matte, and the S90D if you prefer preserving picture quality and don’t mind the side effect of reflections.
So I set Brightness (backlight) to Max, and HDR Tone Mapping, Adv. Contrast Enhancer, Auto Local Dimming, and Live Color to Off in order to begin investigating the science behind Dolby Vision Vivid, Dolby Vision Bright, and Dolby Vision Dark. When playing the black void in the beginning of “2001: A Space Odyssey,” I didn’t notice a change in the backlight itself when switching between the three modes. When playing other parts of the movie, I notice Dolby Vision Dark is only SLIGHTLY darker than the other two modes. The other two modes are identical. Even in Dolby Vision Vivid and Dolby Vision Bright, the peak luminance is still significantly lower than HDR10 and SDR without my work-arounds. I think the purpose of Dolby Vision Vivid is to adjust the image according to personal preference. Is the difference between Dolby Vision Dark and the other two modes strictly at the pixel/processing level? Or are they still adding or subtracting based on the metadata, even with HDR Tone Mapping Off?
Edited 11 months ago: Replaced "pixel" with "pixel/processing."
While HDR10 does reach the peaks I expect from this display, it appears Dolby Vision material is holding back its potential. I have some work-arounds for it. However, before I apply them, I would like to understand the science behind the X90L’s Dolby Vision Dark and Dolby Vision Bright picture modes, as well as how my Sony UBP-X800M2 4K UHD Blu-ray player sends Dolby Vision to the X90L when Dolby Vision Output is set to On and a disc includes Dolby Vision encoding so I will be able to apply my work-arounds without causing clipping in bright scenes. But until I have a better understanding of how the X90L and X800M2 handle Dolby Vision, I think I’m just going to set my player’s Dolby Vision Output to Off, and the X90L’s HDR Mode to Off, and keep the HDMI Video Range and Color Space set to Auto so I can at least take advantage of the X90L’s peak brightness and color accuracy.
I just thought sharing this might help determine whether the problem is with my player or the X90L. I hate to sound like I’m beating a dead horse. I understand that the range from darkest to brightest constantly changes when inputting Dolby Vision. Every time I play Dolby Vision, it has the effect of decreasing the Contrast compared to playing SDR and HDR10. I’ll only give up on Dolby Vision as a last resort. My favorite Blu-ray disc reviewers swear by the Dolby Vision remasters. When set to Dolby Vision Dark with HDR Tone Mapping and Adv. Contrast Enhancer set to Off, the only way I can achieve brighter peaks is to raise the Black Level, which obviously elevates blacks to a point I’m not sure even I can tolerate. When I go to the Video Signal category, I am unable to change the HDR Mode and Color Space. I am able to determine that my player is locked to the Full HDMI Video Range when outputting Dolby Vision, because there is no change when switching HDMI Video Range from Auto to Full on the X90L, but there is a change when switching from Full to Limited, yet no noticeable difference in the contrast.
Being that the X90CL’s/X90L’s SDR and HDR brightness are pretty much the same, when playing a 4K UHD Blu-ray HDR10 disc, has anyone tried going into the Video Signal category and setting HDR Mode to Off while leaving everything else set to Auto? Based on my observation, this setup maintains the richer BT.2020 colors as mastered while the X90CL/X90L ignores the HDR10 metadata. So if you think HDR10 is too dark, this might work. Although compared to Dolby Vision, with HDR Mode set to Auto, I still get plenty of peak highlights which take advantage of its potential. Speaking of Dolby Vision, the HDR Mode cannot be changed during Dolby Vision playback. Although I normally would leave the HDR Mode set to Auto, tell my player to output Dolby Vision, and enjoy the benefits of HDR10 and Dolby Vision, the only reason I would even consider setting HDR Mode to Off and having the X90CL/X90L ignore the HDR10 and Dolby Vision metadata while maintaining BT.2020 is if I can’t see the picture when I have my curtains and windows open during bright afternoons. I just thought I’d pass the results of this setup along in case anyone might be interested.
Being that Sony is carrying this model over to their 2024 lineup, it’s still relevant for comparing similarly priced Hisense and TCL models, especially the Hisense U8N and 2024 makeover of the TCL QM8. While the U8N and QM8 will get brighter, the question is how close their color accuracy, PQ EOTF tracking, motion performance, and upscaling can get to the Sony X90L.
Out of the box, these settings are identical. They exist so that you can adjust them independently if you have different preferences for certain inputs.
What settings are affected by Expert 1 and Expert 2?
In the Color Temperature options, many users, including myself, can’t notice a difference between Expert 1 and Expert 2. So how come these options exist if there is no difference between them? If there IS a difference, is Expert 2 warmer (as in more red) than Expert 1?
Out of the two low-latency picture modes, I notice that in the Game mode with Auto Local Dimming Off, the backlight still turns off when inputting a signal with a black screen. Yet in the Graphics mode with Auto Local Dimming Off, the backlight actually stays on at all times. While black levels are slightly elevated, I do appreciate the OLED-like response during sudden changes in scene brightness, and the OLED-like behavior when objects move without affecting parts of the picture which are supposed to be static. Yet all this is achieved without OLED’s aggressive ABL. I think of slightly elevated black levels the same way I think of slightly audible hiss on high-fidelity analog audio recordings. Will using the X90L this way in the Graphics mode still beat the X85K when it comes to PQ EOTF tracking, HDR native gradient handling, HDR color volume, color gamut, and sustained full-screen brightness? If so, the combination of low latency and pixel-level response could really grow on me. This kind of reminds me of how an AVR performs as little modification to the audio signal as possible when in Pure Direct mode.

Here are my settings, with a few explanations along the way. Picture Mode=Graphics. Everything in Ambient Pro=Off. Brightness=Max. Contrast=Max (I notice higher pixel brightness and no clipping in bright areas on some of the test patterns found on Sony Pictures 4K UHD Blu-rays when going from 90 to Max). Gamma=0. HDR Tone Mapping=Off (in the Graphics mode, I notice higher pixel brightness and less clipping in bright areas on some of the test patterns found on Sony Pictures 4K UHD Blu-rays than when set to Brightness Preferred or Gradation Preferred). Black Level=50. Black Adjust=Off. Adv. Contrast Enhancer=Off. Auto Local Dimming=Off. Peak Luminance=Off (in the Graphics picture mode, setting Peak Luminance to High, Medium, Low, or Off doesn’t make a difference when Auto Local Dimming is Off). Color=50. Hue=0. Color Temperature=Warm (in all the picture modes, Warm shows the brightest whites). Live Color=Off (I’ll live with Rec.709 content having faded colors to keep latency at a minimum). Everything in Clarity and Motion=Off and Min. Everything in Video Signal=Auto. Everything in Adv. Color Adjustment=defaults.

When inputting Dolby Vision, the X90L is limited to the three modes with high latency: Vivid, Dolby Vision Bright, and Dolby Vision Dark. I use Dolby Vision Dark with the same settings above, verbatim.
Edited 1 year ago: Revised explanations for Contrast and HDR Tone Mapping settings.
I still look at this stuff even though I’m quite happy with my 85-inch Sony X90L and it’s past its return window. Even though this model is out of my price range even in a 75-inch, based on Sharp’s reputation from years ago, I was really rooting for this Sharp to outdo my Sony X90L in the areas of performance of great importance to me. But numbers and science don’t lie. I’m surprised Hisense’s UX fell short of their U8K in certain areas as well. Based on Panasonic’s reputation, I’d also root for them to score well if they ever sell TVs in North America again.
If the Game picture mode is recommended for gamers, Graphics is recommended for graphics and use with computers, and Custom emphasizes true reproduction of the original signal, how come Custom has similar input lag to Vivid, Standard, Cinema, Photo, and IMAX Enhanced? Are the other five modes using all the processing capabilities so that the X90L will be ready for maximum settings for the enhancements in the Ambient Pro, Brightness, Color, Clarity, and Motion categories? While I wouldn’t be able to fully enjoy QD-OLED in a bright, sunny room, nor do I have the money for it, I’ll admit that one advantage of QD-OLED would be maintaining quality in the Game and Graphics picture modes without needing the other picture modes and all those enhancements. But for X90L users, it looks like we have to choose between responsiveness for tasks like typing and gaming, or quality for movies and shows. Right?
I visited the subject of latency a month or two ago, but didn’t know how to find test material for that. It turns out that YouTube has some AV sync test videos in which the flashing and the clicking should be as synchronized as possible. In the Custom picture mode, when a source is connected directly to my AVR for audio (or audio and video), while the AVR or source is connected directly to the X90L for video, by default I’d say there’s roughly 100 ms to 150 ms of delay. My four choices are (1) do away with my AVR and speaker system, (2) being that my AVR doesn’t support eARC, use ARC with my AVR and give up DTS-HD Master Audio and 7.1-channel audio, (3) manually adjust my AVR’s delay to 145 ms for 4K and 160 ms for 1080p, or (4) use the X90L’s Game and Graphics picture modes for everything so I can enjoy compatibility without having to switch my AVR’s delay setting back and forth according to the video source. How’s the X90L’s picture quality for movies in the Game and Graphics picture modes?
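As a rough sanity check on those numbers, here is a minimal sketch converting a lip-sync delay in milliseconds into frames at common frame rates; the 100-150 ms estimate comes from the post above and is an eyeballed figure, not a measurement:

```python
# Convert an audio/video delay in milliseconds into frames at common frame rates,
# to get a feel for how large a given lip-sync error is. Delay values are the
# rough estimates from the post above.

def delay_in_frames(delay_ms: float, fps: float) -> float:
    return delay_ms / 1000.0 * fps

for delay_ms in (100, 150):
    parts = ", ".join(f"{delay_in_frames(delay_ms, fps):.1f} frames @ {fps:g} fps"
                      for fps in (23.976, 24.0, 59.94, 60.0))
    print(f"{delay_ms} ms ≈ {parts}")

# 100 ms is about 2.4 frames at 24 fps and 6 frames at 60 fps; 150 ms is about 3.6 and 9.
```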
I discovered that with the Sony X90L, in addition to simultaneously taking advantage of the HDR10 and BT.2020 components of a 4K UHD Blu-ray disc by leaving everything in the Video Signal category set to Auto, it is possible to take advantage of these components separately. If you want to only take advantage of the HDR10 component, set HDR Mode to Auto and Color Space to BT.709. If it is the middle of the day or you are in a bright room and only want to take advantage of the BT.2020 component, set HDR Mode to Off and Color Space to Auto. When playing the test pattern after the single color squares on a Sony Pictures 4K UHD Blu-ray disc, the bottom grayscale bar still ramps (if you know what I mean) without clipping or missing anything, but the difference is less distinguishable than with HDR Mode set to Auto. It kind of reminds me of how it would look on those older CRTs which I do miss at times.

When Dolby Vision is input, the only thing I can change in the Video Signal category is the HDMI Video Range, which I just leave on Auto no matter what. So I’m thinking of not using Dolby Vision encodes during the day or when I have my lights on. However, when watching at night or in a dark room, I appreciate the distinguishability HDR10 and Dolby Vision provide, and will make sure everything in the Video Signal category is set to Auto. As you know, I’m not bothered by elevated black levels, whether from blooming or from turning off Auto Local Dimming. So in my case, especially now that I got my BT.709 issues resolved, this provides viewing flexibility I just wouldn’t get from an OLED, including rich colors, good HDR10 and Dolby Vision performance for viewing with my drapes closed or at night, and 718 nits of sustained full-screen brightness for impactful highlights and viewing with my lights on or with my drapes and windows open during the day.
Great news! I can finally play content mastered in BT./Rec.709 with accurate flesh tones AND rich colors! I previously explained that because of my sight condition, while I know what less complex colors look like, I am not entirely familiar with how flesh tones are supposed to look. So after my friend who is into photography and I tried various settings, we got the best results in the Custom picture mode with most of Rtings’s recommended settings. The only thing I changed was setting Live Color to Medium. When set to Low, green and cyan aren’t rich enough. When set to High, while cyan and green were fully saturated, red and blue are brighter yet undersaturated. To my surprise, setting Live Color to Medium actually makes blue more OLED-like than leaving Live Color off and adjusting saturation and/or Color Space. Before you play 4K HDR content, it might be a good idea to remember to turn Live Color off, or flesh tones might look a little too red again😊. I thought it would be a good idea to share this in case the former SDR color issues were giving anyone second thoughts about the X90L. I don’t know if other Sony displays have these SDR color issues, but if they do, at least my friend who is into photography and I found a solution.
I’m happy with how this TV handles HDR10 and Dolby Vision via HDMI. When playing content which uses the BT.709 color space, I find myself choosing between (1) accurate flesh tones with undersaturated colors, or (2) clear colors with red flesh tones. I have become more and more suspicious that the X90L’s native color space is wider than BT.709 and that it is not properly converting BT.709 to its native color space. I know I’ve been going on and on about this, but after paying either $1,000 for a 55-inch, $1,300 for a 65-inch, $1,800 for a 75-inch, $2,000 for an 85-inch, or $8,000 for a 98-inch, I can’t help thinking that even BT.709 content should have clear colors AND accurate flesh tones. Is there any chance that proper color space conversion could be addressed in a future software/firmware update?
Hello. I bought a Sony 75x90l. There are two scratches on the bottom of the TV. I don’t know when they appeared. Either during transportation or when installing legs. Maybe the quality is like this now? It doesn’t affect anything, but it’s unpleasant. Have you ever had such cases?
https://drive.google.com/file/d/1yKdflAH-kDWXXapU_5sgNDgIihVwUwK_/view?usp=sharing
If you got it from Costco or Best Buy, see if they could handle removing your current unit and delivering a new one. If you got it elsewhere, you have to decide how much trouble it’s worth returning your current unit and exchanging it for a new one.
Hi! When it comes to adjusting colors on your display, we advise against using our color calibration settings. This is because these settings can vary significantly from panel to panel, and making adjustments here may negatively affect picture quality. Instead, we recommend sticking to our suggested settings, as these have been carefully chosen to provide the most accurate picture for users.
Additionally, we do not recommend using the Live Color feature, as it can have a negative impact on picture quality.
It’s important to note that when we calibrate SDR and HDR color values, the Expert 1 settings carry over from SDR to HDR. That’s why we use both Expert 1 and Expert 2 when calibrating, to keep the SDR and HDR calibrations separate, as this overlap can cause picture issues.
As for your Blu-ray player, we don’t specifically test for this, so I can’t offer any definitive solution. However, feel free to experiment with the Deep Color setting and see what works best for your viewing needs.
I hope this information helps! Thank you for reaching out.
I’m not using Rtings’s Adv. Color Adjustment settings. I have all color, clarity, and motion processing turned off. When playing SDR in the Custom mode, I have Brightness=Max, Contrast=90, Gamma between -2 and 0, Black Adjust=Off, Auto Local Dimming=High, Peak Luminance=High, Color=50, Hue=0, Color Temperature=Expert 1, Live Color=Off, HDR Mode=Auto, HDMI Video Range=Auto, and Color Space=Auto. The Deep Color and YCbCr settings on my Blu-ray player didn’t make a difference. While HDR10 and Dolby Vision look great, these settings give me accurate flesh tones with colors which are less saturated than I’m comfortable with. Getting back to X90L settings, with Color Space set to Auto, when I increased the saturation high enough to display everything from RGB 0 to RGB 255, flesh tones appear lobster-red. When I leave Color at 50 and change the Color Space to BT.2020, colors are clear, but some of the higher RGB values are clipped and flesh tones are lobster-red. If leaving Color at 50 and Color Space set to Auto is the best I can do, at least my SDR DVDs and Blu-rays and my 4K UHD Blu-rays with HDR10 and Dolby Vision are distinguishable.
It appears the X90L actually came in first place for the new HDR pre-calibration accuracy test results. So even if I luck out and a financial miracle comes my way, this makes me question whether or not I should upgrade. How come even the best HDR pre-calibration and post-calibration accuracy have higher color dE than the best SDR pre-calibration and post-calibration accuracy?
Check out post #1905 of the AVS Forum Sony calibration thread ( https://www.avsforum.com/posts/63905930 ). While I am still left with questions, I appreciate the explanation of how certain picture modes are designed for certain use cases. I thought this might be of interest to X90L owners.
So if at some point I decide to try those settings, (1) for SDR, would I leave the two-point white balance and color management controls at their default values, and (2) for HDR, would I leave the ten-point white balance and color management controls at their default values?
On the calibration and settings page, I am just confirming that I am reading things correctly and (1) fror HDR, you only provided two-point white balance settings, and (2) for SDR, you only provided ten-point white balance and color management settings.
I started out in music. As a musician, I’ve always respected the original intent of composers, arrangers, instrumentalists, vocalists, and recording engineers. I wish to approach the world of video the same way, even if it means I’ve been viewing SRGB/Rec.709 the wrong way for years and have to adapt. I think this experience is similar to when I got transparent, accurate headphones such as the Sennheiser HD 600 and HD 560 S 2022 revision. At first, I thought their mids were a little strong, but I knew I couldn’t stand the rumbling of bass-heavy sounds which are supposed to be quiet, such as refridurators, HVAC systems, etc. when part of a movie. So I knew I was on the right path and just had to adapt. I appreciate straight, transparent, scientific answers. That’s all I’ve been looking for since I got this TV and have been trying to determine whether the issue was the TV or me used to inaccurate playback for many years like I was with headphones and speakers. For a $2,000 85-inch model, while the Hisense U8K had lower color dE, it was lacking in several other areas the X90L excels in. While the Sony X93L, Sony X95L, and TCL QM850G and QM851G are brighter and have more dimming zones, they didn’t excel in areas the X90L excels in. While the Sony Bravia 7 and Bravia 9 have improved upon the X90L, they’re out of my price range. While we don’t yet know what 2025 will bring, the X90L so far has checked as many boxes in important areas of performance for me as possible for the price, including pre-calibration color accuracy, PQ EOTF tracking, gradiant handling, and sustained full-screen brightness, making the Sony 85-inch X90L continue to remain in that perfect sweet spot of performance, size, and price. Even if 2025 improves in these areas for a similar price, I think it would be too late to sell mine used and get enough money for an upgrade being that I bought it in November of 2023. I know if I had waited for a few more years, I might have been able to go OLED or mini-LED, but I’ve been collecting a lot of 4K Blu-rays, didn’t have a TV compatible with HDR10 and Dolby Vision, and wanted to have some kind of experience with the formats. Maybe after I get my audio situation straightened out, I’ll save for an OLED or mini-LED and give my X90L away to a friend or family member or move it to another room in my house if I have the space to do so.
Thank you. Red, green, aqua, and cyan don’t look as good as how I remember them looking on CRTs, plasmas, and 1080p LCDs with a native SRGB/Rec.709 color space, unless they were oversaturating these colors. When forcing my laptop to show these single colors for my desktop background, I have the same undersaturation issues with the laptop’s HDR output turned off, outputting RGB full-range, and the X90L in the Custom picture mode with the most accurate settings. Could the Costco variant be inferior in certain areas? Also, while I can’t afford a Samsung S90D, Samsung S95D, or Sony A95L at this time, being that they’re QD-OLEDs, will Rec.709 appear more saturated even in their most accurate modes with their most accurate settings than on the X90L?
I apologize if I came across as offensive or annoying. I can’t justify spending hundreds of dollars on a colorimeter and measurement software I’d rarely use. If that’s the only way I can determine whether my X90L is correctly displaying red, green, blue, and cyan when playing SRGB/Rec.709 sources in the Custom picture mode with Color set to 50, Color Temperature set to Expert 1, Live Color set to Off, Color Space set to Auto, and everything in the Adv. Color Adjustment submenu set to its respective default values, I’ll see if I can find a friend or close acquaintance who has such tools so I wouldn’t have to pay them too much to have this checked out. If I can’t find anyone, I’ll just have to wait for God to somehow provide the answer for me in His own time and in His own way.
I want the combination of accuracy and potential. My friend who is getting into photography likes the Live Color Medium setting the best, but a few people at AVS claim it oversaturates things. So I wanted to continue pursuing this until I got extra confirmation from experts such as more people at AVS, well-respected YouTubers, you guys, etc. I would never use Live Color for HDR10 and Dolby Vision sources because that would oversaturate things, being that they’re already taking advantage of the X90L’s color potential. Also, I strictly use my Panasonic DP-UB820P-K for 4K Blu-rays (1) so I don’t have to manually switch Dolby Vision on and off, and (2) for increasing the Dynamic Range Adjustment setting when playing HDR10 sources in brighter environments. I strictly use my Sony UBP-X800M2 for 1080p Blu-rays with the video output resolution set to 1080p, SACDs with the video output resolution set to 720p, and DVDs with the video output resolution set to 480i/576i. Its YCbCr/RGB output mode is set to YCbCr (4:2:2) and Deep Color output is set to Off. Both my Panasonic and my Sony claim to be outputting 12-bit even with these settings.
I know the X90L is perfectly capable of high quality, saturated colors because I am able to achieve them when either forcing the X90L into BT.2020 via the Video Signal submenu or setting Live Color to Medium. I know Live Color is supposed to be off in order to match what the distributor of the source wants us to see. But what about mapping everything to the X90L’s native color space? When Rtings tested the X90L with SRGB/Rec.709 sources, was everything in the Video Signal submenu set to Auto, such as HDR Mode, HDMI Video Range, and Color Space? If so, when playing 1080p Blu-rays and SD DVDs, should my players’ HDMI modes be set to YCbCr (4:2:2) and Deep Color modes be set to Off? If so, then that would mean no SRGB/Rec.709 undersaturation issues exist with the X90L and I would be the one who needs to adjust to and accept how 1080p Blu-rays and SD DVDs appear compared to 4K UHD Blu-rays.
When using the X90L as a monitor for my laptop, I use the Graphics picture mode. Being that the Graphics picture mode disables local dimming and leaves all the LED backlights constantly running, how much could this shorten the life of the LEDs or the power supply for the backlight system?
Just a thought regarding the X90L and Dolby Vision. Being that the X90L is limited to a peak brightness of 1,250 nits, is there any chance Dolby Vision is preserving detail by mapping everything to lower luminance levels to avoid clipping? If so, I can accept this, and if I want 4K Blu-rays to be brighter, I should switch to HDR10, which will let me do my own tone mapping with my Panasonic 820 player and the X90L’s HDR Tone Mapping options.
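If it helps to see why that’s plausible, here’s a rough Python sketch I put together using the published SMPTE ST 2084 (PQ) constants. It isn’t anything from Sony or Dolby, and the 1,250-nit figure is just the panel peak quoted in reviews; it simply shows how quickly the PQ signal climbs past what the panel can output.

```python
# A rough sketch (standard SMPTE ST 2084 constants, not a Sony or Dolby spec) of why
# an HDR grade can call for more light than a ~1,250-nit panel can produce, forcing
# the TV or Dolby Vision to roll highlights down instead of clipping them.
def pq_to_nits(n: float) -> float:
    """ST 2084 PQ EOTF: normalized 0-1 signal -> absolute luminance in cd/m^2."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0.5, 0.6, 0.75, 0.9, 1.0):
    print(f"PQ {code:.2f} -> {pq_to_nits(code):,.0f} nits")
# A PQ level of 0.75 is already about 1,000 nits and 0.9 is about 4,000 nits, so any
# grade that uses the upper part of the range has to be tone-mapped (compressed) on
# a panel that tops out around 1,250 nits, or it will clip.
```

In other words, any grade that reaches into the upper part of the PQ range has to be rolled off somewhere, and Dolby Vision doing that conservatively would explain why it looks dimmer than my HDR10 work-arounds.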
With all enhancements disabled, Color set to 50, and everything in Video Signal set to Auto on the X90L, when playing the test screens on Sony 1080p Blu-rays and Sony 4K UHD Blu-rays, the single color-changing squares showing BT.2020 inside BT.2020 on Sony 4K UHD Blu-rays look better than the colors on the multi-colored pattern on Sony 1080p Blu-ray discs, and better than single background colors generated from my laptop when it sends SRGB.
Will Rec.709 look better on an older TV designed for Rec.709 than on a newer TV designed for BT.2020?
Doesn’t everything from 2:38 to 3:03 in this video ( https://www.youtube.com/watch?v=T_n1bomSkkk&pp=wgIGCgQQAhgB ) prove what I keep saying, that SDR/Rec.709 looks better on older TVs because they were designed to display SDR/Rec.709 signals, and that it could look just as good on newer TVs with standards conversion, such as setting Live Color to Medium only when playing SDR/Rec.709 sources? If Vincent thought Philips was wrong, he would have said this strays from how the source was mastered. In this case, by not applying any color remapping on newer TVs, wouldn’t you be experiencing something inferior to what the people who mastered the source want you to be experiencing?
In the X90L’s Custom picture mode and the A95L’s Professional picture mode, with Color set to 50, Live Color set to Off, and HDMI Color Space set to Auto, when playing SDR/BT.709, would red, green, and cyan still appear significantly more saturated on the A95L’s QD-OLED panel, or would QD-OLED’s stronger saturation only be noticed when inputting raw HDR/BT.2020?
While playing SDR/BT.709 sources in the Custom picture mode, I decided to experiment with combining Live Color with the various Color Space options. Live Color is not available when the Color Space is set to DCI. This suggests to me that Live Color is more sophisticated than simply raising color saturation and involves color remapping, which I was posting about a few days to a week ago. For SDR/BT.709 sources, when Color Space is set to Auto and Live Color is Off, red, green, and cyan appear undersaturated. Blue has a touch of green in it to make it brighter. Setting the Color Space to Adobe RGB undersaturates red no matter how Live Color is set. When setting Live Color to High, red appears undersaturated no matter what Color Space you choose, and blue appears purple when Color Space is set to SRGB/BT.709. When Color Space is set to DCI, Live Color is not available; green 0/255/0 and cyan 0/255/255 are more saturated than when Color Space is set to Auto and Live Color is set to Off, but still not as saturated as the display is capable of when displaying raw BT.2020. When setting the Color Space to BT.2020 while playing SDR/SRGB/BT.709 sources, all colors are oversaturated even when Live Color is off, even when one color has an RGB value of 255 while the other two RGB values are 128. When playing SDR/SRGB/BT.709 sources, I get the best results in the Custom picture mode with Color Space set to Auto and Live Color set to Medium. So unless my source isn’t outputting something properly, that’s the only way I can get the results I’m looking for while playing SDR/SRGB/BT.709 sources. I haven’t yet experimented with the Live Color options when playing 4K UHD Blu-ray discs because some of them utilize the full BT.2020 color gamut, while others only utilize BT.709 inside BT.2020, and I’m concerned that Live Color might oversaturate certain moments during 4K UHD Blu-ray playback.
I know I’ve brought this up before. I just feel like something isn’t adding up. One thing I forgot to mention is that when playing a source recorded using the BT.2020 color space, the colors are as saturated as when playing sources recorded using BT.709 on pre-4K LCDs, plasmas, and CRTs. However, when playing sources recorded using the BT.709 color space with the X90L, the colors aren’t as saturated as when playing BT.709 sources on pre-4K displays, as though the X90L isn’t properly mapping BT.709 sources. I don’t know how many of you are familiar with creating colors with RGB values, but here’s a description of the effect I’m getting. When my source is HDR10/BT.2020 and the X90L’s Video Signal > HDR Mode is set to either Auto or HDR10 and Video Signal > Color Space is set to either Auto or BT.2020, red looks like 255/0/0, green looks like 0/255/0, and cyan looks like 0/255/255. However, when my source is SDR/BT.709 and the X90L’s Video Signal > HDR Mode is set to Auto or Off and Video Signal > Color Space is set to Auto or BT.709, red looks like 255/128/128, green looks like 128/255/128, and cyan looks like 128/255/255. I have Color set to 50, Live Color Off, and everything in Adv. Color Adjustment set to its default values. Is SDR/BT.709 supposed to appear this way so it’s distinguishable from HDR10/BT.2020? I know the colors on my laptop’s internal display look like I’d expect when HDR is turned off.
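To put numbers on what I mean, here’s a small Python sketch (my own math using the standard BT.709 and BT.2020 RGB-to-XYZ matrices, nothing from Sony or Rtings) showing where a full Rec.709 red should land on a wide-gamut panel when it’s converted correctly.

```python
# A minimal sketch showing why a "full red" Rec.709 signal should NOT drive a
# wide-gamut panel's red primary to 100%. Values are linear-light RGB; the matrices
# are the standard D65 RGB-to-XYZ matrices from BT.709 and BT.2020.
import numpy as np

RGB709_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
RGB2020_TO_XYZ = np.array([
    [0.6370, 0.1446, 0.1689],
    [0.2627, 0.6780, 0.0593],
    [0.0000, 0.0281, 1.0610],
])

# Convert a Rec.709 color into BT.2020 coordinates: RGB709 -> XYZ -> RGB2020.
M_709_TO_2020 = np.linalg.inv(RGB2020_TO_XYZ) @ RGB709_TO_XYZ

rec709_red = np.array([1.0, 0.0, 0.0])   # "255/0/0" in linear light
print(M_709_TO_2020 @ rec709_red)        # ~[0.627, 0.069, 0.016]
```

So if the X90L is mapping BT.709 correctly, a 255/0/0 Rec.709 red should only drive the native red primary to roughly 63%, which lines up with it looking less saturated than native BT.2020 red. If it looked identical, the TV would actually be expanding the gamut the way Live Color does.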
If it’s against Rtings forum policy for those who aren’t Rtings staff members to answer this guy’s question, please delete my reply. Get the best you can afford. The best-case scenario would be owning both an 85-inch Bravia 9 QLED for daytime use and extra peak brightness, and a 77-inch A95L QD-OLED for extra color richness and deep black levels for use in less ambient light. I know I can’t afford owning both. What are your budget and size limits? Do the best you can. If you have a lot of ambient light, your best choices from Hisense are the U8N, U7N, and U6N. From Sony, the Bravia 9, Bravia 7, and X90L. From TCL, the QM8, QM7, and Q6. If you don’t have to deal with a lot of ambient light, but your budget is limited, your choices from Hisense are the U8N, U7N, and U6N. From Sony, the X90L, X85K, and Bravia 3. From TCL, the QM8, QM7, and Q6. If you don’t have to deal with a lot of ambient light and your budget isn’t too limited, OLED might be a better option for you. From LG, you have the G4, C4, and B4. From Samsung, the S95D and S90D. From Sony, the A95L, A80L, and Bravia 8. As an AV enthusiast who prefers accuracy and sometimes watches movies with my drapes and windows open on mild days but has a $2,000 limit, I went with the 85-inch Sony X90L. I know there is better out there, but in order to get started in the world of HDR and Dolby Vision, it falls within my sweet spot for performance, size, and price. I hope to make enough money in a few years to be able to sell my X90L locally and upgrade to either a QLED or QD-OLED. But I’m not in any rush like I was when I had my plasma, because the X90L gives me a satisfactory experience at least for starting out with these formats, and will probably be my daily driver for at least a few years. So I’ll still be around the X90L threads on a regular basis. Good luck!
Did Rtings post separate Adv. Color Adjustment settings for SDR and HDR?
Am I missing something, or did Rtings only post Adv. Color Adjustment settings for HDR, but not SDR?
I apologize for hijacking this thread like I’ve been. As an enthusiast, for the X90L’s price, I prefer giving up a little contrast for its accuracy in so many areas. If one of the similarly priced TVs for 2024 had combined the X90L’s strengths with better brightness and better minimization of blooming, I would have sold mine to someone local and upgraded. But based on my budget and preferences and the X90L’s strengths, I still have no buyer’s remorse. While other similarly priced TVs for 2024 beat the X90L in peak brightness at smaller window sizes and minimization of blooming, there are other areas in which the X90L beats similarly priced TVs for 2024, such as pre-calibration color accuracy, PQ EOTF tracking, and possibly gradient handling. Being that Sony has carried the X90L over from last year, I think its benefits make it relevant in 2024. Would the reviewers consider adding similarly priced TVs for 2024 to the X90L’s “Compared to Other TVs” section, as well as adding the X90L to the same section of reviews for similarly priced TVs for 2024?
In the SDR Custom picture mode, how come Sony chose a default Gamma of -2 instead of 0? In the SDR and HDR Custom picture modes, how come Sony chose a default Contrast of 90 instead of Max, yet for all the Dolby Vision picture modes, Sony chose a default Contrast of Max?
I wish Samsung would acquire licenses for Dolby Vision and passing Dolby TrueHD and all the DTS audio formats. Not everyone needs the processing of the Sony A95L. If Samsung adds support for these things in their QD-OLEDs, I’ll bet you could get a 77-inch for $3,500 or less. Right?
I’m not trying to offend anyone. I’m speaking for the home theater enthusiasts who are looking for performance within this price range. After reading https://www.rtings.com/tv/tools/compare/hisense-u8-u8n-vs-tcl-qm8-qm851g-qled/50406/60898?usage=1&threshold=0.10 , the only advantages I see over my Sony X90L are higher brightness and deeper black levels. Color performance and PQ EOTF tracking are things I expect for $1,000 and up. The 2024 Hisense U8N and TCL QM8 are fine if all you care about are visibility during the day and the deepest black levels at night. But as a home theater enthusiast, I still have no buyer’s remorse about getting my Sony X90L. I don’t mind its blooming. When it blooms, black levels aren’t as bad as when local dimming is disabled. It is still bright enough for me to use in bright surroundings. I choose the X90L and think it is at least as relevant in 2024 as it was in 2023 because of its similar price. If you want the brightness and black levels of the 2024 Hisense U8N and TCL QM8 and the accuracy of the Sony X90L, then you’ll have to pay more for the Sony Bravia 7 or Bravia 9. This is where we stand in 2024. I think we had better out-of-the-box performance in 2023 with the Hisense U8K and Samsung S90C. But at least Sony decided to carry the X90L over, giving 2024 some hope for purists on a budget looking to upgrade.
While the Sony X90L can’t get as bright as the 2024 TVs, it measures better than the 2024 mini-LEDs when it comes to pre-calibration color accuracy and PQ EOTF tracking. Being that it is still being manufactured, I think it is relevant in 2024 because of its similar price to the Hisense U8N and TCL 2024 QM851G. Unless the 2024 QM851G can equal the X90L’s PQ EOTF tracking and pre-calibration color accuracy, if you’re looking for something in that price range, don’t rule out the Sony X90L just yet. Although in the event the QM851G DOES have a combination of the X90L’s accuracy AND the peak brightness of mini-LED, then I might consider selling my X90L to someone locally and replace it with the QM851G. But I’m not getting my hopes up until the numbers have been exposed.
If money was no object for me, I’d have an 85-inch Bravia 9 QLED mini-LED and a 77-inch A95L QD-OLED on wheels. I’d roll one into my viewing area and the other off to the side depending on my lighting conditions and what I’m watching. That’s perfection. Right? But to tell you the truth, how many of us can afford to do that? I know I can’t. But it’s still fun to dream. Right? Or if you need color accuracy yet are willing to compromise on the small peak highlights of mini-LED and the pixel-level dimming of OLED, the X90L from 55 inches to 98 inches isn’t a bad alternative. I’ve used everything including crappy TN panels on laptops, CCFL-backlit LCD TVs, LED-backlit monitors, a plasma, CRTs, and an AMOLED panel on my current laptop. After seven months with my 85-inch X90CL, I’d say its performance is a mix of the LED-backlit LCDs, plasmas, CRTs, and AMOLEDs I have used. With Auto Local Dimming and Peak Luminance set to High, the elevated black levels caused by blooming are nowhere near as high as when disabling Auto Local Dimming. Here are MY current settings. My main objective is to play Blu-rays as they are mastered while maintaining the X90CL’s potential. I am not a gamer. I only stream when audio and video performance aren’t important. Depending on your environment, configuration, preferences, etc., they might not work for you.
Picture Settings
Category - Setting - SDR&Rec.709/SRGB - HDR10&BT.2020 in Normal Environment - HDR10&BT.2020 in Excess Ambient Light - Dolby Vision in Normal Environment - Dolby Vision in Excess Ambient Light
Basic - Picture Mode - Custom - Custom - Custom - Dolby Vision Dark - Dolby Vision Bright or Vivid
Basic - Auto Picture Mode - Off - Off - Off - Off - Off
Ambient Optimization Pro - Light Sensor - Off - Off - Off - Off - Off
Ambient Optimization Pro - Auto Luminance Level - Off - Off - Off - Off - Off
Ambient Optimization Pro - Auto Tone Curve - Off - Off - Off - Off - Off
Brightness - Brightness - Max - Max - Max - Max - Max
Brightness - Contrast - 90 - 90 - 90 - Max - Max
Brightness - Gamma - -2 - 0 - 0 - 0 - 0
Brightness - HDR Tone Mapping - Unavailable - Gradation Preferred - Unavailable - Off - Brightness Preferred
Brightness - Black Level - 50 - 50 - 50 - 50 - 50
Brightness - Black Adjust - Off - Off - Off - Off - Off
Brightness - Adv. Contrast Enhancer - Off - Off - Off - Off - High
Brightness - Auto Local Dimming - High - High - High - High - High
Brightness - Peak Luminance - High - High - High - High - High
Color - Color - 50 - 50 - 50 - 50 - 50
Color - Hue - 0 - 0 - 0 - 0 - 0
Color - Color Temperature - Expert 1 - Expert 1 - Expert 1 - Expert 1 - Expert 1
Color - Live Color - Medium - Off - Off - Off - Off
Clarity - Sharpness - 50 - 50 - 50 - 50 - 50
Clarity - Reality Creation - Off - Off - Off - Off - Off
Clarity - Resolution - Min - Min - Min - Min - Min
Clarity - Random Noise Reduction - Off - Off - Off - Off - Off
Clarity - Digital Noise Reduction - Off - Off - Off - Off - Off
Clarity - Smooth Gradation - Off - Off - Off - Off - Off
Motion - Motion Flow - Off - Off - Off - Off - Off
Motion - Smoothness - Min - Min - Min - Min - Min
Motion - Clearness - Min - Min - Min - Min - Min
Motion - Cinemotion - Off - Off - Off - Off - Off
Video Signal - HDR Mode - Auto - Auto - Off - Unavailable - Unavailable
Video Signal - HDMI Video Range - Auto - Auto - Auto - Auto - Auto
Video Signal - Color Space - Auto - Auto - Auto - Unavailable - Unavailable
Adv. Color Adjustment Settings
Category - Setting - Option
Basic - R Gain - Max
Basic - G Gain - Max
Basic - B Gain - Max
Basic - R Bias - 0
Basic - G Bias - 0
Basic - B Bias - 0
Color Gamma Adjustment Points 1-10 - R Offset - 0
Color Gamma Adjustment Points 1-10 - G Offset - 0
Color Gamma Adjustment Points 1-10 - B Offset - 0
Per Color Adjustment (All Colors) - Hue - 0
Per Color Adjustment (All Colors) - Saturation - 0
Per Color Adjustment (All Colors) - Lightness - 0
Sound Settings
Setting - Normal Listening - Clarity
Sound Mode - Standard - Standard
Advanced Auto Volume - Off - Off
Balance - 0 - 0
TV Position - Tabletop Stand - Wall Mount
Sound Customization Settings
Setting - Option
Surround - On
Surround Effect - Max
Equalizer (All Bands) - 0
Voice Zoom - 0
Volume Level Settings
Setting - Option
Volume Offset - 0
Dolby Dynamic Range - Standard
Good luck.
While I can’t afford a 77-inch A95L and am glad my 85-inch X90CL is still current, I would have liked to have seen Sony replace the A95L with a model with Samsung Display’s third-generation QD-OLED panel, and the X90L with a model with more granular dimming even though it’s not mini-LED. I would have sold my X90CL for $1,500 to a local buyer and replaced it. The Bravia 3 is more like the X80K and wouldn’t make a good replacement for the X90L, hence my previous sentence.
As I was turning Live Color off to prepare for watching a title on 4K UHD HDR10 BT.2020 Blu-ray, to make sure nothing got oversaturated, it dawned on me that maybe SDR Rec.709 is not supposed to look like HDR BT.2020 no matter how hard you try to make it do so. Maybe reds and greens are supposed to be less saturated while blues have a sprinkle of green in them. In other words, a different color scheme.
I discovered that there are plenty of scenes suitable for HDR10 and Dolby Vision calibration on the Warner Bros. December 2018 4K UHD Blu-ray release of 2001: A Space Odyssey ( https://www.gruv.com/product/2001_a_space_odyssey_4k_ultra_hd_blu_ray_uhd ), particularly the very beginning of chapter 21, the very beginning of chapter 25, and various sections throughout chapter 31. This disc reveals to me (1) the full potential of and differences between HDR10 and Dolby Vision, and (2) that there is nothing wrong with the X90L’s HDR10 and Dolby Vision performance. With Auto Local Dimming off, I was able to analyze everything at the pixel level. I discovered that Dolby Vision Dark has the most distinction between luminance levels up and down the gray scale compared to Dolby Vision Bright and HDR10, and actually achieves higher peak brightness in highlights at the pixel level than Dolby Vision Bright when something is mastered that way. When setting Auto Local Dimming back to Max, I notice Dolby Vision Dark mildly dims the backlight while keeping everything at the pixel level as I described. While I wish the backlight wouldn’t dim in Dolby Vision Dark and would like Sony to fix this in a future firmware update, the peak brightness in highlights is so close to Dolby Vision Bright and HDR10 that I’d rather use Dolby Vision Dark to maintain that extra distinction between luminance levels up and down the gray scale. Now that I understand what’s going on in the world of HDR, I don’t mind when things don’t always get bright enough for the X90L to perform at its full potential, because for the purpose of mood and effect, they’re not always supposed to. If I want to accommodate daytime viewing, or any HDR10 and Dolby Vision content that doesn’t reach that same peak brightness in highlights by default, and still maintain the BT.2020 color space and peak brightness in highlights while raising the rest of the gray scale without clipping anything, then when playing Dolby Vision I have to choose Dolby Vision Bright, set HDR Tone Mapping to Brightness Preferred, and set Adv. Contrast Enhancer to High, and when playing HDR10 I have to set HDR Tone Mapping to either Off or Gradation Preferred and Adv. Contrast Enhancer to High. Setting Video Signal > HDR Mode to Off maintains the BT.2020 color space and raises the rest of the gray scale, but dims the peak brightness in highlights. Disabling my player’s HDR output maintains the peak brightness in highlights and raises the rest of the gray scale, but changes the color space to BT.709, kind of like playing a 1080p Blu-ray or SD DVD. Being that 4K UHD Blu-rays are mastered in the BT.2020 color space, I’d rather only enable Live Color when playing 1080p Blu-rays and SD DVDs.
Just curious. When in the Game and Graphics picture modes with all brightness, color, clarity, and motion enhancements disabled, what is the vibration that appears on images such as still text?
Here are my solutions for playing HDR10 and Dolby Vision. If an HDR10 source doesn’t reach the X90L’s peak brightness, I think I’m going to take the brightest scene in the movie and set HDR Tone Mapping to Brightness Preferred and Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in the brightest area of the screen without clipping. So far, the only HDR10 title which I don’t think has actual HDR10 metadata is Lionsgate’s December 2017 4K UHD Blu-ray edition of “Terminator 2: Judgment Day.” But for HDR10 sources with 1,000 cd/m² metadata, an HDR Tone Mapping setting of Off without the Adv. Contrast Enhancer should reach the X90L’s peak brightness during the brightest scene, provide proper PQ EOTF tracking, and avoid clipping. For HDR10 metadata above 1,000 cd/m², an HDR Tone Mapping setting of Gradation Preferred without the Adv. Contrast Enhancer should reach the X90L’s peak brightness during the brightest scene, provide proper PQ EOTF tracking, and avoid clipping. If a Dolby Vision source doesn’t reach the X90L’s peak brightness (none of my Dolby Vision Blu-rays do), I think I’m going to take the brightest scene in the movie, set Picture Mode to Dolby Vision Bright, HDR Tone Mapping to Brightness Preferred, and Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in a particular area without clipping. This should help me get Dolby Vision performance to at least match HDR10 and SDR. Is PQ EOTF tracking also a part of Dolby Vision performance? If so, I’ll use whichever Dolby Vision picture mode and HDR Tone Mapping setting remain faithful to the metadata, go to a movie’s brightest scene, and set the Adv. Contrast Enhancer to whatever level reaches the X90L’s peak brightness in the brightest area of the screen without clipping.
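Just to make my own rule of thumb explicit, here’s a tiny Python sketch. The helper and its inputs are hypothetical (there is no such setting or API on the TV or player); it only encodes the decision I described above, with the roughly 1,250-nit panel peak taken from published reviews.

```python
from typing import Optional

def pick_tone_mapping(mastering_peak_nits: Optional[int]) -> str:
    """Return the HDR Tone Mapping setting I would try first for an HDR10 disc."""
    if mastering_peak_nits is None:
        # No trustworthy metadata: judge by eye on the disc's brightest scene instead.
        return "Brightness Preferred + Adv. Contrast Enhancer (tuned by eye)"
    if mastering_peak_nits <= 1000:
        # A 1,000-nit grade already fits under the X90L's roughly 1,250-nit peak.
        return "Off"
    # Grades mastered brighter than the panel need their highlights compressed.
    return "Gradation Preferred"

print(pick_tone_mapping(1000))   # Off
print(pick_tone_mapping(4000))   # Gradation Preferred
print(pick_tone_mapping(None))   # Brightness Preferred + Adv. Contrast Enhancer (tuned by eye)
```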
While the matte screen prevents the picture quality from reaching its full potential, resulting in the lower scores, the S95D might have the best-looking picture of any display with a matte screen. If Samsung would also sell the S95D in the same sizes with a glossy screen, that would allow buyers to choose between the matte screen for use in a bright room at the expense of picture quality, or the glossy screen to preserve picture quality with the possible side effect of reflections. But unless Samsung decides to do that, for now our choices are the S95D if you prefer matte, and the S90D if you prefer preserving picture quality and don’t mind the side effects of reflections.
So I set Brightness (backlight) to Max, and HDR Tone Mapping, Adv. Contrast Enhancer, Auto Local Dimming, and Live Color to Off in order to begin investigating the science behind Dolby Vision Vivid, Dolby Vision Bright, and Dolby Vision Dark. When playing the black void in the beginning of “2001: A Space Odyssey,” I didn’t notice a change in the backlight itself when switching between the three modes. When playing other parts of the movie, I notice Dolby Vision Dark is only SLIGHTLY darker than the other two modes. The other two modes are identical. Even in Dolby Vision Vivid and Dolby Vision Bright, the peak luminance is still significantly lower than HDR10 and SDR without my work-arounds. I think the purpose of Dolby Vision Vivid is to adjust the image according to personal preference. Is the difference between Dolby Vision Dark and the other two modes strictly at the pixel/processing level? Or are they still adding or subtracting based on the metadata, even with HDR Tone Mapping Off?
While HDR10 does reach the peaks I expect from this display, it appears Dolby Vision material is holding back its potential. I have some work-arounds for it. However, before I apply them, I would like to understand the science behind the X90L’s Dolby Vision Dark and Dolby Vision Bright picture modes, as well as how my Sony UBP-X800M2 4K UHD Blu-ray player sends Dolby Vision to the X90L when Dolby Vision Output is set to On and a disc includes Dolby Vision encoding so I will be able to apply my work-arounds without causing clipping in bright scenes. But until I have a better understanding of how the X90L and X800M2 handle Dolby Vision, I think I’m just going to set my player’s Dolby Vision Output to Off, and the X90L’s HDR Mode to Off, and keep the HDMI Video Range and Color Space set to Auto so I can at least take advantage of the X90L’s peak brightness and color accuracy.
I just thought sharing this might help determine if the problem is with my player or the X90L. I hate to sound like I’m beating a dead horse. I understand that the range from darkest to brightest constantly changes when inputting Dolby Vision. Every time I play Dolby Vision, it looks as though the Contrast has been reduced compared to playing SDR and HDR10. I’ll only give up on Dolby Vision as a last resort. My favorite Blu-ray disc reviewers swear by the Dolby Vision remasters. When set to Dolby Vision Dark with HDR Tone Mapping and Adv. Contrast Enhancer set to Off, the only way I can achieve brighter peaks is to raise the Black Level, which obviously elevates blacks to a point I’m not sure even I can tolerate. When I go to the Video Signal category, I am unable to change the HDR Mode and Color Space. I am able to determine that my player is locked to the Full HDMI Video Range when outputting Dolby Vision because there is no change when switching HDMI Video Range from Auto to Full on the X90L, but there is a change when switching from Full to Limited, yet no noticeable difference in the contrast.
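Since HDMI Video Range came up, here’s a quick sketch of the standard 8-bit limited-to-full range math (my own illustration, not anything Sony documents) that shows why a range mismatch reads as a contrast or black-level shift.

```python
# Standard video-range math: 8-bit "limited" (video) range puts black at code 16
# and white at code 235, while "full" (PC) range uses 0-255.
def limited_to_full_8bit(code: int) -> float:
    """Expand a limited-range 8-bit luma code to full range 0-255."""
    return (code - 16) * 255.0 / (235 - 16)

for code in (16, 64, 128, 235):
    print(f"limited {code:3d} -> full {limited_to_full_8bit(code):6.1f}")
# If one device sends limited range and the other interprets it as full range,
# black sits at 16/255 (washed out) and white tops out at 235/255 (dimmer peaks),
# which is exactly the kind of "contrast looks reduced" symptom described above.
```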
Being that the X90CL’s/X90L’s SDR and HDR brightness are pretty much the same, when playing a 4K UHD Blu-ray HDR10 disc, has anyone tried going into the Video Signal category and setting HDR Mode to Off while leaving everything else set to Auto? From what I can tell, this setup maintains the richer BT.2020 colors as mastered while the X90CL/X90L ignores the HDR10 metadata. So if you think HDR10 is too dark, this might work. Although compared to Dolby Vision, with HDR Mode set to Auto, I still get plenty of peak highlights which take advantage of its potential. Speaking of Dolby Vision, the HDR Mode cannot be changed during Dolby Vision playback. Although I would normally leave the HDR Mode set to Auto, tell my player to output Dolby Vision, and enjoy the benefits of HDR10 and Dolby Vision, the only reason I would even consider setting HDR Mode to Off and having the X90CL/X90L ignore the HDR10 and Dolby Vision metadata while maintaining BT.2020 is if I can’t see the picture when I have my curtains and windows open during bright afternoons. I just thought I’d pass the results of this setup along in case anyone is interested.
Being that Sony is carrying this model over to their 2024 lineup, it’s still relevant for comparing similarly priced Hisense and TCL models, especially the Hisense U8N and 2024 makeover of the TCL QM8. While the U8N and QM8 will get brighter, the question is how close their color accuracy, PQ EOTF tracking, motion performance, and upscaling can get to the Sony X90L.
Which pads were used during testing? Will there be another review with the opposite pads?
Someone clearly explained the Expert 1 and Expert 2 situation at this AVS Forum thread ( https://www.avsforum.com/threads/sony-bravia-x90l-full-array-owners-thread-no-price-talk.3269181/post-63208961 ). I just thought I’d pass it along so you guys can spend more time with new product reviews than explaining aspects of older products.
What settings are affected by Expert 1 and Expert 2?
In the Color Temperature options, so many users, including myself, can’t notice a difference between Expert 1 and Expert 2. So how come these options exist if there is no difference between them? If there IS a difference, is Expert 2 warmer (as in more red) than Expert 1?
Out of the two low-latency picture modes, I notice that in the Game mode with Auto Local Dimming Off, the backlight still turns off when inputting a signal with a black screen. Yet in the Graphics mode with Auto Local Dimming Off, the backlight actually stays on at all times. While black levels are slightly elevated, I do appreciate the OLED-like response during sudden changes in scene brightness, and OLED-like behavior when objects move without affecting parts of the picture which are supposed to be static. Yet all this is achieved without OLED’s aggressive ABL. I think of slightly elevated black levels the same way I think of slightly audible hiss on high-fidelity analog audio recordings. Will using the X90L this way in the Graphics mode still beat the X85K when it comes to PQ EOTF tracking, HDR native gradient handling, HDR color volume, color gamut, and sustained full-screen brightness? If so, the combination of low latency and pixel-level response could really grow on me. This kind of reminds me of how an AVR performs as little modification to the audio signal as possible when in the Pure Direct mode. Here are my settings, with a few explanations along the way. Picture Mode=Graphics. Everything in Ambient Pro=Off. Brightness=Max. Contrast=Max (I notice higher pixel brightness and no clipping in bright areas on some of the test patterns found on Sony Pictures 4K UHD Blu-rays when going from 90 to Max). Gamma=0. HDR Tone Mapping=Off (in the Graphics mode, I notice higher pixel brightness and less clipping in bright areas on some of the test patterns found on Sony Pictures 4K UHD Blu-rays than when set to Brightness Preferred or Gradation Preferred). Black Level=50. Black Adjust=Off. Adv. Contrast Enhancer=Off. Auto Local Dimming=Off. Peak Luminance=Off (in the Graphics picture mode, setting Peak Luminance to High, Medium, Low, or Off doesn’t make a difference when Auto Local Dimming is Off). Color=50. Hue=0. Color Temperature=Warm (in all the picture modes, Warm shows the brightest whites). Live Color=Off (I’ll live with Rec.709 content having faded colors to keep latency at a minimum). Everything in Clarity and Motion=Off and Min. Everything in Video Signal=Auto. Everything in Adv. Color Adjust=defaults. When inputting Dolby Vision, the X90L is limited to the three modes with high latency: Vivid, Dolby Vision Bright, and Dolby Vision Dark. I use Dolby Vision Dark with the same settings above verbatim.
I still look at this stuff even though I’m quite happy with my 85-inch Sony X90L and it’s past its return window. Even though this model is out of my price range even in a 75-inch, based on Sharp’s reputation from years ago, I was really rooting for this Sharp to outdo my Sony X90L in the areas of performance of greatest importance to me. But numbers and science don’t lie. I’m surprised Hisense’s UX fell short of their U8K in certain areas as well. Based on Panasonic’s reputation, I’d also root for them to score well if they ever sell TVs in North America again.
If the Game picture mode is recommended for gamers, Graphics is recommended for graphics and use with computers, and Custom emphasizes true reproduction of the original signal, how come Custom has similar input lag to Vivid, Standard, Cinema, Photo, and IMAX Enhanced? Are the other five modes using all the processing capabilities so that the X90L will be ready for maximum settings of the enhancements in the Ambient Pro, Brightness, Color, Clarity, and Motion categories? While I wouldn’t be able to fully enjoy QD-OLED in a bright, sunny room, nor do I have the money for it, I’ll admit that the one advantage of QD-OLED would be maintaining quality in the Game and Graphics picture modes without needing the other picture modes and all those enhancements. But for X90L users, it looks like we have to choose between responsiveness for tasks like typing and gaming, or quality for movies and shows. Right?
I visited the subject of latency a month or two ago, but didn’t know how to find test material for it. It turns out that YouTube has some AV sync test videos in which the flashing and the clicking should be as synchronized as possible. In the Custom picture mode, when a source is connected directly to my AVR for audio (or audio and video) while the AVR or source is connected directly to the X90L for video, by default I’d say there’s between a tenth and fifteen hundredths of a second (100-150 ms) of delay. My four choices are (1) do away with my AVR and speaker system, (2) being that my AVR doesn’t support eARC, use ARC with my AVR and give up DTS-HD Master Audio and 7.1-channel audio, (3) manually adjust my AVR’s delay to 145 ms for 4K and 160 ms for 1080p, or (4) use the X90L’s Game and Graphics picture modes for everything so I can enjoy compatibility without having to switch my AVR’s delay setting back and forth according to the video source. How’s the X90L’s picture quality for movies in the Game and Graphics picture modes?
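For what it’s worth, here’s the back-of-the-envelope math (my own, nothing measured with instruments) that makes a 100-150 ms delay from video processing alone seem plausible.

```python
# Convert a few frames of video-processing buffering into milliseconds of lip-sync delay.
def frames_to_ms(frames: float, fps: float) -> float:
    return frames * 1000.0 / fps

for fps in (23.976, 59.94):
    for frames in (2, 3, 4):
        print(f"{frames} frames @ {fps} fps = {frames_to_ms(frames, fps):.0f} ms")
# At 23.976 fps, 3 frames is already about 125 ms, which lands right in the range I hear,
# so a fixed ~145-160 ms audio delay on the AVR is a reasonable starting point.
```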
I discovered that with the Sony X90L, in addition to simultaneously taking advantage of the HDR10 and BT.2020 components of a 4K UHD Blu-ray disc by leaving everything in the Video Signal category set to Auto, it is possible to take advantage of these components separately. If you want to only take advantage of the HDR10 component, set HDR Mode to Auto and Color Space to BT.709. If it is the middle of the day or you are in a bright room and you only want to take advantage of the BT.2020 component, set HDR Mode to Off and Color Space to Auto. When playing the test pattern after the single color squares on a Sony Pictures 4K UHD Blu-ray disc, the bottom grayscale bar still ramps (if you know what I mean) without clipping or missing anything, but the difference is less distinguishable than with HDR Mode set to Auto. It kind of reminds me of how it would look on those older CRTs which I do miss at times. When Dolby Vision is input, the only thing I can change in the Video Signal category is the HDMI Video Range, which I just leave on Auto no matter what. So I’m thinking of not using Dolby Vision encodes during the day or when I have my lights on. However, when watching at night or in a dark room, I appreciate the distinguishability HDR10 and Dolby Vision provide and will make sure everything in the Video Signal category is set to Auto. As you know, I’m not bothered by elevated black levels, whether from blooming or from turning off Auto Local Dimming. So in my case, especially now that I got my BT.709 issues resolved, this provides viewing flexibility I just wouldn’t get from an OLED, including rich colors, good HDR10 and Dolby Vision performance for viewing with my drapes closed or at night, and 718 nits of sustained full-screen brightness for impactful highlights and viewing with my lights on or with my drapes and windows open during the day.
Great news! I can finally play content mastered in BT./Rec.709 with accurate flesh tones AND rich colors! I previously explained that due to my sight condition, while I know what less complex colors look like, I am not entirely familiar with how flesh tones are supposed to look. So after my friend who is into photography and I tried various settings, we got the best results in the Custom picture mode with most of Rtings’s recommended settings. The only thing I changed was setting Live Color to Medium. When set to Low, green and cyan aren’t rich enough. When set to High, while cyan and green are fully saturated, red and blue are brighter yet undersaturated. To my surprise, setting Live Color to Medium actually makes blue more OLED-like than leaving Live Color off and adjusting saturation and/or Color Space. Before you play 4K HDR content, it might be a good idea to remember to turn Live Color off, or flesh tones might look a little too red again😊. I thought it would be a good idea to share this in case the SDR color issues I described were giving anyone second thoughts about the X90L. I don’t know if other Sony displays have these SDR color issues, but if they do, at least my friend who is into photography and I found a solution.
I’m happy with how this TV handles HDR10 and Dolby Vision via HDMI. When playing content which uses the BT.709 color space, I find myself choosing between (1) accurate flesh tones with undersaturated colors, or (2) clear colors with red flesh tones. I have become more and more suspicious that the X90L’s native color space is wider than BT.709 and that it is not properly converting BT.709 to its native color space. I know I’ve been going on and on about this, but after paying either $1,000 for a 55-inch, $1,300 for a 65-inch, $1,800 for a 75-inch, $2,000 for an 85-inch, or $8,000 for a 98-inch, I can’t help thinking that even BT.709 content should have clear colors AND accurate flesh tones. Is there any chance that proper color space conversion could be addressed in a future software/firmware update?
If you got it from Costco or Best Buy, see if they could handle removing your current unit and delivering a new one. If you got it elsewhere, you have to decide how much trouble it’s worth returning your current unit and exchanging it for a new one.
I’m not using Rtings’s Adv. Color Adjustment settings. I have all color, clarity, and motion processing turned off. When playing SDR in the Custom mode, I have Brightness=Max, Contrast=90, Gamma between -2 and 0, Black Adjust=Off, Auto Local Dimming=High, Peak Luminance=High, Color=50, Hue=0, Color Temperature=Expert 1, Live Color=Off, HDR Mode=Auto, HDMI Video Range=Auto, and Color Space=Auto. The Deep Color and YCbCr settings on my Blu-ray player didn’t make a difference. While HDR10 and Dolby Vision look great, these settings give me accurate flesh tones with colors which are less saturated than I’m comfortable with. Getting back to X90L settings, with Color Space set to Auto, when I increase the saturation high enough to display everything from RGB 0 to RGB 255, flesh tones appear lobster-red. When I leave Color at 50 and change the Color Space to BT.2020, colors are clear, but some of the higher RGB values are clipped and flesh tones are lobster-red. If leaving Color at 50 and Color Space set to Auto is the best I can do, at least my SDR DVDs and Blu-rays and my 4K UHD Blu-rays with HDR10 and Dolby Vision are distinguishable.