Since November 2021, we've purchased and tested over 145 laptops, providing readers with unbiased, detailed, and trustworthy reviews. A lot goes into our reviews, from the purchasing decision—we buy our units, so there's no cherry-picking—to the publication of the review. Our testers follow strict protocols designed by our test developers to ensure that they evaluate all laptops the same way, and our writers ensure that even tech novices can easily understand the most complex aspects of each laptop. We use a wide range of specialized equipment to provide you with the most accurate results so you can make an informed buying decision based on your needs. If you've ever wondered what goes into producing a laptop review, keep reading, as we'll give you a summary of our review process from start to finish.
The first step in our review process is, of course, product selection. As countless new laptops are released each year, it would be impossible for us to test every single model, so we have to choose which models and configurations to test. Many of you already know that insiders can vote for the laptop they would like us to review, but beyond that, we also make our own picks based on the products' popularity and whether they fit a specific use. You may also know we buy all our units anonymously from online retailers or directly on the brand's website. This is somewhat of a disadvantage when it comes to the timeliness of the review; however, purchasing our own units allows us to keep our contact with each brand to a minimum, thus avoiding any conflicts of interest or biases. We often have multiple laptops in the pipeline; you can see them on the review pipeline page.
Before we get into the details of our testing process, here's a brief summary of our overall philosophy and why we test laptops the way we do.
Like all other products we review at RTINGS.com, we've built our laptop reviews based on standardized testing, meaning we use the same test patterns and materials for every laptop we test. This method provides a level playing field, making every laptop comparable. That said, there are limitations to this method, as some of our tests, like gaming benchmarks, don't work on Chrome OS or macOS devices. Beyond the tests that are already part of the current test bench, we sometimes perform additional tests for unique features or to verify a manufacturer's claim. We also perform tests based on community feedback, so if there's something you would like us to test, it's important to let us know in the discussions.
Once we've received a laptop and unpacked it, it goes to one of our two professional photographers, who are responsible for most of the photos you see in our reviews. Like our tests, they photograph all laptops in the same manner, so they're easily comparable. We then assign the laptop to one of our trained testers, who will test every aspect of the device; this usually takes about five working days.
While there isn't a specific order for laptop testing, we do group some tests together for efficiency, as they require the same setup (like display testing). Laptop reviews contain seven parts; the tests included in each part are listed below. You can click the links for more in-depth articles on how we perform each test. Some are objective tests performed using professional equipment, like the display's contrast and accuracy, while others are subjective tests in which we go through a series of questions, like the feel of the touchpad and the quality of the webcam.
| Design | Display | Interface | Connectivity | Configuration | Performance | Additional Features and Software |
|---|---|---|---|---|---|---|
| Style | Screen Specs | Keyboard | Ports | CPU | Geekbench 5 | Software |
| Build Quality | Refresh Rate | Touchpad | Wireless Communication | GPU | Cinebench R23 | Extra Features |
| Hinge | Contrast | Speakers | | RAM | Blender | |
| Portability | Brightness | Webcam & Microphone | | Storage | Basemark GPU | |
| Serviceability | Reflections | | | | Storage Drive Performance | |
| In The Box | Black Uniformity | | | | Battery | |
| | Horizontal Viewing Angle | | | | Borderlands 3 | |
| | Vertical Viewing Angle | | | | Civilization VI | |
| | Out-Of-The-Box Accuracy | | | | Counter-Strike 2 | |
| | Color Gamut | | | | Shadow of the Tomb Raider | |
| | Flicker | | | | Thermals And Noise | |
| | | | | | Performance Over Time | |
You can track the progress of a review via our status tracker (shown below) on the product's discussion page.
The design section is pretty straightforward, comprising only a few test boxes to cover the basics. The tests provide information about the laptop's design, form factor, build quality, hinge quality, portability, and the contents of the box. We also touch on serviceability so you know which parts are user-replaceable and the ease of access to the internals. The build quality score is largely subjective, taking into account aspects like the chassis' material (metal vs. plastic vs. carbon fiber...) and rigidity, the finish's resistance to light scratches and smudges, and the overall construction (whether there are any gaps or faults).
The tests in the display section are largely the same as those in our monitor reviews, albeit significantly simplified. This is to prevent laptop reviews, which are already quite long, from feeling overly bloated. That said, we may revise the tests in a future test bench to provide additional information if a particular aspect becomes more important, like HDR brightness (we currently only test HDR brightness to verify a manufacturer's claim, such as the Apple-silicon MacBook Pros' advertised 1600 cd/m² brightness in HDR).
We use various tools to test the display, depending on the test itself; these include a Konica Minolta LS-100 Luminance Meter, as well as a Colorimetry Research CR-100 colorimeter and CR-250 Spectroradiometer. As ambient light can affect measurement accuracy, we perform these tests in a dark, light-controlled room. We won't get into the minute details of the testing here, as it would be far too long; you can refer to our monitor tests for more information.
One area where the laptop test differs significantly from our monitor testing is motion performance—we indicate the maximum refresh rate and VRR support, but we don't currently test the response time because the test we currently use doesn't work on Chrome OS and macOS. We only provide a motion picture, which gives a rough idea of how the display performs.
The interface section covers four aspects of the laptop: keyboard, touchpad, speakers, and webcam. The keyboard test is the same as the keystroke and typing quality tests from our keyboard reviews. We use the same Mecmesin MultiTest-i System Test Stand to measure the pre-travel and total travel distance, as well as the operating and actuation force. There's also a subjective typing quality test, touching on the general feel of the keyboard, its layout, and whether there's anything that stands out.
Moving on to the touchpad. Except for the size and material (glass vs. plastic), we evaluate the touchpad subjectively, considering aspects like its responsiveness to movements and gestures and whether there are any issues with palm rejection or actions like dragging and dropping items across a long distance. We also evaluate the quality of the clicking mechanism and whether you can click anywhere or only on a specific part of the touchpad.
To test the speakers, we play a sine sweep and use a calibrated Earthworks Audio M23 microphone to capture the frequency response at 65dB and at max volume. We then use the two frequency responses to determine the standard error, slope, bass and treble extensions, and the dynamic range compression at max volume. The tester also listens to the speakers to ensure the sound matches the measurements, though the tester's subjective opinion has no impact on the score.
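As a rough illustration of the kind of math behind two of these metrics, here is a minimal sketch of a slope calculation (in dB per octave) and an RMS deviation from a flat target curve. The function names and the flat-target assumption are ours for illustration only; this isn't RTINGS' actual scoring code.

```python
import math

def slope_db_per_octave(freqs_hz, levels_db):
    """Least-squares slope of a frequency response, in dB per octave.

    `freqs_hz` and `levels_db` are hypothetical measurement arrays; a
    flat response yields a slope near zero, a bright (treble-heavy)
    response a positive one.
    """
    x = [math.log2(f) for f in freqs_hz]  # frequency in octaves
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(levels_db) / n
    num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, levels_db))
    den = sum((xi - mean_x) ** 2 for xi in x)
    return num / den

def std_error_db(levels_db, target_db=0.0):
    """RMS deviation of the response from a flat target curve, in dB."""
    return math.sqrt(sum((l - target_db) ** 2 for l in levels_db) / len(levels_db))
```

For example, a response measured one octave apart that rises by 1 dB per point would have a slope of exactly 1 dB/octave.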
Aside from the basic specs, we evaluate the webcam's video quality subjectively based on the image's sharpness, amount of fine details, color temperature, tint, exposure, and noise. We also evaluate the microphone subjectively based on the recorded voice's loudness, clarity, and the amount of background noise present. To avoid interference from ambient noise, we record the voice in a sound-controlled box with speakers pointed at the laptop.
The connectivity section is simple and comprises only two tests: ports and wireless communication. We score the ports section subjectively based on the number of ports, the variety, and the ports' specifications (speed and capability). No equipment is necessary for this test, as we can simply obtain the information via the laptop's hardware specifications. The HDMI port is the exception, as many manufacturers advertise HDMI 2.1, even if the port can only output a max resolution of 4k @ 60Hz (see Fake HDMI 2.1). To verify the HDMI version, we check if the port can output a 4k @ 120Hz signal to an external monitor—we consider an HDMI port that can only output a 4k @ 60Hz signal as HDMI 2.0.
As for the wireless communication, we only report the wireless adapter model and the Wi-Fi standard it supports. We don't test the wireless performance.
The configuration section doesn't contain any tests per se; it's where we list our unit's specifications and provide some basic information about the available CPU, GPU, memory, and storage options. Like the connectivity section, no testing equipment is necessary for this section, as we can obtain all the information from the laptop's hardware specifications within Windows' task manager as well as the respective CPU, GPU, and storage drive manufacturers' product pages.
The performance section consists primarily of synthetic benchmarks that measure the CPU, GPU, and storage drive's performance. The current benchmarks include Geekbench 5, Cinebench R23, Blender, Basemark GPU, and a few games, like Borderlands 3, Civilization VI, Counter-Strike 2, and Shadow of the Tomb Raider. The Cross Platform Disk Test is our benchmark app for testing the storage drive's speed, as it works on Windows, macOS, and Chrome OS.
We perform the benchmarks three times (with a 2-minute break between each run) in a temperature-controlled room (22 °C / 71.6 °F) with the laptop plugged in. The posted scores are averages of the three runs. It's worth noting that we've recently started performing the benchmarks using the Performance mode instead of the default Balanced mode. This is to ensure our results are representative of the laptop's maximum potential rather than the tuning of the Balanced mode, which can vary depending on the laptop and manufacturer.
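The run-then-average procedure above can be sketched as follows. `run_once` stands in for launching the actual benchmark, and the injectable `sleep` parameter is just a convenience so the harness can be exercised without waiting out the cooldown; this is an illustrative sketch, not the actual test script.

```python
import statistics
import time

def average_benchmark(run_once, runs=3, cooldown_s=120, sleep=time.sleep):
    """Run a benchmark `runs` times with a cooldown between runs and
    return the mean score.

    `run_once` is a placeholder callable that launches one benchmark
    pass and returns its score.
    """
    scores = []
    for i in range(runs):
        scores.append(run_once())
        if i < runs - 1:  # no cooldown needed after the final run
            sleep(cooldown_s)
    return statistics.mean(scores)
```

Averaging over several runs smooths out run-to-run variance from background tasks and thermal state.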
We typically conduct the performance over time and thermals/noise tests at the same time, as they require the same setup. We run Cinebench R23's multi-thread and Unigine's Heaven benchmarks to stress the CPU and GPU simultaneously. Using a script, we run these benchmarks repeatedly, record the CPU/GPU's temperatures, and check for any throttling based on the Cinebench and Heaven test results. While running the benchmarks, we use a Flir gun to record the temperatures on the keyboard and bottom of the laptop. To measure fan noise, we place the laptop in the same sound-controlled box we use for the speaker test with a microphone.
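A simple throttling check along these lines could compare each repeated run against the first one; the 5% tolerance below is an illustrative assumption on our part, not RTINGS' actual pass/fail criterion.

```python
def detect_throttling(scores, tolerance=0.05):
    """Return True if any later benchmark score drops more than
    `tolerance` (as a fraction) below the first run's score.

    `scores` is the sequence of results from the repeated runs, in
    chronological order.
    """
    baseline = scores[0]
    return any(s < baseline * (1 - tolerance) for s in scores[1:])
```

A laptop whose scores stay within a few percent of the first run over many repetitions sustains its performance; a steadily falling sequence indicates thermal throttling.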
The battery tests usually occur at the end of a working day, as draining the battery can sometimes take an entire night. We use a script to run all three battery tests (web browsing, video playback, and gaming). The video playback test uses a locally stored video.
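A battery-drain loop of this kind can be sketched as below. `read_level` and `do_workload` are placeholders for reading the battery charge (e.g. via a platform API such as `psutil.sensors_battery()`) and driving the web-browsing, video, or gaming workload; the clock and sleep hooks are injectable only so the loop can be tested without an overnight wait.

```python
import time

def battery_runtime(read_level, do_workload, poll_s=60,
                    sleep=time.sleep, clock=time.monotonic):
    """Drive a workload until the battery reading reaches zero and
    return the elapsed time in seconds.

    `read_level` returns the remaining charge as a percentage;
    `do_workload` performs one iteration of the test workload.
    Both are placeholders for the real test script's implementations.
    """
    start = clock()
    while read_level() > 0:
        do_workload()
        sleep(poll_s)
    return clock() - start
```

In practice the script would also log the charge level at each poll so the discharge curve, not just the total runtime, can be inspected afterward.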
As the name suggests, the 'additional features and software' section is where we specify which operating system comes pre-installed and whether there are any third-party applications or features like RGB lighting, pen input support, a secondary display (like the Intel-based Apple MacBook Pros with Touch Bar), or biometrics (fingerprint sensor and/or facial recognition camera). We don't score these tests, as the mentioned features are optional, and the choice of the operating system is a matter of personal preference.
Once a tester has finished the testing, the results go through a peer-review process. A second trained tester and a writer validate the results and request additional testing or explanation if something doesn't align with community feedback or our expectations. We also make sure that the results align with real-world user experience. After the validation process, we publish the results in early access so that our insiders get a first look at the reviews, and writing begins.
Writing a laptop review is a complex task, as the writer must be knowledgeable in every aspect of the laptop, including the laptop market as a whole and current CPU and GPU development. We must also know how people typically use their devices and what they care about or don't care about. While most of our test results are straightforward, they sometimes need deeper explanation (especially highly technical aspects) or context, and that's where the text bridges the gap.
After the first draft, the second validation process involves a second writer and the original tester to ensure that the text matches our findings. Finally, the review goes to the editing team, where an editor ensures the final product is properly formatted, consistent, and error-free. They also make sure the review follows our internal style guidelines and matches our high level of quality before publishing it. Laptop reviews are generally quite long, ranging from 3,000 to 5,000 words, so the entire writing process can take a few days at the very least.
After the review is published, the same writers involved in the review process look at our recommendation articles to see if the tested laptop deserves to be on our recommendation list. We update the recommendation articles frequently so they stay current with any changes in the market, making them an easy option for readers who just want a quick roundup instead of reading through entire reviews.
We don't make our picks solely based on the scores; we also consider other factors like the price and availability, and occasionally things like customer support or overall buying experience. Not all picks change with every update, as there are often models that really stand out, so they tend to stick around for longer. Lastly, it's important to note that we don't only recommend products that are available through affiliate partners. This is especially the case with laptops, as many models, particularly higher-end configurations, are only available through the brand's website.
A benefit of buying our own units is that we keep the products much longer than other reviewers, often keeping models for over two years. This allows us to perform retests (for various reasons, including firmware updates) and better answer community questions about a specific feature or aspect of a product. Keeping the units also lets us compare products side by side so that we can more accurately describe the differences between various models.
The retest process is similar to our main testing process. A tester performs the retests, gets the results validated by a second tester and a writer, and then passes them on to the writer and an editing team member to update the review and recommendations accordingly. Our writing, testing, and test development teams regularly work together to decide which products to keep and which we can safely resell. If a product is no longer relevant within the market, available to buy, or featured in our recommendation articles, we may consider it safe to resell.
There you go, folks! That was a brief overview of our review process from start to finish. If you want to learn more, you can check out our overview video below. For more information about a specific test, check out our laptop articles or click the "Learn about _____" link on any review page. Finally, we always try to improve, so your feedback is important. If you have any questions, criticism, or suggestions, you can reach us in the comments section of this article, on our forums, or by email to feedback@rtings.com.