Gaming monitors are designed to make the output of your graphics card and CPU look as good as possible while gaming. They're responsible for displaying the final result of all of your computer's image rendering and processing, yet they can vary widely in color, motion, and image clarity. When considering what to look for in a gaming display, it's worth taking the time to understand everything a gaming display can do, so you can translate its specs and marketing into real-world performance.
Display technology will change over time, but the basic goal of monitor manufacturers remains the same. We'll break down each group of monitor features below to isolate their benefits.
#1 Resolution
Resolution is a key feature of any display. It measures a screen's width and height in pixels, or "picture elements": the tiny points of illumination that make up an image. A 2,560 x 1,440 screen, for example, has a total of 3,686,400 pixels.
Common resolutions include 1,920 x 1,080 (sometimes called "Full HD" or FHD), 2,560 x 1,440 ("Quad HD", QHD, or "Widescreen Quad HD", WQHD), and 3,840 x 2,160 (UHD or "4K Ultra HD"). Ultrawide displays with resolutions of 2,560 x 1,080 (UW-FHD), 3,440 x 1,440 (UW-QHD), 3,840 x 1,080 (DFHD), and 5,120 x 1,440 (DQHD) are also available.
Manufacturers sometimes use a single shorthand measurement for standard resolutions: 1080p and 1440p refer to the height in pixels, while 4K refers to the approximate width. Any resolution of 1,280 x 720 or higher counts as high definition (HD).
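To see how these resolutions compare, a quick sketch in Python (the labels are just the common shorthand names from above):

```python
# Total pixel count for common gaming resolutions (width x height).
resolutions = {
    "FHD (1080p)": (1920, 1080),
    "QHD (1440p)": (2560, 1440),
    "UHD (4K)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 4K has exactly four times the pixels of 1080p:
print((3840 * 2160) / (1920 * 1080))  # 4.0
```

This is why 4K is so much more demanding on a GPU than 1080p: the card has to render four times as many pixels per frame.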
The pixels counted in these measurements are usually laid out the same way: as squares on a two-dimensional grid. You can see this for yourself by moving closer to the screen (or zooming in on an image) until you perceive individual blocks of color, or by enlarging an image until it looks "pixelated" and a clean diagonal line becomes a staircase of small squares.
As display resolution increases, it becomes harder for the naked eye to pick out individual pixels, and the image looks correspondingly sharper.
Beyond increasing the on-screen detail of a game or movie, higher resolutions have another benefit: more usable desktop space, giving you a larger workspace on which to arrange windows and applications.
#2 Screen Size and PPI
Manufacturers measure screen size diagonally, from corner to corner. Larger screens and higher resolutions mean more usable screen space and a more immersive gaming experience.
Players typically sit close to the monitor, usually 20″-24″ away. This means the screen fills more of your field of view than an HDTV (watched from the couch) or a smartphone/tablet would (virtual reality headsets excepted). At such close range, the advantages of 1440p or 4K resolution are even more pronounced.
Basically, you want a screen on which you can never make out individual pixels. You can check this with an online tool that calculates pixel density (in pixels per inch), which indicates the relative "sharpness" of a screen by determining how tightly packed its pixels are; some tools instead compare pixels per degree of vision against the limits of human eyesight.
You should also consider your eyesight and desk setup. If your vision is 20/20 and your eyes are about 20 inches from the screen, a 27-inch 4K panel provides an immediate visual upgrade. If you know your eyesight is worse than 20/20, or you prefer to sit 24 inches away, a 1440p panel may serve you just as well.
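The pixel-density calculation those online tools perform is simple enough to sketch yourself: divide the diagonal pixel count by the diagonal screen size. A minimal version in Python:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal length in pixels divided by diagonal size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# A 27-inch 4K panel vs. a 27-inch 1440p panel:
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
print(round(ppi(2560, 1440, 27)))  # ~109 PPI
```

The 4K panel packs roughly 50% more pixels into each inch, which is why the difference is visible at close desk distances.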
#3 Color
When viewing two displays side by side, it's often easy to see that one has more vibrant hues, deeper blacks, or a more lifelike color palette. It's harder to build that picture in your head from a spec sheet, though, because there are multiple ways to evaluate a display's color, and no single spec to focus on: contrast, brightness, black level, color gamut, and more all play a part. Let's define each of these terms before moving on to the larger color features.
Contrast ratio is one of the most fundamental measures of monitor performance, expressing the ratio between the brightest white and the darkest black a screen can display. A baseline contrast ratio (such as 1,000:1) means the white parts of the image are 1,000 times brighter than the dark parts.
Where contrast is concerned, higher is better. A high contrast ratio (such as 4,000:1) means bright highlights, deep blacks, and dark areas where detail is still visible. A 200:1 contrast ratio, by comparison, means blacks look more like gray, and colors look faded and poorly distinguished from one another.
Be wary of LCDs advertising very high "dynamic contrast" ratios, which are achieved by changing the behavior of the backlight. For gaming or everyday use, the standard "static" contrast ratio discussed above is a better marker of monitor quality.
Brightness is usually measured as luminance, a precise measure of how much light a screen emits, in units of candelas per square meter (cd/m²), also known as "nits". For HDR displays, VESA (the Video Electronics Standards Association) has standardized a set of luminance tests using specific test patches. When comparing luminance specs, check that they use this consistent test methodology rather than proprietary metrics.
In every LCD screen, some light from the backlight inevitably leaks through the liquid crystal layer. An LCD with zero light leakage would have an infinite contrast ratio, but current LCD technology cannot achieve this.
"Glow" is a special problem in the dark viewing environment, which means that to achieve low black levels is the main selling point of the LCD display. However, unless the LCD screen is completely off, it can not reach 0 nits of black level.
Monitors need to display many subtle shades of color. If they can't transition smoothly between slightly different tones, we see color "banding" on screen: an abrupt shift between two distinct colors, producing visible stripes of lighter and darker tint where we should see a seamless gradient. This is sometimes referred to as "crushing" the colors.
A monitor's ability to display subtly different colors, and thus avoid banding and inaccuracy, is measured by its color depth. Color depth specifies the amount of data (in bits) the screen can use to construct the color of a single pixel.
Each pixel on screen has three color channels (red, green, and blue) that light up at varying intensities to create (usually) millions of shades. 8-bit color means each color channel uses 8 bits. The total number of shades possible on a screen with 8-bit color depth is 2⁸ × 2⁸ × 2⁸ = 16,777,216.
Common color depths:
6-bit color = 262,144 colors
8-bit color, or "true color" = 16.7 million colors
10-bit color or "dark" = 1.07 billion colors
True 10-bit monitors are rare; many monitors instead use internal color processing, such as FRC (Frame Rate Control), to approximate a greater color depth. A "10-bit" monitor may actually be an 8-bit monitor with an additional FRC stage, usually written as "8+2FRC".
High Dynamic Range (HDR)
HDR monitors display brighter images with better contrast and retain more detail in the light and dark areas of the screen. With an HDR monitor, you can better spot objects moving down a dark corridor in a horror game or see more dramatic sunlight in an open-world game.
While they're at their best with HDR content (which only certain games and movies support), these monitors typically offer 10-bit color depth and backlights with wide color gamut support, which also improve standard (SDR) content. (Note that HDR monitors are usually not true 10-bit panels, but 8+2FRC panels that accept a 10-bit input signal.)
For LCD monitors, a high-end backlight feature called local dimming is critical to HDR quality. Dimming zones behind the screen control the brightness of individual banks of LEDs. More dimming zones mean more precise control, less "blooming" (bright areas of the image lifting the brightness of adjacent dark areas), and generally improved contrast.
#4 Refresh Rate
The refresh rate is the frequency at which the entire screen redraws its image. Higher refresh rates make on-screen motion look smoother, because the screen updates the position of every object more often. This can make it easier for competitive players to track moving enemies in a first-person shooter, or simply make a screen feel more responsive when scrolling a web page or opening an app on your phone.
Refresh rate is measured in hertz: a 120Hz refresh rate, for example, means the monitor refreshes every pixel 120 times per second. 60Hz was long the standard for PC monitors and smartphones, but manufacturers are increasingly adopting higher refresh rates.
The benefits of jumping from 60Hz to 120Hz or 144Hz are obvious to most gamers, especially in fast-paced first-person games. (However, you'll only see the benefit if your GPU is powerful enough to render frames faster than 60fps at your chosen resolution and quality settings.)
Higher refresh rates make it easier to track moving objects with your eyes, make fast camera movements feel smoother, and reduce perceived motion blur. The online community is divided on how much improvement monitors above 120Hz offer; if you're interested, it's worth testing one yourself to see how much impact it has for you.
#5 Response Time
Response time is a measure of how long it takes a single pixel to change color (measured in milliseconds). Lower response time means fewer visual artifacts, such as motion blur or "trails" after moving images.
The response time must be fast enough to keep up with the refresh rate. For example, on a 240Hz screen, a new frame is sent to the screen every 4.17 milliseconds (1000/240 = 4.17).
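The frame-time arithmetic above generalizes to any refresh rate; a short Python sketch of the relationship:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time between screen refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {frame_time_ms(hz):.2f} ms per frame")
# 60Hz  -> 16.67 ms
# 120Hz -> 8.33 ms
# 144Hz -> 6.94 ms
# 240Hz -> 4.17 ms
```

At 240Hz the pixel has barely 4ms to finish changing color before the next frame arrives, which is why response time matters more on high-refresh panels.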
Manufacturers often quote a "gray-to-gray" response time: the time it takes a pixel to change from one shade of gray to another. The quoted figures usually represent the manufacturer's best result across various test conditions rather than a dependable average.
A pixel-acceleration technique known as overdrive can also affect test results. Overdrive applies an increased voltage to pixels in order to speed up their color changes. Carefully tuned, overdrive reduces visible trailing and ghosting (faint double images following moving objects). Poorly tuned, it can "overshoot" the intended color value and cause visual artifacts of its own.
Cranking up the overdrive can produce better gray-to-gray results, but it can also introduce visual artifacts that aren't disclosed alongside those best-case numbers. Because of all these factors affecting reported response times, independent reviews, which measure response times consistently across manufacturers, are your best reference.
Players sometimes confuse response time with input lag, the delay (in milliseconds) before your actions appear on screen. Input lag is felt rather than seen, and minimizing it is a priority for many fighting game and first-person shooter players.
Input lag is a side effect of processing done by the monitor's scaler and internal electronics. Selecting "Game Mode" in your monitor's on-screen menu usually bypasses some of this image processing and reduces input lag. Disabling "Vertical Sync" in a game's options menu (a feature that prevents screen tearing) can also help reduce input lag.