Nvidia's new partnership with MediaTek has just killed the module which made G-Sync monitors so damned expensive
Nvidia has announced it has partnered with MediaTek to produce a scaler chip for gaming monitors that has the full G-Sync feature set built in. Rather than having to use a separate G-Sync module, display vendors can now use this single chip to bring Nvidia's variable refresh rate (VRR) system to more products.
In 2013, Nvidia launched G-Sync, a system that lets a monitor vary its refresh rate so that when the GPU finishes rendering a frame, the display can show it immediately instead of waiting for the next scheduled refresh. Without that synchronisation, there's a risk of the frame buffer being swapped mid-refresh, resulting in a 'tear' across the screen.
VRR also greatly reduces any stuttering induced by differences between the display's refresh rate and a game's frame rate.
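To make the timing concrete, here's a toy sketch of when a finished frame actually reaches the screen on a fixed-refresh panel versus a VRR one. The numbers and functions are purely illustrative, not how a real display controller works:

```python
# Toy comparison (hypothetical numbers): a fixed 60 Hz panel either holds a
# finished frame until the next refresh tick (vsync on: added latency/stutter)
# or swaps it mid-scan (vsync off: tearing). A VRR panel simply starts a
# refresh the moment the frame is ready.
import math

FIXED_HZ = 60.0
REFRESH_PERIOD_MS = 1000.0 / FIXED_HZ  # ~16.7 ms between fixed refreshes

def fixed_refresh_display_time(frame_ready_ms: float) -> float:
    """With vsync on, the frame waits for the next fixed refresh tick."""
    return math.ceil(frame_ready_ms / REFRESH_PERIOD_MS) * REFRESH_PERIOD_MS

def vrr_display_time(frame_ready_ms: float) -> float:
    """With VRR, the display refreshes as soon as the frame is done."""
    return frame_ready_ms

for ready in (10.0, 18.0, 25.0):  # times at which the GPU finishes a frame
    print(f"frame ready {ready:5.1f} ms -> fixed: "
          f"{fixed_refresh_display_time(ready):5.1f} ms, "
          f"VRR: {vrr_display_time(ready):5.1f} ms")
```

A frame that's done at 18 ms sits around until 33.3 ms on the fixed panel but goes straight out on the VRR one, which is exactly the latency and stutter G-Sync was built to remove.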
G-Sync isn't the only VRR technology, though: the DisplayPort 1.2a and HDMI 2.1 standards both include one. AMD also has a VRR system called FreeSync, which is built on the DisplayPort version, and it has been substantially improved since it first appeared in 2015.
FreeSync doesn't require any additional hardware inside the monitor, just a display that can adjust its refresh rate over a given range (e.g. between 30 Hz and 144 Hz). However, if a monitor manufacturer wants to offer full G-Sync support, it needs to buy and fit a separate add-in board carrying Nvidia's G-Sync processor and a small amount of RAM.
That adds to the bill of materials, and since FreeSync is also royalty-free, vendors such as Asus, Acer, Gigabyte, and MSI have preferred to go with AMD's system (especially since it works with AMD, Intel, and Nvidia GPUs).
That's why Nvidia has teamed up with MediaTek to produce a scaler chip with the full G-Sync feature set built in, including the latest Pulsar technology, a system that reduces motion blur to keep small details as clear as possible, even when whipping the camera about in a game.
Three vendors (Acer, AOC, and Asus) have already announced gaming monitors that will use the chip: all of them 27-inch, 1440p displays with a maximum refresh rate of 360 Hz.
There's no word on how expensive the Predator XB273U F5, Agon Pro AG276QS2, and ROG Swift PG27AQNR will be, or when they will be available to buy, though I should imagine that an announcement will be made soon enough.
The more important question to ask, regardless of the price tag, is "Why should I buy a G-Sync monitor instead of a FreeSync one?" On paper, there's little to separate the two technologies, and it mostly comes down to how each one is implemented in a given gaming monitor.
A display that supports G-Sync Ultimate is guaranteed to meet a certain level of hardware capability, whereas that's not necessarily the case with a standard FreeSync one. AMD does have FreeSync Premium, with higher specifications and more features than the original FreeSync, though.
While you might think that owning a GeForce RTX graphics card means you have to use a G-Sync gaming monitor, the reality is far less restrictive. Nvidia maintains a list of monitors it certifies as G-Sync Ultimate, standard G-Sync, or G-Sync Compatible (i.e. a FreeSync monitor that Nvidia has verified works with its GPUs).
That said, G-Sync Ultimate monitors do have a very wide VRR range, typically from 1 Hz up to the monitor's maximum refresh rate, whereas G-Sync Compatible displays have a narrower one, e.g. 48 Hz to 144 Hz. On those monitors, if a game's frame rate drops below the bottom of the VRR range, the display either falls out of sync (bringing back tearing or stutter) or uses LFC (Low Framerate Compensation), which shows each frame two or more times to bring the effective refresh rate back inside the VRR window.
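To illustrate the idea behind LFC, here's a minimal, hypothetical sketch of the frame-repetition arithmetic. The 48 to 144 Hz window and the function itself are illustrative assumptions, not Nvidia's or AMD's actual implementation:

```python
# Hypothetical sketch of Low Framerate Compensation (LFC): if the game's
# frame rate falls below the panel's minimum VRR rate, repeat each frame an
# integer number of times so the effective refresh rate lands back inside
# the supported window. Note that real LFC needs the window to span at least
# a 2:1 ratio (vrr_max >= 2 * vrr_min) so a multiplier always exists.

def lfc_refresh_rate(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the panel would actually run at."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # inside the window: refresh tracks frame rate
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1  # e.g. 20 fps -> x3 = 60 Hz on a 48-144 Hz panel
    return fps * multiplier  # each frame is displayed `multiplier` times

for fps in (30, 20, 100, 160):
    print(f"{fps} fps -> panel refreshes at {lfc_refresh_rate(fps):.0f} Hz")
```

So a game chugging along at 30 fps on a 48 to 144 Hz panel would have each frame shown twice, with the panel refreshing at 60 Hz, keeping VRR engaged rather than snapping back to tearing or stutter.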
With these new MediaTek-powered monitors, you should get the complete G-Sync Ultimate feature set without the extra cost of a separate G-Sync module. How all of this pans out in the real world… well, we'll let you know when we get one in for review!