VRR support would be nice if it removed the issue of the time it takes a TV to resync when you change refresh rates, but I don't see that it is designed to fix the issue of dropping/repeating frames due to the drift between the audio and video clocks. We currently have two solutions:
1) VideoClock (resample the audio to keep it in sync with the video refresh rate.... but this only works with decoded audio, and some will not like that the audio is resampled)
2) Custom Refresh Rates (fine-tune the out-of-the-box refresh rates to be closer to the audio clock rate. This needs to be done on a PC-by-PC basis, as the clocks in every PC will be different. MadVR now gives us the ability to tune the refresh rate. This works with bitstreaming as well as decoded audio, and does not require either the audio or the video stream to be altered, as they now both play in sync for hours).
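To make the drift concrete, here is a minimal sketch with hypothetical clock values (the 23.971Hz stock mode and 23.9761Hz tuned mode are made-up numbers, not measurements from any real PC): it shows how long each mode runs before a whole frame must be dropped or repeated, and the audio resample ratio that solution 1 would apply instead.

```python
# Hypothetical numbers: a 23.976 fps source, a display whose stock mode
# actually refreshes at 23.971 Hz, and a tuned custom mode at 23.9761 Hz.
video_fps = 24000 / 1001          # nominal source framerate (~23.976 fps)

def minutes_per_frame_slip(display_hz: float) -> float:
    """Minutes until the clock mismatch forces one dropped/repeated frame."""
    return (1 / abs(video_fps - display_hz)) / 60

print(f"stock mode: one slip every {minutes_per_frame_slip(23.971):.1f} minutes")
print(f"tuned mode: one slip every {minutes_per_frame_slip(23.9761):.0f} minutes")

# Solution 1 (VideoClock) instead resamples the audio by the clock ratio,
# slowing/speeding the audio slightly to match the video output clock:
print(f"audio resample ratio for the stock mode: {23.971 / video_fps:.6f}")
```

With these example clocks the stock mode slips a frame every few minutes, while the tuned mode holds sync for over three hours, which is why per-PC tuning makes such a difference.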
VRR flips around the way that video sync happens, which eliminates the need for mode switching.
Instead of the display refreshing at a fixed update rate which the computer (or any other device) has to synchronize its output to, it only refreshes when it has been sent a new frame.
Since these displays don't flicker like CRTs did, they can seamlessly change "refresh rate" without the viewer being aware of it.
If you send it a frame every 40ms, the display is refreshing at "25Hz". Send it a frame every 10ms and it is refreshing at "100Hz". And this can change every frame.
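Those "effective" rates are just the reciprocal of the frame interval; a trivial sketch:

```python
def effective_hz(frame_interval_ms: float) -> float:
    """Effective refresh rate of a VRR display sent one frame every frame_interval_ms."""
    return 1000.0 / frame_interval_ms

print(effective_hz(40))   # 25.0  -> "25Hz"
print(effective_hz(10))   # 100.0 -> "100Hz"
```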
If a frame is presented slightly early or late, you don't get dropped/repeated frames or stutters like you would with a fixed refresh rate display - the VRR display just updates slightly earlier or later, staying completely in sync with the source.
That's why the focus has mainly been on gaming thus far.
With video, you're syncing up a fixed framerate source to a fixed refresh rate display - you just have to get the two close enough that they don't go out of sync over the course of 1-3 hours.
With games, the workload placed on the GPU is highly dynamic. To lock a game to 60Hz with V-Sync enabled, you might need 30% overhead to prevent it from ever dropping below 60 FPS and stuttering.
With a VRR display you can unlock the framerate and let the game run at 60-78 FPS instead (assuming 30% overhead), and it will appear completely smooth since the monitor is updating in sync with the framerate.
It enables you to take full advantage of your GPU instead of having to keep that overhead, and you get a better experience since the average framerate is much higher.
It also means you're far less likely to notice if the framerate drops below 60. You could have a game running at 54-70 FPS instead, still keeping the average above 60 FPS but relaxing the requirements somewhat.
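The headroom arithmetic above can be sketched as follows (the 30% overhead figure is the illustrative assumption from this post, not a universal rule):

```python
target_fps = 60
overhead = 0.30   # assumed headroom so a fixed-refresh display never drops a frame

# Fixed 60Hz + V-Sync: the GPU must be capable of target * (1 + overhead)...
gpu_capability = target_fps * (1 + overhead)   # 78 FPS of capability
# ...but the displayed output is still capped at 60 FPS.

# VRR: unlock the framerate and use all of that capability directly.
print(f"fixed refresh: capped at {target_fps} FPS despite ~{gpu_capability:.0f} FPS of capability")
print(f"VRR: runs at up to ~{gpu_capability:.0f} FPS, with every frame shown in sync")
```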
So it is a much bigger deal for gaming, but having a video player/renderer which supports it has advantages too - especially looking forward to things like HFR content, which adds more framerates to the mix.
We already have HFR content running at 48 FPS which current televisions cannot display - despite that being within the 24Hz to 60Hz range that they do support.
Hopefully that explanation makes it clear why trying to sync up the output from a computer to the display is so chaotic, compared to simply presenting frames to the display and having it update as soon as they are received, and why all displays should support VRR going forward - even for video.