DWM and mixed refresh rate performance
Are mixed refresh rates still an issue? Let's find out!
Let's head back to May 27, 2020.
Microsoft releases a new feature update (build) they called "2004" (codename 20H1).
Along with "DirectX 12 Ultimate", this build introduced some changes to how DWM (Desktop Window Manager) operates.
This was a sought-after change for many: DWM would now sync its composition updates to the fastest monitor's refresh rate instead of the slowest.
Some people claimed it was a great success, whereas others claimed there were still a lot of issues.
When this update dropped, I was quite curious as to why some people found their issues solved, whereas others swore running mixed refresh rates was still a big problem.
Being a curious boi, and having 2 monitors capable of different refresh rates, I did some basic testing around that time, but never anything in depth or data driven. Mainly it was having hardware accelerated apps running on both monitors and messing with the refresh rates.
My good old Zowie monitor was capable of up to 144hz, while also letting me select 120, 100 and 60hz out of the box. This was paired with a Dell U2415 (60hz only).
In my basic testing it FELT like 144hz performed a lot worse than 120hz did. Speaking to some friends who were in a similar situation, they did not seem to share my experience. Maybe it was all in my head, but I ended up keeping my monitor at 120hz.
I saw several others parroting my findings: running a multiple of the lowest refresh rate had positive effects. To me, this made a lot of intuitive/logical sense, as updating the slower monitor every other cycle (in sync) seemed a lot simpler and more optimal than doing it at an irregular interval of roughly 2.4 cycles (144/60).
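To illustrate that intuition, here's a little back-of-the-envelope sketch (just arithmetic, not a model of how DWM actually schedules anything): how far does each 60hz frame deadline land from the nearest tick of the faster monitor?

```python
# Hypothetical sketch: distance from each 60hz frame deadline to the
# nearest tick of a faster monitor. Not a model of DWM internals.

def phase_offsets(fast_hz, slow_hz=60, frames=5):
    """Distance (ms) from each slow-monitor deadline to the nearest fast tick."""
    fast_period = 1000.0 / fast_hz
    offsets = []
    for n in range(1, frames + 1):
        deadline = n * (1000.0 / slow_hz)   # n-th 60hz deadline, in ms
        r = deadline % fast_period          # time past the last fast tick
        offsets.append(round(min(r, fast_period - r), 3))
    return offsets

print(phase_offsets(120))  # [0.0, 0.0, 0.0, 0.0, 0.0] -> every deadline lands on a tick
print(phase_offsets(144))  # drifting non-zero offsets, realigning only every 5th frame (144/60 = 12/5)
```

At 120hz every 60hz deadline coincides exactly with a tick; at 144hz the offsets wander and only line back up every fifth frame.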
Bothering to actually test...
Recently I got into a minor discussion regarding this, and I essentially had to admit that my experience was exactly that: a single person's experience, plus some hearsay.
Feeling in the mood to do a few short tests, I fired up one of my favorite test cases, "Serious Sam Fusion 2017".
It has a built-in benchmark mode, which is quite flexible/adjustable, as well as having multiple Graphics APIs available (DX11, DX12 and Vulkan).
In addition, it logs benchmark data to a file, including highs, lows, averages etc.
Serious Sam 2017 Fusion (dx11 and dx12)
32 gigs of crappy ram
I ran a few tests on 144hz + 60hz, as well as 120hz + 60hz, in both DX11 and DX12 modes.
I set up OBS with the most basic scene (a single game capture), at 1080p60, NV12, 709, partial range. I had it open on the 60hz monitor and enabled recording with NVENC (stock settings).
I let the game auto pick its own settings, with the exception of Graphics API.
Vertical sync was off, and framerate was uncapped.
It was quite bizarre to see how much of a difference this made. It almost seemed stuck hovering around 144 fps, but it did in fact go quite a lot higher much of the time.
GPU-Z seemed to indicate that the GPU load was not maxed out, but the PerfCap reason was still VRel (which it usually is when my GPU is maxed out).
So if we do some comparisons:

184.95 / 135.15 ≈ 1.37 (a ~37% increase)
~37% higher average frames per second at 120hz

15.25 / 9.25 ≈ 1.65 (a ~65% increase)
~65% higher 1% low frame times (milliseconds) at 144hz, i.e. noticeably worse lows
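As a quick sanity check of that arithmetic (the figures come from the benchmark logs; the variable names are just my own labels):

```python
# Sanity-checking the comparison above. Figures are from the benchmark
# logs; variable names are my own labels for the two configurations.

avg_fps_120, avg_fps_144 = 184.95, 135.15   # average fps per configuration
low1_ms_144, low1_ms_120 = 15.25, 9.25      # 1% low frame times, in ms

fps_gain = (avg_fps_120 / avg_fps_144 - 1) * 100
low_penalty = (low1_ms_144 / low1_ms_120 - 1) * 100

print(f"~{fps_gain:.0f}% higher average fps at 120hz")            # ~37%
print(f"~{low_penalty:.0f}% higher 1% low frame times at 144hz")  # ~65%
```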
And then there is DX11...
Everything remained the same, except for the Graphics API of course.
After the initial DX12 results, I certainly expected the DX11 benchmarks to go the same way, but the refresh rate does not seem to matter very much there. At least as far as the numbers are concerned.
Maybe there is some latency/feeling/magic that is missing when doing non-multiples, but I'd rather not speculate or make any claims on that for the time being.
When I think back on my original experience, it did in fact happen to be a DX12 title, but that was not something I initially considered relevant. This could also explain why some people claimed there were no issues for them, even when running non-multiple refresh rates.
Dragging my feet
At this point, it's been several years, and this isn't really all that topical anymore. Still, I just wanted a little write-up, with some actual data and tests, that can hopefully demonstrate there is something to this whole "multiples of the lowest refresh rate" saying.
Thank you for your time <3