LTSC: does it still make sense with NTLite?

MacVap

New Member
Microsoft (the OS) has been shutting down the graphics driver every 160 milliseconds since Vista, for the purpose of saving energy. That HAS to add some sluggishness and/or latency.

I apply the reg tweak from there everywhere I can and I always see the difference, and so do other people who are not tech-savvy.

But some don't. ¯\_(ツ)_/¯

Edit: It applies to all OSes to date after Vista.
 

Taosd

Active Member
Microsoft (the OS) has been shutting down the graphics driver every 160 milliseconds since Vista, for the purpose of saving energy. That HAS to add some sluggishness and/or latency.

I apply the reg tweak from there everywhere I can and I always see the difference, and so do other people who are not tech-savvy.

But some don't. ¯\_(ツ)_/¯

Edit: It applies to all OSes to date after Vista.

I believe the article says it *can*, not that it *will*. There are also other factors to take into consideration... and it isn't exactly "shutting down the graphics driver". As the article states:
"A Windows Vista with SP1 or later system with a driver that follows the WDDM and that supports this feature will turn off the counting feature of the VSync interrupt if no GPU activity occurs for 10 continuous periods of 1/Vsync, where VSync is the monitor refresh rate. If the VSync rate is 60 hertz (Hz), the VSync interrupt occurs one time every 16 milliseconds. Thus, in the absence of a screen update, the VSync interrupt is turned off after 160 milliseconds. If GPU activity resumes, the VSync interrupt is turned on again to refresh the screen."

I don't actually believe the graphics driver itself is being shut down, just the VSync counter. Also, the article only covers WDDM, not the graphics card manufacturers' own drivers.

I could be well off the mark here, but that is how I read the article, and if I'm wrong, please correct me.
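The arithmetic in the quoted passage is easy to check. Here's a quick Python sketch of the numbers (the 10-period constant and the rounding come from the article; this is just illustrative math, not anything Windows actually runs):

```python
# Sketch of the numbers in the quoted WDDM passage: the VSync interrupt
# fires once per refresh period, and its counting feature is turned off
# after 10 consecutive periods with no GPU activity.
IDLE_PERIODS = 10  # "10 continuous periods of 1/VSync" per the article

def vsync_idle_timeout_ms(refresh_hz: int) -> float:
    """Milliseconds of inactivity before the VSync interrupt is turned off."""
    period_ms = 1000 / refresh_hz  # one refresh period in milliseconds
    return IDLE_PERIODS * period_ms

# The article rounds 1000/60 down to 16 ms, hence its "160 milliseconds".
print(vsync_idle_timeout_ms(60))   # ~166.7 ms on a 60 Hz panel
print(vsync_idle_timeout_ms(144))  # ~69.4 ms on a 144 Hz panel
```

Note that the timeout gets shorter as the refresh rate goes up, so high-refresh monitors would hit this idle path even sooner.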
 

MacVap

New Member
In my understanding, turning the interrupt off and on again takes some CPU/GPU cycles, and that introduces lag. Some instructions need to be executed somewhere; a decision has to be made. No such thing as a free lunch (no decision gets made without using some CPU cycles).

IMHO there is no reason to keep the default, except for saving energy (battery life on laptops).

I and many others see smoother mouse movement when it's turned off.

It was the tipping point for me several years ago; before that, 7 wasn't as smooth as XP, no matter what I did.
 
Last edited:

MacVap

New Member
Code:
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers\Scheduler]
"VsyncIdleTimeout"=dword:00000000
 

Clanger

Well-Known Member
LTSC 1809 and W7 screen grabs for comparison

LTSC 1809
Capture.JPG

w7, in various states
W7-BAREBONE-SERVICES-wNETWORKING-20210921
W7-BAREBONE-SERVICES-wNETWORKING-20210921.jpg

W7-BAREBONE-SERVICES-20210922
W7-BAREBONE-SERVICES-20210922.jpg

older
W7-COMPARISON.PNG
 
Last edited:

4nt

New Member
Microsoft (the OS) has been shutting down the graphics driver every 160 milliseconds since Vista, for the purpose of saving energy. That HAS to add some sluggishness and/or latency.

I apply the reg tweak from there everywhere I can and I always see the difference, and so do other people who are not tech-savvy.

But some don't. ¯\_(ツ)_/¯

Edit: It applies to all OSes to date after Vista.
Well, that may explain why disabling Aero makes everything smoother, from the interface to videos to games. Aero is said to enforce VSync even when it's disabled in the driver, and apparently that's why switching DWM off or using the Windows Classic theme makes it all so much better. I've just applied the tweak but haven't restarted yet (tons of open windows); I'll report back as soon as I do. Thank you so much for the link.

Also, it appears that GeForce cards always use WDDM, and with nVidia being such a good MS follower, it's very likely that it does implement the VSync timeout to save power - esp. on notebooks. I suspect it may be disabled when you set "Prefer maximum performance" in the driver, but I haven't tested that either (yet).

Clanger, I'd never have thought W10 could use less memory than 7; stoked with the news. I'll get my hands dirty with it tomorrow, hopefully, as I think I'll have the time then.

From this thread alone, it looks like I hadn't fiddled remotely as much as I thought I had. Great, I love learning! :)
 

4nt

New Member
It's possible it's just the specific model/drivers I'm using. I haven't spent much time trying to fix it yet, because I've been focusing all my time on taming the OS first; once that's done, I was going to look into this other issue. But I've run LatencyMon on a clean Windows 10 21H2 install and it had perfect DPC latency, except that the nVidia graphics driver kernel would occasionally spike for no reason while idling at the desktop.

Eventually I'll try older drivers and/or swapping out the card to see if that matters, but given the known problems ever since Microsoft started leaning on the multimedia scheduler more and more, messing with DPC-related stuff, and bloating out the OS, I have a feeling it's just going to be a Microsoft and/or nVidia issue that we can't control very much. I do hope I'm wrong; maybe my card is just dying, and that'd be a nice easy fix.
W10 brought WDDM 2.0, upgraded to 2.7 in W10 20H1. There seem to have been significant changes to scheduling, including the VSync-counter disable that MacVap cited above, which might explain those video DPC issues, particularly the spiking during idle. In this case, updating may be better than reverting drivers - but it's MS and nVidia we're talking about; one never knows...

I've just found a very interesting thread on 20H1 & WDDM 2.7 at Guru3D that opens up a whole field of exploration for reg tweakers and NTLite users; they've done quite a bit as well. There are in-depth discussions and explanations in the links throughout the discussion.

*Sigh* I guess I'll have to contain my anxiety until next weekend, as there won't be time to read through all of that and run tests this week.

(On a side note, one thing to check is whether there are any significant differences in scheduling settings between different W10 "flavors" and their releases...)

Edit: I've been saying "VSync disable" when it's actually "VSync-counter disable". If I read it right, it means Windows won't update the screen as often as it should, instead delaying refresh cycles and adding extra overhead to refreshes - which would explain both the DPC latency and the stutters, and which totally defeats, well, vertical *sync*.
I'm struggling to think of a possibly shittier approach to the issue. Sooooo Microsoft...
 
Last edited:

Hellbovine

Active Member
Nice find, MacVap. This may very well go hand in hand with another issue I've been investigating inside the power plan settings. Something I noticed that's extremely concerning, and a problem for all gamers, is that even under load the CPU will STILL constantly downclock by 50% for 1-2 full seconds at a time, about once every 30 seconds or so.

This happens because whenever there are milliseconds' worth of "downtime", the CPU decides to try and save power. You can see it in the frametime responses during benchmarks: normally it's a steady 16ms per frame on my system, but it doubles to 32ms whenever the CPU downclocks, and you can see the spikes. I know I can get rid of the issue with a simple registry tweak (or two), but the problem is testing it and making sure you don't fry your system with overheating: just straight-up disabling all throttling raised my idle temps by a full 30 degrees Fahrenheit, so that's not really a viable option for people. Instead, the reg key for the downclocking just needs a higher delay, so that while you're gaming it never has enough time to see any "downtime".
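To put numbers on that frametime doubling, here's the simple arithmetic using Hellbovine's 16ms figure (just illustrative math, not a benchmark):

```python
# Simple arithmetic on the frametime figures above: a steady 16 ms frame
# is ~62 fps, and doubling it to 32 ms during a downclock spike halves
# the effective framerate for those frames.
def fps(frametime_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000 / frametime_ms

print(fps(16))  # 62.5 fps at a steady 16 ms/frame
print(fps(32))  # 31.25 fps while the CPU is downclocked
```

That halving is why the spikes are so visible in a frametime graph even when the average fps still looks fine.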
 
Last edited:

Hellbovine

Active Member
I don't overclock, I don't turbo boost, I disable all sleep states; CPU and RAM at their default speeds
Yeah, same here. I tried it in the past and it just wasn't worth it; the temperatures destroy the components really fast. I'd probably do it in a water-cooled system, but zero overclocking for me in a fan-cooled one; it's just not worth the headaches. Been there, done that, got the t-shirt.
 

Clanger

Well-Known Member
I had an A10-7890K, and when I clocked it down from the stock 4.1GHz to 3.1 in 100MHz steps, audio rendering times only grew by 2 seconds, maybe 3 max. So a CPU at 3x the price would be maybe 3 seconds faster. I'd rather put that money toward a bigger storage drive, to be honest.

A 2c/4t Pentium Gold turned out to be faster at rendering than a 4c/4t i3-8300.
 