Intel HD4000 (and similar): disable VSYNC

Added 24 Oct 2013, 6:16 a.m. edited 18 Jun 2023, 1:12 a.m.
I've waited a long time for Intel CPUs to have decent integrated graphics, and although people seem to dismiss GPUs like the HD4000, something that can handle really intensive games like GTA4 and is openly documented (allowing for really good quality open source drivers) shouldn't be so quickly overlooked! As you can tell, I'm a fan. But I just came across a slight gotcha: normally you'd want VSYNC enabled, especially if you're on battery power, but if you're trying to optimize some OpenGL rendering, having your frame rate pegged at 60fps really isn't too useful!

My initial port of call was to create an xorg.conf rather than rely on autoprobing. A tip here: if Xorg -configure won't run (which it didn't for me), it may well still leave a file called xorg.conf.new in /root/. Oddly, despite -configure failing, this config worked fine (I did edit it later to remove some spurious devices and screens). Once I had an xorg.conf I then added
Option     "SwapbuffersWait"    "False"
to the appropriate Device section and duly got
[    31.019] (**) intel(0): SwapBuffers wait disabled
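For reference, the resulting Device section looks roughly like this (the Identifier below is just a placeholder; keep whatever name your generated xorg.conf.new gave the card):

Section "Device"
    Identifier  "Intel Graphics"          # placeholder; use the name -configure generated
    Driver      "intel"
    Option      "SwapbuffersWait"  "False"
EndSection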
Alas, for some reason I was still pegged at 60fps! Unfortunately there seems to be a bug in driconf, which I used to create ~/.drirc (my next port of call): it gets the device name very wrong (some out-of-production chipset). In any case, I discovered that what actually works is using a driver name of dri2! Go figure, I'd never have guessed that one; just the price you pay occasionally for flexibility, I guess.
<device screen="0" driver="dri2">
    <application name="Default">
        <option name="vblank_mode" value="0"/>
    </application>
</device>
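For completeness, a whole ~/.drirc wrapping that entry would look something like this (assuming the file contains nothing else; the driconf root element is the usual wrapper that driconf generates):

<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <!-- 0 means never wait for vblank -->
            <option name="vblank_mode" value="0"/>
        </application>
    </device>
</driconf>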
While my distro doesn't use the same multi-part xorg config style, this page was rather useful: https://wiki.archlinux.org/index.php/Talk:Intel_Graphics#Disable_VSYNC A big thank you to whoever figured this out; why it isn't in the actual wiki page, who knows. Anyhow, glxgears now renders at silly rates, and I've discovered that jMonkeyEngine optimizes rendering really well no matter how badly you code!
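As an aside, Mesa also reads vblank_mode from the environment, so for a quick one-off test (without touching xorg.conf or ~/.drirc) something along these lines should behave the same way:

# disable vsync for just this one process via Mesa's environment override
vblank_mode=0 glxgears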