Does LoG2 use real triple buffering?

Having trouble running Legend of Grimrock 2, or do you have questions about the purchasing options? Look for help here.
eLPuSHeR
Posts: 676
Joined: Tue Jan 08, 2013 7:42 pm

Does LoG2 use real triple buffering?

Post by eLPuSHeR »

According to this old article on Anandtech, real triple buffering only works for OpenGL.

I am wondering if the VSYNC + triple buffering option in LoG2's configuration is the real thing.

I have been suffering terrible input lag since the game came out, and it is almost completely alleviated by setting pre-rendered frames in nVidia's control panel to the lowest value (1).

Could anyone shed any light on this?

Of course, when playing with VSYNC OFF I notice annoying and terrible screen tearing. Right now my monitor's refresh rate is 75 Hz and I have also capped the maximum framerate at 75. My i7 seems strong enough to maintain a steady 75 fps.
Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
GeForce GTX 970 (Gigabyte)
Dr.Disaster
Posts: 2876
Joined: Wed Aug 15, 2012 11:48 am

Re: Does LoG2 use real triple buffering?

Post by Dr.Disaster »

eLPuSHeR wrote:According to this old article on Anandtech, real triple buffering only works for OpenGL.
In regard to the nVidia driver setting, this is true: the triple buffering setting there works only with OpenGL.
eLPuSHeR wrote:I am wondering if the VSYNC + triple buffering option in LoG2's configuration is the real thing.
Definitely. The triple buffering option in the game options prevents the fps cuts usually seen when VSYNC notices that the GPU can't deliver the requested frames per second.
eLPuSHeR wrote:I have been suffering terrible input lag since the game came out and it's almost alleviated by setting pre-rendered frames in nVidia's control panel to the lowest value (1).

Could anyone shed any light on this?
I've noticed some input lag here and there, but it's too sporadic to call it terrible. It might be a side effect of the LoG2 engine trying to distinguish between a normal right-click attack and a right-click-hold attempt for a special attack. I did not look into it further because I could not reproduce it reliably.
eLPuSHeR wrote:Of course, when playing with VSYNC OFF I notice annoying and terrible screen tearing. Right now my monitor's refresh rate is 75 Hz and I have also capped the maximum framerate at 75.
Running with VSYNC off naturally results in screen tearing, which was the reason to implement a VSYNC option.

Setting the max framerate in "grimrock.cfg" only matters when you run the game without VSYNC.
Enabling VSYNC renders this setting irrelevant unless it is set too low.

Running VSYNC without the triple buffering option results in massive jumps in fps when the CPU/GPU combo can't deliver.
They probably go like this for your setup: 75 -> 37.5 -> 25 -> 18.75 -> 15 -> 12.5 -> ..
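
To illustrate why the steps fall exactly there, here is a minimal Python sketch of the usual textbook buffering model (the 75 Hz refresh and the frame render times are just example numbers, nothing is taken from the LoG2 engine): with double-buffered VSYNC a frame that misses a refresh has to wait for the next vblank, so fps snaps to 75 / n, while triple buffering lets the GPU keep rendering into a spare buffer so fps only falls to what the hardware can actually deliver.

import math

# Minimal sketch: double- vs. triple-buffered VSYNC at an assumed 75 Hz refresh.
# The frame render times below are made-up examples, not measured LoG2 numbers.
refresh_hz = 75.0
interval = 1.0 / refresh_hz  # one refresh period, roughly 13.3 ms

# Double buffering quantizes fps to refresh_hz / n, giving the 75 -> 37.5 -> 25 -> ... ladder.
print([round(refresh_hz / n, 2) for n in range(1, 7)])

for render_ms in (10.0, 15.0, 20.0):
    render_s = render_ms / 1000.0
    double_fps = refresh_hz / math.ceil(render_s / interval)  # frame waits for the next vblank
    triple_fps = min(refresh_hz, 1.0 / render_s)              # GPU keeps rendering into a spare buffer
    print(f"{render_ms:.0f} ms/frame: double {double_fps:.1f} fps, triple {triple_fps:.1f} fps")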

nVidia drivers now have a VSYNC option that works similarly to triple buffering and prevents these fps cuts: it's called "adaptive". Yet it's a rather crude mechanism that simply disables VSYNC when the GPU can't deliver the needed frame rate. This means that when fps are too low you will still have fluid video, but again with tearing.

I've seen in another post that your monitor supports 60 and 75 Hz. Did you test with both 60 and 75, or only 75?
eLPuSHeR wrote:My i7 seems strong enough to maintain a steady 75fps.
Did you verify this, e.g. by running FRAPS or setting "debugInfo = true" inside grimrock.cfg?
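
For reference, a rough sketch of what the relevant lines in grimrock.cfg could look like; only the debugInfo key is confirmed in this thread, while the names of the framerate-cap and VSYNC keys are my guesses and may differ in the actual file:

debugInfo = true      -- confirmed above; used to show fps/debug info in game
maxFrameRate = 75     -- guessed name for the framerate cap mentioned earlier
vsync = true          -- guessed name for the VSYNC toggle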

I'm running a 1st-gen i7 / GTX 760 combo and I know it can't always maintain 75 fps at 1920x1080, because there are spots/areas in the game where it can't even maintain 60 on max settings. Yet thanks to the triple buffering setting in the game options I do not notice the reduction in framerate. Without showing FPS on screen I would not know about the fps drop at all.
eLPuSHeR
Posts: 676
Joined: Tue Jan 08, 2013 7:42 pm

Re: Does LoG2 use real triple buffering?

Post by eLPuSHeR »

I am not running at native resolution (1920x1080) but at 1440x900.

I once tested nVidia's adaptive VSYNC and I still got tearing (obviously), so I prefer real VSYNC ON.

Yes, for measuring FPS I enabled the debug line in the LoG2 configuration file.

I could try running 60 Hz instead of 75 Hz. I don't really notice any difference in the real world. Higher refresh rates could be good for old CRT monitors, but it seems that isn't the case with LCD/TFT displays.
Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
GeForce GTX 970 (Gigabyte)
Dr.Disaster
Posts: 2876
Joined: Wed Aug 15, 2012 11:48 am

Re: Does LoG2 use real triple buffering?

Post by Dr.Disaster »

eLPuSHeR wrote:I am not running at native resolution (1920x1080) but at 1440x900.
Huh? Why?
eLPuSHeR wrote:Yes, for measuring FPS I enabled the debug line in the LoG2 configuration file.
Which values do you see in general and at the most demanding spots, e.g. at the entrance to Forgotten River from Twigroot Forest?
eLPuSHeR wrote:I could try running 60 Hz instead of 75 Hz. I don't really notice any difference in the real world. Higher refresh rates could be good for old CRT monitors, but it seems that isn't the case with LCD/TFT displays.
Aye, higher refresh rates (70+ Hz) on CRTs were the way to go for an eye-friendly display, while on flatscreens 60 Hz is plenty. IMO the only reason to go higher than 60 Hz with LCD/TFT displays is for 3D, i.e. 2 x 60 = 120 Hz or even more, depending on how good the display is.
eLPuSHeR
Posts: 676
Joined: Tue Jan 08, 2013 7:42 pm

Re: Does LoG2 use real triple buffering?

Post by eLPuSHeR »

Dr.Disaster wrote:
eLPuSHeR wrote:I am not running at native resolution (1920x1080) but at 1440x900.
Huh? Why?
eLPuSHeR wrote:Yes, for measuring FPS I enabled the debug line in the LoG2 configuration file.
Which values do you see in general and at the most demanding spots, e.g. at the entrance to Forgotten River from Twigroot Forest?
I am running at 1440x900 because I do not like the 16:10 aspect ratio. Also, everything is smaller at native resolution; Steam overlay text is almost unreadable for me. Damn TFTs. On CRTs all resolutions looked good.

I have been doing more tests and everything seems maxed out and stable at both 60 Hz and 75 Hz.

For my setup, it seems the pre-rendered frames setting is what matters most. That's why I miss OpenGL; D3D has always been quite unoptimized.
Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
GeForce GTX 970 (Gigabyte)
