FPS Problem

2 replies
thoghart
Joined: 2011-08-28
Posts: 2

I previously used Hedgewars 9.15 and had around 70 fps with the highest graphics setting most of the time (unless there were leaves in the background). I updated to 9.16, but now the graphics are slow: I get 23 fps on the lowest quality setting. What's going on?

od107
Joined: 2011-10-20
Posts: 4

My fps also dropped from 125 (the max) to about 45 after the upgrade.
I guess it's normal that new features mean slower performance, but imo the graphics haven't changed much, and I am not using stereo...
I was curious about the grayscale stereo, but in that mode it completely grinds to a halt (normal stereo works fine).

nemo
Joined: 2009-01-28
Posts: 1861

Greyscale should perform almost identically to regular rendering; from OpenGL's perspective they are the same.
The glColor operations do have a few multiplies before them, but in the context of the game's overall math, that's negligible.

As for the frame rate dropping, could both of you post game logs somewhere?
My guess is that you lost graphics acceleration somewhere.
Log would provide clues.

For Windows users, setting up GL should be identical to before; we disabled all of these flags due to really horrible Windows drivers.

For Linux and OSX, the following is now being called before context creation, where it actually does something, instead of after, where it didn't do anything...

SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
// vsync is default in SDL 1.3
SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, LongInt((cReducedQuality and rqDesyncVBlank) = 0));
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 0); // no depth buffer
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 0); // no alpha channel required
SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, 16); // buffer has to be 16 bit only
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1); // try to prefer hardware rendering

In theory, what is *supposed* to happen is that SDL looks for a driver/visual that satisfies all these attributes.
Basically all of them are things we need or that usually improve performance (like vsync), but if your card/driver wasn't reporting support for them correctly, context setup might fail to get acceleration and fall back to software rendering. That's what was happening under Windows before we disabled setting these and just did "create-context-and-pray".

We might have to disable this for all platforms.

You can try updating your graphics card driver. Under Linux, try the non-free nvidia/fglrx drivers, or if you're on Intel, a more recent version of your distro or of the driver.

--
Oh, what the heck. 1PLXzL1CBUD1kdEWqMrwNUfGrGiirV1WpH <= tip a hedgewars dev

Copyright © 2004-2023 Hedgewars Project. All rights reserved. [ contact ]