Linux Nvidia 650M bad performance

6 replies [Last post]
bender
User offline. Last seen 40 weeks 6 days ago. Offline
Joined: 2010-11-30
Posts: 61

I have an Nvidia 650M configured with Bumblebee as the ArchWiki says it should be. optirun gives me double the fps in the glxspheres test, so I guess it's working fine.

Unfortunately hedgewars runs badly on the nvidia card. The game shows ~64-70 fps with optirun, but it tears so much that it's unplayable. Actually, 60 fps on the Intel card (no optirun) gives me better quality.

It's not a CPU issue, because none of the cores went over 30% during the game.

Can I do something about this?
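For reference, roughly what I ran to check that Bumblebee works (a sketch; glxspheres ships with VirtualGL and the binary may be named glxspheres64 on 64-bit distros):

```shell
# Baseline: the integrated Intel GPU renders glxspheres
glxspheres

# Same test routed through the NVIDIA card by Bumblebee
optirun glxspheres
```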

My game0.log

nemo
User offline. Last seen 10 hours 29 min ago. Offline
Joined: 2009-01-28
Posts: 1861

Does optirun disable vsync?
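One quick way to check (assuming Mesa drives the Intel side and the NVIDIA binary driver sits behind optirun; each driver reads its own environment variable, so this is a sketch, not gospel):

```shell
# Mesa (Intel side): 0 disables vsync, 1 syncs to the display refresh
vblank_mode=0 glxgears
vblank_mode=1 glxgears

# NVIDIA binary driver: same idea via its own variable
__GL_SYNC_TO_VBLANK=1 optirun glxgears
```

If the fps snaps to the monitor refresh rate with the variable set to 1 and runs free with 0, vsync is the difference.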

--
Oh, what the heck. 1PLXzL1CBUD1kdEWqMrwNUfGrGiirV1WpH <= tip a hedgewars dev

bender
User offline. Last seen 40 weeks 6 days ago. Offline
Joined: 2010-11-30
Posts: 61

It's possible. I've googled a lot for it, but no luck.

nemo
User offline. Last seen 10 hours 29 min ago. Offline
Joined: 2009-01-28
Posts: 1861

The reason I ask is that 60fps sounds like vsync to me.
The wildly variable rate w/ optirun sounds like no vsync.

No vsync could easily lead to tearing.
Hedgewars is not the sort of game that should benefit from 120fps and whatnot.

Assuming you could attain it... perhaps you need to adjust your setup, since 75fps is awfully low. When I disable vsync and the game's frame rate limits, I can approach the 1000fps cap imposed by the game on my GeForce 9800T under linux. Something like 975fps. That is of course a waste of CPU, but useful when measuring perf tweaks.

--
Oh, what the heck. 1PLXzL1CBUD1kdEWqMrwNUfGrGiirV1WpH <= tip a hedgewars dev

bender
User offline. Last seen 40 weeks 6 days ago. Offline
Joined: 2010-11-30
Posts: 61

Turns out optirun was a failed idea from the beginning, at the design level. The new "cool" way to use Bumblebee is primusrun, which has much better performance with a forced 60fps.

The end result is that "primusrun hedgewars" makes hedgewars run fine at exactly 60fps. Since Intel gives exactly the same quality at exactly the same fps, there is no point in using nvidia at all. At least playing hedgewars will be more energy efficient Smile
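In command form (PRIMUS_SYNC comes from the primus README; its exact behavior may vary between versions, so treat this as a sketch):

```shell
# Launch through primus instead of the old optirun path
primusrun hedgewars

# primus also reads PRIMUS_SYNC to control how frames are
# synced to the display (see the primus README for the values)
PRIMUS_SYNC=1 primusrun hedgewars
```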

It's worth noting that KDE compositing causes screen tearing with every launch method.

nemo
User offline. Last seen 10 hours 29 min ago. Offline
Joined: 2009-01-28
Posts: 1861

Ah. Yeah, I often suggest people turn off compositing.
Regardless of OS it can help, but under linux there's the added issue of the bunch of unpredictable compositing effects people might be using, open source drivers that are still reverse-engineering the hardware or half-assed binary drivers from manufacturers, and, well, as the mozilla guys can attest, the sorry state of xrender and support for it.
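If you want to test with compositing off: on KDE 4 of that era you can toggle it on the fly with Alt+Shift+F12, or script it over D-Bus (service and method names are from memory and differ between KDE versions, so verify them first):

```shell
# List KWin's D-Bus services to confirm the exact names on your system
qdbus | grep -i kwin

# Toggle KWin compositing (KDE 4 era interface; names may differ)
qdbus org.kde.kwin /KWin toggleCompositing
```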

Not too sure why you're wrapping hedgewars. Should run fine w/o launching it from something else.
But, whatev *shrug*

--
Oh, what the heck. 1PLXzL1CBUD1kdEWqMrwNUfGrGiirV1WpH <= tip a hedgewars dev

bender
bender's picture
User offline. Last seen 40 weeks 6 days ago. Offline
Joined: 2010-11-30
Posts: 61

nemo allegedly wrote:

Not too sure why you're wrapping hedgewars. Should run fine w/o launching it from something else.
But, whatev *shrug*

That's exactly how this case ended. There is no benefit to using primusrun for hedgewars; Intel handles it perfectly well. I was testing all of this because I thought the screen tearing was Intel's fault, but it was compositing all along.

Copyright © 2004-2023 Hedgewars Project. All rights reserved. [ contact ]