gonetz / GLideN64

A new generation, open-source graphics plugin for N64 emulators.

Banjo-Tooie Low Performance on HIGHEND Computer #664

Closed theboy181 closed 8 years ago

theboy181 commented 9 years ago

Banjo-Tooie has a 60 fps hack http://forum.pj64-emu.com/showpost.php?p=63166&postcount=59

I understand that this is a hack, but this hack and other plugins like JABO, RICE, and Glide64 Final all work great!

I have a 4.6 i7 with a 295x ATI card, and I get slow downs to 30FPS with GLideN64 plugin. I have tested this hack on my 3.2 i7 laptop with Nvidia 780 and the slowdowns are not as frequent, but there are still performance issues.

I have done minor testing as to whether or not it's FBE, but when you turn it off all you get is a white screen with none of the buildings etc. drawn to the screen.

(screenshot attached)

If I disable and re-enable the default FBE effects, it seems to clear up the performance issues a fair bit, but there are still some slowdowns when transitioning between stages as they load.

Unchecking the FBE elements that are selected seems to make this game playable at 60fps. And it looks like the per-game settings are still applied. Not sure where those effects are used.

GamingHD commented 9 years ago

I can't vouch for the technical or coding side of things, but I do know that GLideN64, with all its improvements in accuracy, uses quite a lot more power than any of those other plugins. I use an R9-285 and have minimal issues on games that run at 30fps, but if a game runs at 60 I'll find it dropping frames here and there, and the same goes for the FPS hack - this game is one of the worst for the drops on my system. I've noticed better consistency on Nvidia cards, too. The emulated frame buffer is what has the most impact on performance, so that'd explain why turning it off makes it playable.

Additionally, on a rig with a Titan X I noticed next to no drop. Which is what I expected from my testing.

Hopefully some of that information is relevant.

theboy181 commented 9 years ago

From what I understand, the FBE emulation is set in the per-game ini. So just unchecking the FBE settings and leaving the per-game settings gives back the performance and still uses the FBE effects the game requires. GPU cycles are getting wasted on this game when you have the other two default options checked.

I think that there is something wrong with the plugin not the system specs.

Worth looking into.

purplemarshmallow commented 9 years ago

"Copy frame buffer to RDRAM" and "Copy depth buffer to RDRAM" options are enabled by default. This means the frame buffer and depth buffer are copied once per frame. The higher your frame rate, the more copies per second, so performance suffers more with these options enabled.

theboy181 commented 9 years ago

There is a huge performance loss on my higher end ATI vs my lower end Nvidia card. This doesn't seem correct. Could there be a bug in the code?

purplemarshmallow commented 9 years ago

The plugin reads data from the video card. Some video cards have a better read speed. What happens if you disable "Copy frame buffer to RDRAM" and "Copy depth buffer to RDRAM"? The plugin does not read any data from the video card if you disable these options.

theboy181 commented 9 years ago

I just tested this Game on My computer with GTX 970 and I am able to run full speed with no slowdowns using this hack.

I don't understand how my 295 X2 is not keeping up.. Something is not right.

I also noticed that when using my ATI card, the first boss in the cave doesn't separate after he attacks; he just disappears and reappears when you get close to him.

I thought that Gonetz had been creating code around ATI cards?

ghost commented 9 years ago

Can you provide a savestate at that boss (preferably after he drinks the potion, but before the battle starts, as it sometimes varies)?

Are you using the latest catalyst omega drivers?

phly95 commented 9 years ago

If you are using Jabo audio, then change the settings on the audio plugin (not project64's option, but the plugin's option) to "sync game to audio" and it should be smooth. Same happened to me after getting glide on my new desktop. Also, under game options in project64, set it to virtual table lookup instead of physical table lookup to get rid of stuttering when loading new rooms.

phly95 commented 9 years ago

Physical lookup table is for crappy computers with like 500MB of RAM. They should make virtual the default now because everyone has enough RAM for an N64 game.

EDIT: turns out, virtual lookup table only works with interpreter, not recompiler. If you use recompiler with vlu, it will give you a critical error.

GameRecover commented 9 years ago

Syncing the game to the audio is a good temporary solution. If you have to do this though, it means your computer doesn't have enough power to render all the frames it needs to. Syncing it to audio forces it to render whatever frame the audio is currently up to - meaning it'll skip frames here and there. The frame rate code itself is not the issue - it's the fact it's rendering twice as many (60 as opposed to 30) and your system simply can't maintain twice the frame rate. Disabling the frame buffer helps because the frame buffer uses a lot of system resources to be emulated properly (but it's also a crucial part of many visual effects in n64 games and that's why the game looks like it does when you disable it - it requires it).

In order for GlideN64 to look as good as it does and to be as accurate as it is, it requires a massive build. It's the type of thing where it'd be better off making it slow and accurate (so that future hardware could accommodate it) rather than putting together hacky fixes in order to please individual users. Additionally, a single gpu (isn't the 295x dual?) would be better in this situation, and I believe an Nvidia card would be more stable than AMD in this case also.

I think this should be closed, as it's not so much an issue as an unfortunate case of the software superseding your hardware, even though you do have a very good build. I am sorry.

theboy181 commented 9 years ago

The 295x2 is a 1600 CAD card. This performance issue is not a lack of hardware. The 970 I have is a 450 card.

I'm sure that the issue can be resolved and deserves some attention.

phly95 commented 9 years ago

Yeah, with the good AMD card I have now, I was having some performance issues with no bottleneck I could see. Lowering all the settings didn't help, but adding all the effects (antialiasing etc.) didn't hurt it either. I did sync to audio, and come to think of it, there are a few lost frames. I just don't get what's causing this random microstuttering. I can play Battlefield 3 at max (I didn't bother buying the newest Battlefield), so it can't be the card. I checked CPU usage in Task Manager and it barely uses any, and I have 6GB RAM (I think 2GB VRAM), but I'd rather have a small video stutter than popping audio.

phly95 commented 9 years ago

This was on Zelda MM BTW.

GameRecover commented 9 years ago

Well, I may be wrong. I'm just not sold on the fact it's not a hardware issue (maybe even partially) and I am glad that this is still being investigated!

Just because your build can run a recent game doesn't necessarily mean it can handle emulation though. A good example of this is the bsnes core for SNES emulation. An older emulator like ZSNES runs on both older and newer builds very easily. Higan, Bizhawk and lsnes all have much higher requirements as they use this more accurate emulation core. If you don't believe me, try getting Bizhawk (using the bsnes core) to play games at 4x speed. Even moreso, download lsnes and have a look at how it runs.

Emulation takes a lot more power to compute than traditional gaming; for instance my R9-285 can run any of the Call of Duty games (tried to pick a series with a big area of reference) really well, maintaining well over 60fps on relatively high settings @ 1080p - but it suffers from the same issue that the original poster described.

My other build, with two Titan X cards in SLI doesn't have the issue at all. It can play at the 60fps with the hack with no issue. I'm also assuming as I didn't set any SLI settings that it may only be using one of the cards. Additionally, this build also has a faster processor.

Sorry if my first message (or this one) came across as harsh or closed-minded. I signed up specifically to comment here, and I'm not used to typing on forums, so I type the way I'd talk; no disrespect was meant.

purplemarshmallow commented 9 years ago

I believe the slowdown comes from the "copy frame buffer to RDRAM and "copy depth buffer to RDRAM" options. You can disable them for a speedup. You may need to disable "Use custom per-game settings" too. Some effects can be missing if you disable these options.

phly95 commented 9 years ago

I know emulation is more resource intensive, but I checked GPU and CPU usage and neither was being bottlenecked, so there must be something else. If it was a CPU bottleneck: 1. sync with audio wouldn't work, since the CPU processes all the game events; 2. CPU usage would be a lot higher. Plus I have run this on a laptop with similar results, with a CPU a third as powerful.

If it was a GPU bottleneck, then why on my nice card can I change the settings, high, low, doesn't matter, performance is exactly the same, even with best antialiasing enabled.

The only thing I can come up with is a RAM speed bottleneck. Not running out of RAM, but having DDR3 RAM instead of DDR4. If it is that picky about RAM, then this is crazy.

Otherwise, it may not be a hardware bottleneck, but something in the software limiting speed lower than hardware is capable. This is what I think it really is, a fake bottleneck that is in place for some reason. Maybe I need to try mupen and other emulators, but considering Android phones have the same issue, it must be software.

purplemarshmallow commented 9 years ago

I know there's a problem with some GPU drivers. The driver can't detect that GLideN64 needs high performance. You may need to enable performance mode manually in your driver settings.

theboy181 commented 9 years ago

No disrespect taken. :) I have a feeling there is some AMD issue with the plugin.

I like purplemarshmallow's suggestion; I will have to look into that.

It doesn't make sense that Nvidia handles it just fine and the Radeon doesn't.

LegendOfDragoon commented 9 years ago

@theboy181 you should profile it, after compiling one day. It may help with finding out what's wrong.

theboy181 commented 9 years ago

If you get some free time and the ability for n00bs to compile is out there count me in.

GameRecover commented 9 years ago

I did some testing and found that in both cases, you're right. It's only reaching about 20% on my CPU and a little higher on my GPU at any given time. If I uncap the frame rate limit, rendering at 320x240 with no anti aliasing gives about the same speed as rendering at 960x720 with full anti aliasing. I expected some difference in values but did not see any. This applies to both of the builds I mentioned above.

gonetz commented 8 years ago

Is this still an issue?

fzurita commented 8 years ago

This sounds a lot like the color format issue I was having on Android. When hardcoding the color format of glReadPixels to GL_RGB, this game went from 14 fps to 41 fps with async copy color buffer to RDRAM.

theboy181 commented 8 years ago

Closing, as I bought a GTX1080 and believe that the issue was related to AMD

gonetz commented 8 years ago

GTX1080? Not bad. If you don't need your old poor AMD card anymore, send it to me :)

theboy181 commented 8 years ago

If only I would have known! I sold it to a buddy.