Closed: bearoso closed this issue 6 years ago.
Add a menu option to enable/disable BlockInvalidVRAMAccess, maybe? If so, add a note that it's recommended on, but disabling it may fix issues in some games.
(Windows port)
Obviously fixing the games would be the ideal solution, but I’d prefer just adding the games to a hack list before making that flag visible.
OK, so add Kick Off to a hack list. Another thought would be to move the FPS display to the black border when rendering 4:3 at a widescreen resolution.
Just throwing ideas out here, not demanding. I'm already super happy with Snes9x and appreciate having this emulator.
And maybe add a static display, or some sort of logo or animation, when no ROM is loaded, like puNES, ZSNES, and Kega Fusion do, if you get what I mean.
These are all little suggestions that obviously aren't needed, but I'm giving ideas in case they're wanted.
I'll try to find some time this or next weekend to give the glsl shaders a shot.
For the refresh rate: you want to query this and then set up the input rate accordingly? I'd still want to expose the input rate, since it might be running in auto mode without vsync. So maybe an auto-adjust button that queries and sets the slider? Also shouldn't the dynamic rate control be ok in most cases?
I'm not sure about the superfx overclock setting. I guess I don't mind putting a gui setting somewhere.
Are there any plans to get in sync with the libretro repository again? It appears the last attempt was made here https://github.com/snes9xgit/snes9x/pull/60, but it never went anywhere.
I ported some of the changes for 1.55. I'll see if there have been any since then that make sense for us - I'm not sure about fast savestates and hard audio disable, for example.
Is there an option to hide the mouse pointer when using a mouse as a gun device? I'll check this when I get home. That's always handy in emulators when using a modern LCD gun. I own a Top Gun II but have never tried it in Snes9x yet. I've spent most of my time in Snes9x playing Mario World, All-Stars, Yoshi's Island, Final Fight, Arkanoid, and R-Type lately, among others, but only with my SNES USB pads. For the longest time I was a ZSNES user, many years ago, then jumped to Snes9x, then SNESGT, then Snes9x, then bsnes, and now I'm back to Snes9x. I find the current dev build of Snes9x for Windows to be my favorite.
> For the refresh rate: you want to query this and then set up the input rate accordingly? I'd still want to expose the input rate, since it might be running in auto mode without vsync. So maybe an auto-adjust button that queries and sets the slider? Also shouldn't the dynamic rate control be ok in most cases?
What I'm doing now with the Gtk+ port is using a checkbox, on by default, that sets things automatically and adjusts the slider accordingly. Unchecking it lets the user change it at will. I had an auto-adjust button there instead, but I wasn't sure the user would know to hit it when first configuring the emulator.
There was a thread on the libretro forums about the dynamic rate control being audible, and the samples provided showed I was susceptible to it. Also, it won't work very well with large variations, so making sure it's centered properly helps.
> I'm not sure about the superfx overclock setting. I guess I don't mind putting a gui setting somewhere.
I noticed when debugging Doom that our default timings were too slow, but it also seemed correct for Star Fox. Previously I had noticed Yoshi's Island 2 running too slowly. Because some games may run too fast and others may run too slowly with our SuperFX code, I figured I'd let the user configure it. It's up to you.
@OV2, let me know if you start working on something. Since I've gotten my feet wet working with Windows, I can jump in and do some stuff like the input rate thing.
> And maybe add a static display, or some sort of logo or animation, when no ROM is loaded, like puNES, ZSNES, and Kega Fusion do, if you get what I mean.
You should see the funny thing I did with the Gtk+ port--a static logo with a bunch of patterns to test filters on.
I'll start with the shaders tomorrow - if you want to give the input rate a shot be my guest :)
Hi, I didn't get 1.55, only 1.54, because... no will to do it :P, and I saw something strange:
Why two changelogs, .rtf and .txt? I mean, only one is enough (a simple .txt file like other emulators have would do), but the .txt changelog is a mess to read while the .rtf is not. Dropping one and prettifying the other would be a simple solution for this. ;)
Geez, I thought I was picky.
I've made a first commit that includes the glsl shaders, although it's not really working at the moment. One problem is the POT texture used in the win port for the main output. Since regular and hi-res can have different scaling I'd have to change the texture size on the fly, I'll have to think about that some more. Besides that the shaders don't seem to affect the image at all, even though all gl functions succeed. I'll get back on it later.
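For context, the POT restriction means the backing texture has to be rounded up to a power of two whenever the output size changes. A minimal sketch of the usual rounding helper (illustrative only, not the actual Snes9x code):

```cpp
#include <cstdint>

// Round v up to the next power of two (v must be > 0). Standard
// bit-twiddling: smear the highest set bit downward, then add one.
static uint32_t next_pow2(uint32_t v)
{
    v--;
    v |= v >> 1;
    v |= v >> 2;
    v |= v >> 4;
    v |= v >> 8;
    v |= v >> 16;
    return v + 1;
}
```

With a helper like this, a 512x448 hi-res frame fits in a 512x512 texture, but a 2x-scaled pass would force a reallocation to 1024x1024, which is why changing the size on the fly is a pain.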
It looks like you're still calling DrawArrays after you call the glsl render function, so maybe it's overwriting the image? I had the glsl render process use the framebuffer as output for the last pass to avoid that extra viewport-sized copy at the end, which differs from how you had Cg doing it and is why the viewport is passed to the function.
I put a glLogErrors function in shader_helpers that you can call somewhere during shader init and it'll log all the normally hidden OpenGL stuff to stderr. Could be helpful.
Thanks, that was the problem. I'm already using glIntercept to get ogl errors, but the logging might be useful in the future.
One issue with the current implementation: you can't change the viewport depending on the output size of the last pass. For example when using the plain hq2x shader and you want it centered in the window instead of stretched. Any ideas? A viewport callback maybe?
I haven't switched the Cg code yet since you mentioned you might drop it.
I didn't quite understand what you were doing with the viewport in the Cg code. A viewport callback or port function might be necessary. I've already got something in the form of S9xApplyAspect in the GTK+ port, and you've got CalculateDisplayRect. I'd prefer something a little more portable than either, though. I think a port function something like
S9xCalculateDisplayViewport(int width, int height, int *outx, int *outy, int *outwidth, int *outheight)
with pointers instead of references would be the most versatile.
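As a sketch of what an implementation of that port function might look like, assuming a hardcoded 4:3 target aspect (a real port would consult the user's stretch/aspect settings instead):

```cpp
// Hypothetical implementation of the proposed port function: fit the
// largest centered 4:3 rectangle into a window of the given size.
// The 4:3 assumption is illustrative, not part of the proposal.
static void S9xCalculateDisplayViewport(int width, int height,
                                        int *outx, int *outy,
                                        int *outwidth, int *outheight)
{
    int w = width;
    int h = width * 3 / 4;      // height implied by a 4:3 aspect
    if (h > height) {           // too tall: fit to height instead
        h = height;
        w = height * 4 / 3;
    }
    *outx = (width - w) / 2;    // center horizontally
    *outy = (height - h) / 2;   // center vertically
    *outwidth = w;
    *outheight = h;
}
```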
I should point out that the Retroarch people added a bunch of stuff to the Cg shader spec since you added support. There's stuff like passprev, aliases, mipmapping, and a bunch of other things used by the common shader repository. I started to add those, but realized instead of trying to fill out our Cg support, we'd probably be better off putting work towards moving to the future-proof slang format.
Since the PAL version of Secret of Evermore (plus Super Punch-Out!!?), a major title for the machine, got fixed and should be flawlessly playable now, why not bump the version to 1.60? This would fetch more attention as well.
I simply cannot skip 56. https://youtube.com/watch?v=Qs8kDiOwPBA
Damn, now I have to re-watch futurama!
Not only are there fixes for those major titles, this release also fixes Speedy Gonzales, one of byuu's go-to examples for comparing less accurate emulators against higan. Between that and the complete cheats overhaul to a completely new data format, I kind of agree with the idea of 1.6...
Eh, Speedy Gonzales is an interesting edge case, but an inaccurate emulator can definitely get away with it. There obviously wasn’t any real accuracy improvement other than for a very specific situation. This just means byuu needs a different example.
Spoken like a true engineer with no eye for marketing. In addition to the other points, we also have that snazzy new icon...
> One issue with the current implementation: you can't change the viewport depending on the output size of the last pass. For example when using the plain hq2x shader and you want it centered in the window instead of stretched. Any ideas? A viewport callback maybe?
I added a viewport callback to the GLSL render function. You still have to pass in what you think is the final viewport, because some shaders use that in interim stages. The viewport callback parameters contain both the rendered image size and the viewport parameters it used for rendering those interim passes. You set the output parameters to your desired glViewport params.
Also, in case it's confusing: make sure you check the parameters passed into render. I switched the order of viewport x, y and width, height.
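To illustrate the kind of callback described above (names and exact parameter order are illustrative, not the actual API), a centering callback for an unscaled-output case like plain hq2x might look like:

```cpp
// Illustrative callback: given the rendered image size and the viewport
// the renderer used for interim passes, report back the glViewport
// parameters we actually want: the image at 1:1 scale, centered.
static void center_viewport(int src_width, int src_height,
                            int viewport_x, int viewport_y,
                            int viewport_width, int viewport_height,
                            int *out_x, int *out_y,
                            int *out_width, int *out_height)
{
    *out_width  = src_width;
    *out_height = src_height;
    *out_x = viewport_x + (viewport_width  - src_width)  / 2;
    *out_y = viewport_y + (viewport_height - src_height) / 2;
}
```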
I've changed the OGL code to use NPOT textures if available and used the viewport callback, looks good now. I'm still working on the parameter dialog, but that shouldn't be too hard.
Regarding the new Cg stuff: I saw that a while ago, but since the spec was never updated and it seemed somewhat fluid, I never added anything. I'd like to keep it around so that there's at least something for D3D, but I'm not sure it's worth the trouble updating, since many of the new changes didn't even work in RA's D3D mode.
I've added a first version of a glsl parameter settings dialog that seems to work fine.
Have you decided if you want to keep cg in? I wouldn't mind switching to a central implementation.
Call 1.56 "1.56ish" instead, like this subject.
> I've added a first version of a glsl parameter settings dialog that seems to work fine.
> Have you decided if you want to keep cg in? I wouldn't mind switching to a central implementation.
I'll drop Cg. Our current implementation doesn't use those weird features that Cg can do and GLSL can't, I'd rather just do slang for those, and I don't need to keep Cg for DirectX. Plus, I made a mess out of your code, and it would be an effort to make it portable.
> Call 1.56 "1.56ish" instead, like this subject.
Can't say I didn't think about it.
Tested the latest AppVeyor build 134; working great. Decided to download some libretro GLSL shaders, and they seem to work.
I'm assuming this is what you guys are working on, right?
New logo? With purple buttons to match the new icon?
https://drive.google.com/file/d/1TUqdmd8HjSffGi6fTMGCernxCfaGpc85/view?usp=sharing
Maybe a crazy idea, but what if you added the ability to let a user switch to a libretro core for a specific ROM if they wanted?
For example: use Snes9x, and if a specific title worked better on, let's say, the bsnes core, the user could choose to load and switch to that core.
Not something I really want, but just a suggestion if you guys are looking for ideas to add to a next major build.
Kind of how BizHawk lets you switch between Snes9x and bsnes.
Again, just a thought as a cool feature; I don't know if anyone would really care for it.
Also, the custom open-ROM menu: make it bigger, and make it possible for a user to define a static ROM directory that always remains static.
Good for installs for non-savvy users who might "lose" their ROM directory.
Have you ever heard of a phrase called "feature creep?"
So with all these recent shader changes, what does this exactly mean? ELI5 me on it if you can. I hope we don't lose the current shaders that are already in Snes9x, I happen to like a lot of them and there is basically no setup required.
It means we additionally support .glsl and .glslp shader files. Nothing has been removed.
Cool, an addition is always good. I was a little worried that the old shaders were going away in favor of newer ones.
They're the same shaders in a different format. In fact, they're a little more compatible because the Cg format was secretly updated several times and we never noticed.
At this point what is there left to do for 1.56?
I noticed my IRQ changes broke Battle Blaze's title screen. I'll have to look into that.
I'll add the fx overclock setting, and I guess we should see if someone is willing to compile the osx port again.
And you'll have to decide how the release should be named ;)
Battle Blaze enters an IRQ handler and reads 4211 to clear the IRQ line, but then sets another IRQ timer that fires during that handler and pulls the IRQ line low again, so that it triggers instantly upon leaving. We can allow it to pull the IRQ line while the handler is active, but that breaks Marko's Magic Football. Shrug.
Mortal Kombat 2 also seems broken: the character models randomly break up for a short time. Might be a similar issue.
I've pushed a fix for Battle Blaze. See if this fixes Mortal Kombat 2.
I suspect Marko's is expecting the timer to be blocked while the IRQ runs, but somehow resetting $4200 allows the timer to trigger again for Battle Blaze?
Unfortunately it didn't change anything for MK2.
IIRC from the higan sources, the IRQ line is held for 4 cycles when it triggers, and blocked for a while after certain events (e.g. writing 4200), but I can't remember for how long.
Did Mortal Kombat 2 have problems before? From what I found by byuu a loooong time ago, MK2 is very "picky." I'd have to analyze it.
Marko isn't a problem with IRQ timing, AFAICT. What SimpleTease/sluffy say in issue #231 is true. It reads from 4211 with a data bank of 7F during the interrupt handler, which is WRAM. I'll have to look at where that 7F is coming from.
Yeah, Mortal Kombat is exactly like the problem byuu had back in 2007: http://forums.nesdev.com/viewtopic.php?t=2932
MK2 worked previously because of the delay in CheckInterrupts:

```cpp
static inline void S9xCheckInterrupts (void)
{
    bool8 thisIRQ = PPU.HTimerEnabled | PPU.VTimerEnabled;

    if (CPU.IRQLine & thisIRQ)
        CPU.IRQTransition = TRUE;
    // ...
}
```

If I add something similar to the new code, MK2 works again, but then Battle Blaze exhibits its problem.
Nah, I fixed it. It's exactly like byuu's problem. WAI needed to be more granular.
With all the changes lately, it's getting time for a new release. We need to come up with a list of TODOs to accomplish before it's ready.
@OV2, you mentioned getting the GLSL shader support into Windows. I might drop the Cg support from Gtk+ since I did it so haphazardly. I've been looking at the new slang format, but that can wait until the next release.
I'd like to get support for reading the monitor's true refresh rate into the Windows port so we don't have to rely on the user to set the input rate or even know what it is. For reference, I added something like this to Retroarch a while back using QueryDisplayConfig. In Gtk+ I've got it set by default to just poll the monitor and not need input rate adjustment.
I did add a GUI config to the Gtk+ port to adjust the SuperFX overclock. I don't know if you want to expose that to Windows users.