TheBrigandier opened this issue 5 years ago
You can already stream the internal core framebuffer (1x, no shaders) directly to Twitch from RA itself; OBS isn't needed.
If you really wanted to keep the CRT shader intact (assuming the client-side viewer isn't introducing problems of its own), you would need lossless compression: the 2D block-based averaging used by traditional video codecs is exactly what destroys the scanline effects.
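To make "lossless" concrete, here's a minimal sketch (an illustration of the idea, not RetroArch's actual pipeline; the input file and exact flags are assumptions) using x264's lossless RGB mode:

```
# Hypothetical sketch: re-encode some captured RGB video losslessly so
# scanlines survive instead of being averaged away by lossy block encoding.
# libx264rgb with -qp 0 is mathematically lossless; expect large files.
ffmpeg -i input.mkv -c:v libx264rgb -qp 0 -preset ultrafast out_lossless.mkv
```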
@bparker06,
I am aware you can stream directly to Twitch, but many people want to use OBS or other software for richer streaming features (layout templates, webcam, split timers, etc.).
The goal is to keep scanlines just for the person playing while providing a 1:1 pixel-perfect output, without scanlines, for the streaming software to use. Scaling the game to fit a streaming template (or viewers watching at different sizes depending on browser window size or zoom) causes scanline moiré effects. This would give people a clean output to capture while keeping the effects they prefer for play.
The closest I have come to this is having RetroArch stream to ffplay, but the latency and overhead of encoding to lossless video just to get it into another small window seems silly.
Stream the raw framebuffer via UDP, then feed that into anything you want.
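For instance, something along these lines (a sketch; 56400 is the default local streaming port mentioned later in this thread, and the low-latency flags are ordinary ffplay/ffmpeg options, not a RetroArch-documented recipe):

```
# Sketch: view RetroArch's local UDP stream with buffering kept to a minimum.
ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 \
       udp://127.0.0.1:56400
```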
What radius said.
Do you mean the Stream to Local option on UDP port 56400? Going that route, the output lags a couple of seconds behind the gameplay, and there's no lossless option there. Custom is available, but to get it to load a .cfg I have to define it in retroarch.cfg (the user interface won't see it). Do you happen to know something I am missing here, or have an ffmpeg config to share that gets the speed and quality up? By comparison, capturing the gameplay window into OBS gives the streaming software essentially instant feedback and doesn't require fussing with delaying all your other input streams (webcam, etc.) to match the RetroArch output; this method feels really duct-taped together.
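For reference, the kind of custom config I mean looks roughly like this (a hypothetical sketch modeled on RetroArch's bundled record presets; the key names, values, and the retroarch.cfg option are assumptions to verify against your version):

```
# custom_lossless.cfg -- hypothetical recording/streaming config
vcodec = libx264rgb   # RGB x264, avoids the chroma subsampling that blurs scanlines
video_qp = 0          # QP 0 = lossless
acodec = flac         # lossless audio
format = matroska
threads = 3

# Then point retroarch.cfg at it (key name assumed):
# video_record_config = "/path/to/custom_lossless.cfg"
```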
> Custom is available, but to get it to load a .cfg I have to define it in retroarch.cfg (the user interface won't see it).

This was fixed already.
I would have posted this on the libretro forums if not for the crazy number of hoops to jump through just to post a thread... maybe someone here can duplicate it there?
I love playing games in RetroArch with CRT Royale; however, the output looks iffy when streamed, depending on the viewer's resolution. I'd like to post a $50 bounty for a separate 1:1-resolution window off to the side with zero shader effects, for loading into OBS and streaming to services like Twitch. Ideally it would only appear and output video while a game is playing, hiding the RetroArch interface from viewers as well.
I've tried several workarounds, such as streaming to ffplay or having ffmpeg dump frames. Maybe I am doing something wrong, but both of these methods seem to add latency, encoding load on the system, and so on. A multi-window option seems like it would be a much more elegant solution.
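For concreteness, the frame-dump variant looks roughly like this (a reconstruction, not the exact command; the stream source and paths are assumptions):

```
# Sketch: decode the local UDP stream and dump every frame as a PNG
# (assumes frames/ exists). The encode/decode round trip is exactly the
# extra latency and CPU load described above.
ffmpeg -i udp://127.0.0.1:56400 -f image2 frames/frame_%06d.png
```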
Thanks!