SuperFlyTV / sofie-demo-blueprints

https://superflytv.github.io/sofie-demo-blueprints/

Support OBS as video production engine #29

Open jstarpl opened 2 years ago

jstarpl commented 2 years ago

OBS is a popular software video mixer that allows running reasonably complex video productions on common, off-the-shelf computer systems. Sofie TSR will support OBS starting with Release 37. It would be great to allow users of the Spreadsheet blueprints to easily change between the current ATEM+CasparCG setup and a pure OBS setup.

Timeline State Resolver types documentation: https://nrkno.github.io/tv-automation-state-timeline-resolver/modules/timeline_state_resolver_types.html

jstarpl commented 2 years ago

Hi there! Sorry to have kept you waiting. Yeah, I guess going with the current development version would be the most future-proof. So essentially, all the updateSourceLayerHotkeys would need to go, and they would have to be replaced with new migrations setting up Action Triggers for those AdLibs, clearing source layers, sticky AdLibs and all that. I wouldn't worry about that in the beginning though: while triggering AdLibs using hotkeys is important, it's not the biggest problem here, and can certainly be solved at the very end of the implementation. The first thing to set up would be a switch in the blueprint settings (in the Studio, I guess, since this is a hardware-related switch) to choose between using an ATEM and using OBS. You'll also probably need some sort of config table to select which scenes to use in OBS for which Parts.
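To make that concrete, the Studio-level switch and the scene config table could look something like this. This is only a sketch under assumed names — `VideoMixerType`, `StudioVideoConfig`, `sceneMapping` and `findObsScene` are all illustrative, not the actual Spreadsheet blueprint config shape:

```typescript
// Hypothetical Studio-level switch between ATEM and OBS.
// All names here are illustrative, not the real blueprint config.
type VideoMixerType = 'atem' | 'obs'

interface SceneMappingEntry {
	partType: string // e.g. 'cam', 'vt', 'fullscreen-gfx'
	obsScene: string // the OBS scene name to switch to for this Part type
}

interface StudioVideoConfig {
	videoMixer: VideoMixerType
	// Only relevant when videoMixer === 'obs':
	sceneMapping: SceneMappingEntry[]
}

// Look up which OBS scene a given Part type should use, if any.
function findObsScene(config: StudioVideoConfig, partType: string): string | undefined {
	return config.sceneMapping.find((m) => m.partType === partType)?.obsScene
}
```

The point of the table is just that Part types and OBS scene names are user-configurable rather than hard-coded, so each installation can name its scenes however it likes.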

Then, I think I would go with something simple like the CAM Part adapter and start from there. It uses a createAtemInputTimelineObjects function to create the timeline objects to set up the state of the ATEM. I guess that would have to be replaced with an adapter that would be able to parse the ATEM/OBS field and understand which implementation to use and then switch between createAtemInputTimelineObjects and something new like createOBSSceneTimelineObjects. Then, do the same for all the other part adapters, so VT, full screen gfx, etc.
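The adapter idea above could be sketched roughly like this. Note that `createOBSSceneTimelineObjects` doesn't exist yet, the `TimelineObject` shape is a simplified stand-in for the real TSR timeline object types, and both helper bodies here are dummies — only the dispatch logic is the point:

```typescript
// Simplified stand-in for the real TSR timeline object types.
interface TimelineObject {
	id: string
	layer: string
	content: Record<string, number | string>
}

// Dummy stand-in for the existing ATEM helper (the real one builds TSR ATEM objects).
function createAtemInputTimelineObjects(input: number): TimelineObject[] {
	return [{ id: 'atem_me0', layer: 'atem_me_program', content: { input } }]
}

// Dummy stand-in for a new, hypothetical OBS helper.
function createOBSSceneTimelineObjects(sceneName: string): TimelineObject[] {
	return [{ id: 'obs_scene', layer: 'obs_current_scene', content: { sceneName } }]
}

// The adapter: read the configured mixer type and pick the implementation.
function createVideoMixerTimelineObjects(
	mixer: 'atem' | 'obs',
	source: { atemInput: number; obsScene: string }
): TimelineObject[] {
	return mixer === 'atem'
		? createAtemInputTimelineObjects(source.atemInput)
		: createOBSSceneTimelineObjects(source.obsScene)
}
```

Each Part adapter (CAM, VT, full screen gfx, …) would then call this one dispatcher instead of calling createAtemInputTimelineObjects directly.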

It's probably going to be nice to be able to switch between using CasparCG for graphics (if one wants to) or some sort of generic REST API to talk to, say, NodeCG or something, and render that directly in OBS.

Then, once that's going and the rundown can be played "as-is" (so, no AdLibs), you can look at the global AdLibs and, similarly to the Part adapters, create adapters to switch between various modes of graphics rendering for graphic AdLibs. It's going to be trivial to set up migrations for the new Action Triggers using setTriggeredAction and some helper functions to speed up creating the TriggeredActions objects. The helper could look something like this:

// Assumptions: IBlueprintTriggeredActions, ITranslatableMessage, PlayoutActions,
// TriggerType and the filter-link types come from the blueprints-integration
// package; makeActionTriggerId and the module-level rankCounter are helpers
// defined elsewhere in the blueprints.
export function createAdLibHotkey(
    keys: string,
    sourceLayerIds: SourceLayer[],
    globalAdLib: boolean,
    pick: number,
    tags: string[] | undefined,
    label?: ITranslatableMessage
): IBlueprintTriggeredActions {
    return {
        _id: makeActionTriggerId('adLib', sourceLayerIds.join('_'), !!globalAdLib, pick),
        _rank: rankCounter++ * 1000,
        actions: [
            {
                action: PlayoutActions.adlib,
                filterChain: [
                    {
                        object: 'view',
                    },
                    {
                        object: 'adLib',
                        field: 'sourceLayerId',
                        value: sourceLayerIds,
                    },
                    {
                        object: 'adLib',
                        field: 'global',
                        value: globalAdLib,
                    },
                    !globalAdLib // if not a Global AdLib, trigger only if it's coming from the current segment
                        ? {
                                object: 'adLib',
                                field: 'segment',
                                value: 'current',
                          }
                        : undefined,
                    tags && tags.length > 0
                        ? {
                                object: 'adLib',
                                field: 'tag',
                                value: tags,
                          }
                        : undefined,
                    {
                        object: 'adLib',
                        field: 'pick',
                        value: pick,
                    },
                ].filter(Boolean) as (IRundownPlaylistFilterLink | IGUIContextFilterLink | IAdLibFilterLink)[],
            },
        ],
        triggers: [
            {
                type: TriggerType.hotkey,
                keys: keys,
                up: true,
            },
        ],
        name: label,
    }
}

Variants replicating the "clear" functionality can be created simply by adding the

{
    "object": "adLib",
    "field": "type",
    "value": "clear",
}

filter chain link. All of that will definitely be simpler once you've got a working base, though.
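To show how that "clear" link slots into a chain like the one createAdLibHotkey builds, here's a minimal sketch. `FilterLink` is a simplified stand-in for the real filter-link union types, and `buildClearFilterChain` is a hypothetical helper, not part of the actual API:

```typescript
// Simplified stand-in for the real filter-link types
// (IRundownPlaylistFilterLink | IGUIContextFilterLink | IAdLibFilterLink).
interface FilterLink {
	object: string
	field?: string
	value?: string | string[] | number
}

// Hypothetical "clear" variant of an AdLib filter chain.
function buildClearFilterChain(sourceLayerIds: string[]): FilterLink[] {
	return [
		{ object: 'view' },
		{ object: 'adLib', field: 'sourceLayerId', value: sourceLayerIds },
		// The extra link that narrows the match to "clear" AdLibs:
		{ object: 'adLib', field: 'type', value: 'clear' },
		{ object: 'adLib', field: 'pick', value: 0 },
	]
}
```

The rest of the triggered-action object (triggers, rank, name) would stay the same as in createAdLibHotkey above; only the filter chain changes.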