Open vade opened 10 months ago
It might make sense to make a separate server that consumes CVPixelBuffers or IOSurfaces directly, rather than the base server, I admit. I was sketching in code and wanted to keep my changes as minimal as possible.
Curious your thoughts @bangnoise <3
I'm not completely against this... but...
Instantiating an IOSurface is not free, and instantiating Metal textures from them is not cheap, so as-is this is going to have a severe negative impact on clients - it might even be the case that a server blitting to Syphon's single IOSurface is faster for clients as they are now (I haven't tested). If the only motivation for this is convenience then the cost is too high.
It would need client support to cache surfaces, and for subclasses to cache their resources (eg Metal textures), and a way to retire them, probably with a (configurable) timeout as well as an explicit retire method in the server. Everything except the explicit retire could be done without changing any IPC, just the way clients behave. However existing clients would be penalised if we built this to work with them, so we might want to carefully consider that impact.
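To make the caching idea concrete, here is a minimal sketch of what a client-side resource cache with both an explicit retire and a timeout-based eviction could look like. Everything here is hypothetical: `SurfaceResourceCache`, its method names, and the use of a plain `UInt32` standing in for an `IOSurfaceID` are my assumptions, not existing Syphon API.

```swift
import Foundation

// Illustrative client-side cache keyed by surface ID, holding whatever
// per-surface resource a subclass creates (e.g. a Metal texture).
final class SurfaceResourceCache<Resource> {
    private var entries: [UInt32: (resource: Resource, lastUsed: Date)] = [:]
    let timeout: TimeInterval

    init(timeout: TimeInterval = 1.0) {
        self.timeout = timeout
    }

    // Returns the cached resource, creating it on first sight of the surface.
    func resource(for surfaceID: UInt32, create: () -> Resource) -> Resource {
        let now = Date()
        if let entry = entries[surfaceID] {
            entries[surfaceID] = (entry.resource, now)
            return entry.resource
        }
        let resource = create()
        entries[surfaceID] = (resource, now)
        return resource
    }

    // Explicit retire, as a server-driven message might trigger.
    func retire(_ surfaceID: UInt32) {
        entries.removeValue(forKey: surfaceID)
    }

    // Timeout-based eviction, run periodically by the client.
    func evictStale(now: Date = Date()) {
        entries = entries.filter { now.timeIntervalSince($0.value.lastUsed) < timeout }
    }

    var count: Int { entries.count }
}
```

Since nothing here touches IPC, a scheme like this could ship purely as client behaviour, with the explicit retire message added later.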
Purely on the PR as it is now, `SyphonServerBase` would need a way to tell subclasses the surface has been invalidated in `-publishSurface:`, like `SyphonClientBase` does with `-invalidateFrame`.
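For illustration, such a hook could look something like the following. All names below are hypothetical stand-ins, not existing Syphon classes, and `AnyObject` stands in for a Metal texture:

```swift
import Foundation

// The base server would call the hook when the surface it previously
// published is no longer valid; subclasses drop derived resources there.
class PublishingServer {
    // Mirrors the role SyphonClientBase's -invalidateFrame plays on the
    // client side, but for server subclasses.
    func surfaceDidInvalidate() { }
}

class MetalPublishingServer: PublishingServer {
    // Stand-in for an MTLTexture created from the published IOSurface.
    private(set) var cachedTexture: AnyObject?

    func cache(texture: AnyObject) { cachedTexture = texture }

    override func surfaceDidInvalidate() {
        // The IOSurface backing this texture is gone; discard it.
        cachedTexture = nil
    }
}
```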
So, what I'm doing here is leveraging IOSurface-backed video playback from AVFoundation by requesting global surfaces in an `AVPlayerItemVideoOutput`, like so:

```swift
let playerItemVideoOutput = AVPlayerItemVideoOutput(outputSettings: [
    String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA,
    String(kCVPixelBufferMetalCompatibilityKey): true,
    String(kCVPixelBufferIOSurfacePropertiesKey): [String(kIOSurfaceIsGlobal): true]
])
```

And then in a `CVDisplayLink` callback, just:

```swift
func displayLinkCallback(displayLink: CVDisplayLink) {
    var time = CVTimeStamp()
    CVDisplayLinkGetCurrentTime(displayLink, &time)
    let itemTime = self.playerItemVideoOutput.itemTime(for: time)
    if self.playerItemVideoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
        var actualOutputTime = CMTime.invalid
        guard let pixelBuffer = self.playerItemVideoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                                          itemTimeForDisplay: &actualOutputTime) else {
            return
        }
        // The pixel buffer is IOSurface backed, so its surface can be
        // published directly. The buffer keeps the surface alive for the
        // duration of the call, so no extra retain / release is needed.
        guard let surface = CVPixelBufferGetIOSurface(pixelBuffer) else { return }
        self.syphonServer?.publishSurface(surface.takeUnretainedValue())
    }
}
```
The client is caching in the sense that the internal pool is being re-used and surfaces are being recycled, but yes, each frame does require a publish.
From my limited testing this seems fairly fast and lightweight so far, and at a surface level I see performance on par with existing clients / servers.
One nuance I forgot about, which this PR does not address but which doing an internal blit would: flipping.
Today there's no flag to tell a client the image is flipped, and since we consume a surface 'as is' there is no opportunity to correct for it.
So clearly this needs some more thought :)
I think, in theory, this might be better served with a dedicated `CVPixelBuffer` / `CVImageBuffer` server?
The API is more useful to non-expert users, and in theory it could sit on top of the Metal server infrastructure, handle flipping, etc.?
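As a rough sketch of that idea (macOS-only and untested here; `CVPixelBufferServer` and its method names are my invention, while `SyphonMetalServer` and its `publishFrameTexture` call are the existing Metal server API): a wrapper could run each pixel buffer through a `CVMetalTextureCache`, so textures wrapping recycled buffers are themselves reused, and lean on the Metal server's internal blit to handle flipping.

```swift
import Foundation
import CoreVideo
import Metal
import Syphon

final class CVPixelBufferServer {
    private let server: SyphonMetalServer
    private let queue: MTLCommandQueue
    private var textureCache: CVMetalTextureCache?

    init?(name: String, device: MTLDevice) {
        guard let queue = device.makeCommandQueue() else { return nil }
        self.queue = queue
        self.server = SyphonMetalServer(name: name, device: device, options: nil)
        // One cache for the lifetime of the server, so the MTLTextures
        // wrapping recycled pixel buffers are themselves recycled.
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    }

    func publish(_ pixelBuffer: CVPixelBuffer, flipped: Bool) {
        guard let cache = textureCache else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer,
                                                  nil, .bgra8Unorm, width, height, 0, &cvTexture)
        guard let cvTexture = cvTexture,
              let texture = CVMetalTextureGetTexture(cvTexture),
              let commandBuffer = queue.makeCommandBuffer() else { return }
        // The Metal server blits internally, which gives us a place to
        // correct orientation via the flipped flag.
        server.publishFrameTexture(texture, on: commandBuffer,
                                   imageRegion: NSRect(x: 0, y: 0, width: width, height: height),
                                   flipped: flipped)
        commandBuffer.commit()
    }
}
```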
Thinking aloud here...
This is a WIP / Proof of concept PR
This implements direct publishing of an IOSurface through the `SyphonServerBase` class:

```objc
- (void)publishSurface:(IOSurfaceRef)surface;
```
You must meet the following requirements EXTERNAL to this new API.
This does not currently enforce any of those safety guards, but they could be added.
Internally this abuses publish: every time the `publishSurface:` method is called we re-broadcast the new incoming IOSurface's ID to clients. This is clearly suboptimal, but it's useful in some situations:
This also addresses issue #67