mrRay / vvopensource

OSC and MIDI frameworks for OS X and iOS, a framework for managing and rendering to GL textures in OS X, and a functional ISF (Interactive Shader Format) implementation for OS X.

HAP from AVFoundation #25

Closed - pixlwave closed this issue 5 years ago

pixlwave commented 7 years ago

Hi

Looking through the source, VVBufferPool.allocTexRangeForPlane:ofHapCVImageBuffer: is limited to 32-bit only. Looks like a big change to make (I've tried and failed!) but how much needs to be done to open this up to 64-bit apps using AVFoundation? I'm looking to utilise ISF to run the CoCgYToRGBA shader to work with Hap Q. Am I barking up the wrong tree here?

Many thanks

Doug (on a renamed account from digitalfx 🙂)

vade commented 7 years ago

VDMX is a 64-bit app and, to my limited knowledge, uses this code base. Are you maybe mixing up a 32-bit value in a 64-bit app, i.e. a casting issue?

vade commented 7 years ago

Sorry if that sounded condescending - I didn't mean to come across that way. Is there a specific compilation issue you've run across?

pixlwave commented 7 years ago

Hey, no it didn't - I probably didn't explain myself very well. Take a look here: https://github.com/mrRay/vvopensource/blob/master/VVBufferPool/VVBufferPool.h#L230 These three methods are wrapped in a "not a 64-bit platform" compilation check (presumably an #if !__LP64__ guard), so they're not exposed to a 64-bit app - Xcode completion isn't even suggesting them to me.

vade commented 7 years ago

Oh jeez, that's obvious in retrospect - my bad! Haha. My gut (and to be clear, uninformed - I'm in no way in the absolute "know" here): I suspect that prior to a solution for HAP working in AVFoundation, only 32-bit QuickTime could vend buffers without decoding them (i.e., the raw HAP samples) correctly, and this is a hold-over? There should be no known reason HAP in 64-bit can't work - as per VDMX! I'll shut up and let @mrRay explain in detail haha.

pixlwave commented 7 years ago

Haha, thanks for responding quickly though, it's always appreciated 👍. After a bit more digging, I'm starting to think I should be using these anyway: https://github.com/mrRay/vvopensource/blob/master/VVBufferPool/SampleVVBufferPoolAdditions.h#L64

vade commented 7 years ago

That makes sense haha.

mrRay commented 7 years ago

"Looking through the source, VVBufferPool.allocTexRangeForPlane:ofHapCVImageBuffer: is limited to 32-bit only."

Yes, that's right - I think I used that back when I was working with QuickTime, which uses CVImageBuffers, so it's 32-bit only...

"I'm looking to utilise ISF to run the CoCgYToRGBA shader to work with Hap Q. Am I barking up the wrong tree here?"

Not at all - I use an ISF to do this conversion, too.

"After a bit more digging, I'm starting to think I should be using these anyway: https://github.com/mrRay/vvopensource/blob/master/VVBufferPool/SampleVVBufferPoolAdditions.h#L64"

Yep, that's what I'm using!

pixlwave commented 7 years ago

Thanks, and sorry for being slow! I've now got as far as setting up a VVBuffer with those additions, but I'm just getting a black buffer. As always I'm doing this in Swift, but am I missing any obvious logic for a regular Hap video in this example: https://github.com/pixlwave/Hap-AVF-Syphon/blob/4e80bd21714891d2966ead194fec39642f31b406/Hap-AVF-Syphon/ViewController.swift

mrRay commented 7 years ago

No apology necessary - I should probably make a sample app demonstrating integration of the two frameworks. There are undoubtedly a number of different ways to do this (at the end of the day, all you really have to do is upload the DXT data from your HapDecoderFrame into a GL texture), but here's how I'm doing it:

Some pseudo-code:

//  make the "alloc frame" block: this takes a CMSampleBufferRef and returns a HapDecoderFrame that has been configured and is ready to be decompressed into
[hapOutput setAllocFrameBlock:^(CMSampleBufferRef decompressMe) {
    //  make an empty decoder frame from the buffer (the basic fields describing the data properties of the DXT frame are populated, but no memory is allocated to decompress the DXT into)
    HapDecoderFrame     *returnMe = [[HapDecoderFrame alloc] initEmptyWithHapSampleBuffer:decompressMe];
    //  make a CPU-backed/tex range VVBuffer for each plane in the decoder frame
    NSArray             *bufferArray = [_globalVVBufferPool createBuffersForHapDecoderFrame:returnMe];
    //  populate the hap decoder frame i'll be returning with the CPU-based memory from the buffers, and ensure that the decoder will retain the buffers (this has to be done for each plane in the frame)
    void            **dxtDatas = [returnMe dxtDatas];
    size_t          *dxtDataSizes = [returnMe dxtDataSizes];
    NSInteger       tmpIndex = 0;
    for (VVBuffer *buffer in bufferArray)   {
        dxtDatas[tmpIndex] = [buffer cpuBackingPtr];
        dxtDataSizes[tmpIndex] = VVBufferDescriptorCalculateCPUBackingForSize([buffer descriptorPtr],[buffer backingSize]);
        ++tmpIndex;
    }
    //  add the array of buffers to the frame's userInfo- we want the frame to retain the array of buffers...
    [returnMe setUserInfo:bufferArray];
    return returnMe;
}];
//  make the post-decode frame block: tell the buffers from the decoded frame that their backing has been updated
[hapOutput setPostDecodeBlock:^(HapDecoderFrame *decodedFrame)  {
    NSArray     *buffers = [decodedFrame userInfo];
    for (VVBuffer *buffer in buffers)   {
        [VVBufferPool pushTexRangeBufferRAMtoVRAM:buffer usingContext:<a CGLContextObj>]
    }
    //  ...at this point, the VVBuffer instances in "buffers" have the images that you want to work with- do whatever you want with 'em!
}];
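
...and since you're working in Swift, here's a rough sketch of how those two blocks might translate. I haven't compiled this - the bridged names (HapDecoderFrame(emptyWithHapSampleBuffer:), createBuffers(forHapDecoderFrame:), pushTexRangeBufferRAMtoVRAM(_:using:)) are guesses at how the Objective-C headers import, and hapOutput/glContext are assumed to exist in your project - so double-check everything against your own bridging header:

```swift
import AVFoundation
// HapInAVFoundation + VVBufferPool are assumed to come in via a bridging header.

// hapOutput is your Hap DXT output object; glContext is a CGLContextObj.
hapOutput.allocFrameBlock = { decompressMe in
    // Empty decoder frame: the DXT properties are populated, but no memory
    // has been allocated to decompress into yet.
    let frame = HapDecoderFrame(emptyWithHapSampleBuffer: decompressMe)!
    // One CPU-backed/tex-range VVBuffer per plane in the frame.
    let buffers = _globalVVBufferPool.createBuffers(forHapDecoderFrame: frame) as! [VVBuffer]

    // Point each plane's dxtData/dxtDataSize at its buffer's CPU backing.
    let dxtDatas = frame.dxtDatas()
    let dxtDataSizes = frame.dxtDataSizes()
    for (index, buffer) in buffers.enumerated() {
        dxtDatas?[index] = buffer.cpuBackingPtr()
        dxtDataSizes?[index] = VVBufferDescriptorCalculateCPUBackingForSize(buffer.descriptorPtr(), buffer.backingSize())
    }

    // userInfo retains the buffers for the lifetime of the frame.
    frame.userInfo = buffers
    return frame
}

hapOutput.postDecodeBlock = { decodedFrame in
    // The DXT data now sits in each buffer's CPU backing; push it to VRAM.
    for buffer in decodedFrame.userInfo as! [VVBuffer] {
        VVBufferPool.pushTexRangeBufferRAMtoVRAM(buffer, using: glContext)
    }
}
```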
pixlwave commented 7 years ago

Thanks, I think I've got my head round this now; I'll see where I can get to 🙂

pixlwave commented 7 years ago

So I never got very far with this and am just revisiting it. I guess I'm not retaining something I need to, as I get the following error when I enable zombie objects in my build scheme's diagnostics:

[HapDecoderFrame containsTime:]: message sent to deallocated instance

My (very direct) translation of the pseudo-code is here if anyone is willing to help a bit 🙂 https://github.com/pixlwave/Hap-AVF-Syphon/blob/9eff36bf335345167f486d333ac62136dcab909c/Hap-AVF-Syphon/ViewController.swift

mrRay commented 7 years ago

Howdy-

I thought a sample app might shed some light on things, so I added one - just build & run "HapInAVF Test App" in this repo:

https://github.com/mrRay/vvopensource/tree/master/HapInAVF%20Test%20App

...hope this helps!

pixlwave commented 5 years ago

A very delayed thanks - I finally got the Swift code decoding frames to ISF by following that sample app. It's a bit of a hacky result, as I'm simply holding onto the last two HapDecoderFrames in an array to prevent deallocation, but it works 😃
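
Roughly, the workaround looks like this (the names here are illustrative - the actual code is in my Hap-AVF-Syphon repo linked above):

```swift
// Illustrative sketch of the workaround: hold strong references to the last
// two HapDecoderFrames so ARC can't deallocate one while the decoder is
// still messaging it (the source of the containsTime: zombie).
var recentFrames = [HapDecoderFrame]()

func noteDecoded(_ frame: HapDecoderFrame) {
    recentFrames.append(frame)
    // Drop anything older than the last two frames; they get released here,
    // safely after the decoder has finished with them.
    while recentFrames.count > 2 {
        recentFrames.removeFirst()
    }
}
```

Calling noteDecoded(_:) from the post-decode block keeps each frame alive until the next couple of frames have come through.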