Hi @yuya-h-29, sorry for the delay. I'll try to answer all of your questions.
> But are there some other ways to add custom images, particles, etc. on the stream?
There are 2 ways to add multiple layers to the stream: either create a mixer slot per layer (like the `logoSlot` in the sample's `MixerViewController`), or composite the layers yourself into a single custom image source before submitting frames to the SDK.
> Also, I'm wondering, if there are 10 different emojis that the user can display, should those emojis be added to mixer slots when the broadcast session is configured?
The mixer can be modified at runtime. You can add/remove slots and bind/unbind devices at any time after the creation of the `IVSBroadcastSession`.
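A minimal sketch of what those runtime calls can look like (the slot and source names are hypothetical, and the exact signatures should be verified against your SDK version):

```swift
import AmazonIVSBroadcast

// Hypothetical example: add an "emoji" slot and bind a custom image
// source to it while `session` (an IVSBroadcastSession) is live.
func addEmojiSlot(to session: IVSBroadcastSession) throws -> IVSCustomImageSource {
    let slot = IVSMixerSlotConfiguration()
    try slot.setName("emoji")   // slot names are set via the throwing setter
    slot.zIndex = 2             // draw above lower slots
    _ = session.mixer.addSlot(slot)

    let emojiSource = session.createImageSource(withName: "emojiSource")
    _ = session.mixer.bindDevice(emojiSource, toSlotWithName: "emoji")
    return emojiSource
}

// Teardown is just as dynamic:
// _ = session.mixer.unbindDevice(emojiSource)
// _ = session.mixer.removeSlot(withName: "emoji")
```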
> But the custom image wasn't added on the stream, so I guess it was not the right way to add some images to the stream.
In that scenario you are only adding your image to the UI hierarchy, but you are not submitting it to the SDK. The `liveCameraView` object does not have anything to do with what the SDK broadcasts; you can still broadcast the camera's feed without any preview, as long as the `CMSampleBuffer`s are still being sent to the SDK.
> So could you point me in the right direction for adding a custom image at the user-tapped location?
We do not have sample code available for this scenario, but dynamically using the mixer APIs (add, remove, bind, unbind) is the correct approach. You would do something like the following:

1. Create a new slot and set its `zIndex` to something greater than your other slots.
2. Create a custom image source and bind it to the new slot.
3. Convert your image (`UIImage`, `CIImage`, `CGImage`, etc.) into a `CVPixelBuffer`, wrap that in a `CMSampleBuffer`, and then submit it to the custom image source (see the sketch after this list).

That should result in the broadcast stream showing your image. In order to see the preview locally, you'll need to use the preview image returned by `IVSBroadcastSession.previewView` instead of the preview for your camera (available on `IVSImageDevice.previewView`). Hopefully that helps some.
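For step 3, a sketch of the `UIImage` → `CVPixelBuffer` → `CMSampleBuffer` conversion might look like this (BGRA assumed, error handling trimmed; `emojiSource` is the custom image source from the earlier sketch):

```swift
import AVFoundation
import UIKit

// Sketch: render a UIImage into a BGRA CVPixelBuffer, then wrap it in a
// CMSampleBuffer suitable for IVSCustomImageSource.onSampleBuffer(_:).
func makeSampleBuffer(from image: UIImage) -> CMSampleBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height

    var pixelBuffer: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, attrs, &pixelBuffer)
    guard let buffer = pixelBuffer else { return nil }

    // Draw the image into the pixel buffer's memory.
    CVPixelBufferLockBaseAddress(buffer, [])
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: width, height: height,
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue)
    context?.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    CVPixelBufferUnlockBaseAddress(buffer, [])

    // Wrap the pixel buffer in a CMSampleBuffer; the timestamp won't matter here.
    var formatDesc: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: buffer,
                                                 formatDescriptionOut: &formatDesc)
    guard let format = formatDesc else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: .zero,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: buffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}

// Hypothetical usage: submit a smiley once to the bound custom image source.
if let sample = makeSampleBuffer(from: UIImage(named: "smiley")!) {
    emojiSource.onSampleBuffer(sample)
}
```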
@bclymer No worries, and thank you so much for answering my questions!! I always appreciate your kind support...🥹
I was able to display my test .png image on the stream and in the camera preview (I used `IVSBroadcastSession.previewView` as you said, thank you!!) by adding a new slot :)
The static .png image was displayed on the broadcast by submitting the `CMSampleBuffer` via the `IVSCustomImageSource.onSampleBuffer` method once. Now I'm testing adding a custom UIView with some animations to the broadcast with the same approach (create a new mixer slot, add a custom source to the mixer, and submit the sample buffer). But since the UIView contains some animations, do I need to call the `IVSCustomImageSource.onSampleBuffer` method whenever a new video frame is written (like in `captureOutput(_:didOutput:from:)`)?
I guess when adding a static image to the broadcast, calling `onSampleBuffer` once is enough, but if it includes some animation, `onSampleBuffer` should be called every frame, am I right?
@yuya-h-29 that is correct, anything dynamic is going to require you to resubmit the sample every time you want the SDK to apply an update. Otherwise I'm happy it's all working for you!
If everything is resolved, feel free to close this issue. Otherwise I'll probably circle back around in a month to close it if there is no further discussion.
@bclymer Gotcha! I'll try resubmitting sample buffers then! Thank you as always!!
@bclymer So sorry, I reopened this issue 😓
Since I figured out how to add a static image to the broadcast, I'm now trying to add an animation to the broadcast. (The animation is like the floating hearts in the Instagram live app, and I'm thinking of creating it with a UIView animation at first.) Since the UIView does have animation, I believe the process would be like:

1. Get the `CMSampleBuffer` from `captureOutput(_:didOutput:from:)`.
2. Render the animated view into the `CMSampleBuffer` (maybe using `CIContext`?).

As you instructed me before, anything dynamic needs its sample resubmitted every time, so I think I need to do something like the above to show the animated view on the broadcast. But the thing is, I'm not sure if this is the way to add an animated view (or even whether using a UIView is the right approach).
So my question is: what approach would you take if you needed to add an animated view (something as simple as a floating-heart animation) to the broadcast?
If I were trying to accomplish that, I would do the following:

1. Create a slot for your camera with a `zIndex` of 0, and then another slot for your heart animation with a `zIndex` of 1.
2. In your `IVSVideoConfiguration`, make sure `enableTransparency` is set to `true`.
3. Size and position the heart slot wherever the heart should start (it will render above the camera slot because of its higher `zIndex`).
4. Create a custom image source and bind it to the heart slot.
5. Convert the heart image (a `UIImage` most likely) to a `CVPixelBuffer`, and then wrap that in a `CMSampleBuffer` (you can find guides online for this). The timestamp won't matter.
6. Create a `Timer` that fires 30 times per second; in the timer's function, submit the `CMSampleBuffer` with the heart image to the custom image source you created in step 4 (see the sketch after this list).
7. When you want the heart to animate, transition the heart slot to a new configuration that changes its `position.y` to something smaller so that it floats up. The SDK will handle moving the heart over the duration you provided.

Hopefully that helps.
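A rough sketch of steps 6 and 7, assuming a `heartSource` (`IVSCustomImageSource`) already bound to a slot named "heart", a `heartSample` prepared as in step 5, a `heartStart` point, and the mixer's slot-transition API as I understand it (verify the exact signature against your SDK version):

```swift
import AmazonIVSBroadcast

// Step 6 (sketch): resubmit the heart frame ~30x per second so the SDK
// keeps rendering it; invalidate the timer when the heart goes away.
let heartTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    heartSource.onSampleBuffer(heartSample)
}

// Step 7 (sketch): float the heart up by transitioning its slot to a new
// state; the SDK interpolates the slot's properties over `duration`.
func floatHeartUp(session: IVSBroadcastSession, from start: CGPoint) throws {
    let floated = IVSMixerSlotConfiguration()
    try floated.setName("heart")
    floated.zIndex = 1
    floated.position = CGPoint(x: start.x, y: start.y - 300)  // smaller y = up
    _ = session.mixer.transitionSlot(withName: "heart",
                                     toState: floated,
                                     duration: 1.5)           // animate over 1.5 s
}
```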
@bclymer Thank you so much! I get the idea of how it works!! Appreciate your support!!
Hi! Thank you for your hard work and I appreciate you all providing the amazing SDK for us!
Thanks for the support, I finished developing the basic broadcast part of my app using the SDK, and now I'm testing whether it's possible to add some effects (like particle effects, face filters, etc.), emojis/custom images, and BGM to the stream. To begin with, I'm trying to add emojis to the stream at the user-tapped location.
For example, when a user who is broadcasting with an iPhone taps the screen, a smiling face image (😊) is added to the stream, so other viewers can watch the streaming video with the image on it. Here is an image of what I mean...
I was looking at the sample code for the `MixerViewController` in the sample project, and while it's not exactly the same, the part that adds the logo while configuring a broadcast session seems similar to what I want to do. But are there some other ways to add custom images, particles, etc. on the stream? I think I can add images at a fixed position using the code above, but I want to place emojis at the tapped location. Also, I'm wondering, if there are 10 different emojis that the user can display, should those emojis be added to mixer slots when the broadcast session is configured?
Since I wanted to add some camera functions (like zooming in/out and taking pictures while streaming), I created a capture session using `AVFoundation`, like you have done in `CustomSourcesViewController`. I was able to display a custom image by adding a sublayer to a view, but the custom image wasn't added on the stream, so I guess that was not the right way to add images to the stream. So could you point me in the right direction for adding a custom image at the user-tapped location? Let me know if you need any extra information.