aws-samples / amazon-ivs-broadcast-ios-sample

MIT No Attribution

Add emojis to the stream at the user-tapped location #27

Closed yuya-h-29 closed 2 years ago

yuya-h-29 commented 2 years ago

Hi! Thank you for your hard work and I appreciate you all providing the amazing SDK for us!

Thanks for the support. I finished developing the basic broadcast part of my app using the SDK, and now I'm testing whether it's possible to add effects (like particle effects, face filters, etc.), emojis / custom images, and BGM to the stream. To begin with, I'm trying to add emojis to the stream at the location the user taps.

For example, when a user who is broadcasting with an iPhone taps the screen, a smiling face image (😊) is added to the stream, so other viewers can watch the streaming video with the emoji on it. Here is an image of what I mean...

(attached image: broadcastimage)

I was looking at the MixerViewController sample code in the sample project, and while it's not exactly the same, the part that adds the logo while configuring a broadcast session seems similar to what I want to do.

// Slot for the logo: 1:1 aspect ratio, pinned to the bottom-right, semi-transparent.
logoSlot = IVSMixerSlotConfiguration()
logoSlot.size = CGSize(width: MixerGuide.smallSize.height, height: MixerGuide.smallSize.height) // 1:1 aspect ratio
logoSlot.position = CGPoint(x: MixerGuide.bigSize.width - MixerGuide.smallSize.height - MixerGuide.borderWidth, y: MixerGuide.smallPositionBottomRight.y)
logoSlot.preferredVideoInput = .userImage
logoSlot.zIndex = 3
try logoSlot.setTransparency(0.7)
try logoSlot.setName("logo")

But are there other ways to add custom images, particles, etc. to the stream? I think I can add images at a fixed position using the code above, but I want to place emojis at the tapped location. Also, if there are 10 different emojis the user can display, do all of those emojis need to be added to mixer slots when the broadcast session is configured?

Since I wanted to add some camera functions (like zoom in/out and taking pictures while streaming), I created a capture session using AVFoundation, like you did in CustomSourcesViewController. I was able to display a custom image by adding a sublayer to a view, but the custom image wasn't added to the stream, so I guess that was not the right way to add images to the stream.

// Scale the image to the video width while preserving its aspect ratio,
// then draw it in a CALayer on top of the local camera preview.
let image = UIImage(named: "test_image")!
let imageLayer = CALayer()
let aspect: CGFloat = image.size.width / image.size.height
let width = videoSize.width
let height = width / aspect
imageLayer.frame = CGRect(
x: 0,
y: -height * 0.15,
width: width,
height: height)
imageLayer.contents = image.cgImage
// Note: this only affects the local view hierarchy, not the broadcast output.
liveCameraView.layer.addSublayer(imageLayer)

So could you point me in the right direction for adding a custom image at the user-tapped location? Let me know if you need any more information.

bclymer commented 2 years ago

Hi @yuya-h-29, sorry for the delay. I'll try to answer all of your questions.

But are there other ways to add custom images, particles, etc. to the stream?

There are 2 ways to add multiple layers to the stream.

  1. Through adding more slots to the mixer, like you tried with the logoSlot.
  2. Rendering them yourself using Metal or CoreImage/CoreVideo and sending the rendered image to the SDK as a custom image source.

Also, if there are 10 different emojis the user can display, do all of those emojis need to be added to mixer slots when the broadcast session is configured?

The mixer can be modified at runtime. You can add/remove slots and bind/unbind devices at any time after the creation of the IVSBroadcastSession.
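
For example, a minimal sketch of runtime mixer changes, assuming `session` is an existing IVSBroadcastSession; the "overlay" slot and source names are illustrative:

import AmazonIVSBroadcast

func showOverlay(on session: IVSBroadcastSession) throws {
    // Add a new slot while the session is live.
    let overlaySlot = IVSMixerSlotConfiguration()
    try overlaySlot.setName("overlay")
    overlaySlot.preferredVideoInput = .userImage
    overlaySlot.zIndex = 5
    session.mixer.addSlot(overlaySlot)

    // Create a custom image source and bind it to the new slot.
    let overlaySource = session.createImageSource(withName: "overlaySource")
    session.attach(overlaySource, toSlotWithName: "overlay")
}

// Later, to unbind and remove the overlay:
// session.detach(overlaySource)
// session.mixer.removeSlot(withName: "overlay")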

But the custom image wasn't added to the stream, so I guess that was not the right way to add images to the stream.

In that scenario you are only adding your image to the UI hierarchy; you are not submitting it to the SDK. The liveCameraView object has nothing to do with what the SDK broadcasts. You can broadcast the camera's feed without any preview at all, as long as the CMSampleBuffers are still being sent to the SDK.

So could you point me in the right direction for adding a custom image at the user-tapped location?

We do not have sample code available for this scenario, but dynamically using the mixer APIs (add, remove, bind, unbind) is the correct approach. You would do something like the following (a sketch follows the list):

  1. Create a new mixer slot at the touch position.
     a. Make sure to set the zIndex to something greater than your other slots.
  2. Create a new custom image source.
  3. Bind the custom image source to your new mixer slot.
  4. Convert your image (UIImage, CIImage, CGImage, etc.) into a CVPixelBuffer, wrap that in a CMSampleBuffer, and then submit it to the custom image source.
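
Putting those four steps together, here's a minimal sketch, assuming a running IVSBroadcastSession called `session`; the slot/source names and the makeSampleBuffer(from:) helper are illustrative, not SDK APIs:

import AmazonIVSBroadcast
import CoreMedia
import CoreVideo
import UIKit

// Steps 1-4: create a slot at the tap position, bind a custom image
// source to it, and submit one frame containing the emoji image.
func addEmoji(_ image: UIImage, at point: CGPoint, to session: IVSBroadcastSession) throws {
    // 1. A slot at the touch position, above the camera slot (zIndex 0).
    let slot = IVSMixerSlotConfiguration()
    slot.size = CGSize(width: 64, height: 64)
    slot.position = point
    slot.zIndex = 10
    slot.preferredVideoInput = .userImage
    try slot.setName("emoji")
    session.mixer.addSlot(slot)

    // 2 + 3. A custom image source bound to the new slot.
    let source = session.createImageSource(withName: "emojiSource")
    session.attach(source, toSlotWithName: "emoji")

    // 4. UIImage -> CVPixelBuffer -> CMSampleBuffer -> submit.
    if let sample = makeSampleBuffer(from: image) {
        source.onSampleBuffer(sample)
    }
}

// Hypothetical helper: draws a UIImage into a BGRA CVPixelBuffer and wraps
// it in a CMSampleBuffer. The timestamp doesn't matter for a static image.
func makeSampleBuffer(from image: UIImage) -> CMSampleBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height

    var pixelBufferOut: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA, attrs, &pixelBufferOut) == kCVReturnSuccess,
          let pixelBuffer = pixelBufferOut else { return nil }

    // Draw the image into the pixel buffer's memory.
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                            width: width, height: height, bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                        CGBitmapInfo.byteOrder32Little.rawValue)
    context?.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

    // Wrap the pixel buffer in a CMSampleBuffer.
    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &format)
    guard let formatDescription = format else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: formatDescription,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}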

That should result in the broadcast stream showing your image. In order to see the preview locally, you'll need to use the preview view returned by IVSBroadcastSession.previewView instead of the preview for your camera (available on IVSImageDevice.previewView). Hopefully that helps some.
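
For the local preview, a small sketch of that swap (inside a view controller, assuming the same `session`):

// Show the session's mixed output (camera + slots) instead of the raw
// camera preview. Call from a throwing context, e.g. a do/catch block.
let preview = try session.previewView(aspectMode: .fit)
preview.frame = view.bounds
view.addSubview(preview)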

yuya-h-29 commented 2 years ago

@bclymer No worries and thank you so much for answering my questions!! I always appreciate your kind support...🥹 I was able to display my test .png image on the stream and the camera preview (I used IVSBroadcastSession.previewView as you said. Thank you!!) by adding a new slot :)

The static PNG image was displayed on the broadcast by submitting the CMSampleBuffer via the IVSCustomImageSource.onSampleBuffer method once. Now I'm trying to add a custom UIView with some animations to the broadcast using the same approach (create a new mixer slot, bind a custom source to it, and submit the sample buffer), but since the UIView contains animations, do I need to call IVSCustomImageSource.onSampleBuffer every time a new video frame is written (like in captureOutput(_:didOutput:from:))?

I guess that when adding a static image to the broadcast, calling IVSCustomImageSource.onSampleBuffer once is enough, but if it includes animation, onSampleBuffer should be called every frame. Am I right?

bclymer commented 2 years ago

@yuya-h-29 that is correct, anything dynamic is going to require you to resubmit the sample every time you want the SDK to apply an update. Otherwise, I'm happy it's all working for you!

If everything is resolved, feel free to close this issue. Otherwise, I'll circle back in about a month to close it if there's no further discussion.

yuya-h-29 commented 2 years ago

@bclymer Gotcha! I'll try resubmitting sample buffers then! Thank you as always!!

yuya-h-29 commented 2 years ago

@bclymer So sorry, I reopened this issue 😓

Since I figured out how to add a static image to the broadcast, I'm now trying to add an animation to the broadcast. (The animation is like the floating hearts in Instagram Live, and I'm thinking of creating it with UIView animation at first.) Since the UIView has animation, I believe the process would be:

  1. Get a CMSampleBuffer from captureOutput(_:didOutput:from:).
  2. Draw the animated UIView into the image buffer somehow (I haven't figured out how to achieve this yet) and produce a new CMSampleBuffer (maybe using CIContext?).
  3. Submit the newly created sample buffer to the broadcast SDK.

As you instructed me before, anything dynamic needs its sample buffer resubmitted every time, so I think I need to do something like the above to show the animated view on the broadcast. But the thing is, I'm not sure this is the right way to add an animated view (or even whether using a UIView is the right approach at all).

So my question is what approach you would take if you need to add an animated view (like something simple as floating heart animation) to the broadcast?

bclymer commented 2 years ago

If I were trying to accomplish that, I would do the following (a sketch of steps 6 and 7 follows the list):

  1. Create 2 slots: the first for the camera with a zIndex of 0, and another for your heart animation with a zIndex of 1.
  2. Make sure your heart asset is transparent.
  3. In your IVSVideoConfiguration make sure enableTransparency is set to true.
  4. Bind your camera to the first slot, then create a custom image source and bind that to the 2nd slot (with the higher zIndex).
  5. Convert your heart image (most likely a UIImage) to a CVPixelBuffer, and then wrap that in a CMSampleBuffer (you can find guides online for this). The timestamp won't matter.
  6. Create a Timer that fires 30 times per second; in the timer's callback, submit the CMSampleBuffer with the heart image to the custom image source you created in step 4.
  7. Use the IVSBroadcastMixer.transition API to transition your slot over time. For example, change position.y to something smaller so that the heart floats up. The SDK will handle moving the heart over the duration you provide.
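
A sketch of steps 6 and 7, reusing the hypothetical makeSampleBuffer(from:) helper from the earlier snippet; the "heart" slot name, sizes, and end position are illustrative, and the transition call is a sketch of the mixer's transition API mentioned above:

import AmazonIVSBroadcast
import UIKit

var heartTimer: Timer?

func animateHeart(_ heartImage: UIImage, source: IVSCustomImageSource,
                  session: IVSBroadcastSession) {
    // 6. Resubmit the same frame 30 times per second so the SDK keeps
    // rendering the slot while it moves.
    let sample = makeSampleBuffer(from: heartImage)
    heartTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
        if let sample = sample { source.onSampleBuffer(sample) }
    }

    // 7. Transition the slot to a new state over 2 seconds; a smaller
    // position.y floats the heart toward the top of the frame.
    let endState = IVSMixerSlotConfiguration()
    endState.size = CGSize(width: 64, height: 64)
    endState.position = CGPoint(x: 160, y: 40)
    endState.zIndex = 1
    try? endState.setName("heart")
    session.mixer.transitionSlot(withName: "heart", toState: endState, duration: 2.0)
}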

Hopefully that helps.

yuya-h-29 commented 2 years ago

@bclymer Thank you so much! I get the idea of how it works!! Appreciate your support!!