Closed sdgandhi closed 5 years ago
@sdgandhi Thanks for reaching out.
Our Quickstart sample app demonstrates the use of the H.264 codec. You can choose the video codec via Settings > Video Codec > H.264. Are you able to reproduce the problem with the Quickstart?
Hey @sdgandhi,
> When screen capturing AR from SceneKit using technique from sample app.
Is there any chance you are trying to capture at a resolution greater than 1280x720? This won't work; it is a current limitation of our iOS Client and Group Rooms.
https://www.twilio.com/docs/video/managing-codecs#muti-codecs-limitations-and-known-issues
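If the track is fed by a custom capturer, the sent resolution can also be capped on the sender side. A minimal sketch, assuming the Video iOS 2.x constraints API (`TVIVideoConstraints`, `TVILocalVideoTrack`); `capturer` stands in for your custom capturer, and names may differ in other SDK versions:

```swift
import TwilioVideo
import CoreMedia

// Sketch: cap the sent resolution at 1280x720 so the H.264 Group Rooms
// limitation is not exceeded. Verify the API names against your SDK version.
let constraints = TVIVideoConstraints { builder in
    builder.maxSize = CMVideoDimensions(width: 1280, height: 720)
    builder.maxFrameRate = 30
}
// `capturer` is assumed to be your TVIVideoCapturer implementation.
let track = TVILocalVideoTrack(capturer: capturer,
                               enabled: true,
                               constraints: constraints,
                               name: "arkit")
```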
Best, Chris
@piyushtank I cannot reproduce the problem with the Quickstart with the same phones.
@ceaglest Constraining the capture resolution to 1280x720 does not fix the problem either. Still getting black frames on the receiver's side.
In addition, the camera preview in my app works fine with H264. It's only the AR capture that sends black frames for H264.
Therefore, I have to surmise that there's an issue with the video data pipeline from AR capture -> H264. Do you guys have a working sample of this?
@sdgandhi Thanks for posting more information. I am assuming you are using a custom capturer, but it is hard to say where the problem is without looking at the code.
Is it possible for you to share code that reproduces the problem, or to modify the sample app so that it does?
We have an ARKitExample which demonstrates the custom capturer APIs and ARKit use with our Video SDK. Please note that ARKitExample has a known performance issue because it snapshots the ARSCNView; we are going to improve the example's performance in the future.
@piyushtank Yes, modifying the regular video quickstart by using the custom capturer from the ARKitExample will repro the issue.
@sdgandhi In order to debug the problem, can you share the code of the modified quickstart which reproduces the problem?
Has there been any update on this? We are seeing this exact issue in our app. We are using ARKit and feeding video to a custom capture consumer. If we set the codec to H264, we get no output (just black frames) from the sender. We can only initiate calls using VP8 on iOS, but this causes conditions where one participant will sometimes fail to connect to a room, or will connect and then immediately disconnect. We would prefer to use a hardware-accelerated codec. This is how we implement our capture consumer and initialize the ARKit stream:
```swift
func startCapture(_ format: TVIVideoFormat, consumer: TVIVideoCaptureConsumer) {
    self.captureConsumer = consumer
    self.displayLink = CADisplayLink(target: self, selector: #selector(self.displayLinkDidFire))
    self.displayLink?.preferredFramesPerSecond = self.sceneView.preferredFramesPerSecond
    self.displayLink?.add(to: .main, forMode: .commonModes)
    captureConsumer?.captureDidStart(true)
}

// Send the SceneKit view to a pixel buffer
@objc func displayLinkDidFire(timer: CADisplayLink) {
    let image = self.sceneView.snapshot()
    autoreleasepool {
        // snapshot() returns a UIImage; pass its backing CGImage along.
        let pixelBuffer = ImageProcessor.pixelBuffer(forImage: image.cgImage!)
        let aFrame = TVIVideoFrame(timeInterval: timer.timestamp, buffer: pixelBuffer!, orientation: TVIVideoOrientation.up)
        self.captureConsumer?.consumeCapturedFrame(aFrame!)
    }
}

// Processing the pixel buffer
struct ImageProcessor {
    static func pixelBuffer(forImage image: CGImage) -> CVPixelBuffer? {
        // let frameSize = CGSize(width: image.width, height: image.height)
        let frameSize = CGSize(width: 720, height: 1280)
        var pixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(frameSize.width), Int(frameSize.height), kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
        if status != kCVReturnSuccess {
            return nil
        }
        CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
        let data = CVPixelBufferGetBaseAddress(pixelBuffer!)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
        let context = CGContext(data: data, width: Int(frameSize.width), height: Int(frameSize.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer!), space: rgbColorSpace, bitmapInfo: bitmapInfo.rawValue)
        // context?.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
        context?.draw(image, in: CGRect(x: 0, y: 0, width: 720, height: 1280))
        CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
        return pixelBuffer
    }
}
```
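One thing worth checking with a pipeline like the one above: the hardware H.264 encoder generally expects IOSurface-backed pixel buffers, and `CVPixelBufferCreate` called with `nil` attributes does not request IOSurface backing. That mismatch could plausibly produce black frames under VideoToolbox (H.264) while the software VP8 path still works. A hedged variant of the creation call, assuming that is the cause here:

```swift
import CoreVideo

// Sketch: pass pixel buffer attributes so the buffer is IOSurface-backed.
// This is an assumption about the cause, not a confirmed fix.
let attributes: [CFString: Any] = [
    kCVPixelBufferIOSurfacePropertiesKey: [:],  // request IOSurface backing
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey: 720,
    kCVPixelBufferHeightKey: 1280
]
var pixelBuffer: CVPixelBuffer? = nil
let status = CVPixelBufferCreate(kCFAllocatorDefault, 720, 1280,
                                 kCVPixelFormatType_32BGRA,
                                 attributes as CFDictionary,
                                 &pixelBuffer)
```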
How we build a room:

```swift
let connectOptions = TVIConnectOptions.init(token: self.accessToken) { (builder) in
    // Use the local media that we prepared earlier.
    builder.audioTracks = self.localAudioTrack != nil ? [self.localAudioTrack!] : [TVILocalAudioTrack]()
    builder.videoTracks = self.localVideoTrack != nil ? [self.localVideoTrack!] : [TVILocalVideoTrack]()

    // Use the preferred audio codec
    if let preferredAudioCodec = Settings.shared.audioCodec {
        builder.preferredAudioCodecs = [preferredAudioCodec]
    }

    // Use the preferred video codec
    if let preferredVideoCodec = Settings.shared.videoCodec {
        builder.preferredVideoCodecs = [TVIH264Codec(), TVIVp8Codec(), TVIVp9Codec()]
    }

    // Use the preferred encoding parameters
    if let encodingParameters = Settings.shared.getEncodingParameters() {
        builder.encodingParameters = encodingParameters
    }

    // The name of the Room the Client will attempt to connect to. Please note that if you pass an empty
    // Room `name`, the Client will create one for you. You can get the name or sid from any connected Room.
    builder.roomName = roomName

    // The CallKit UUID to associate with this Room.
    builder.uuid = uuid
    print("Built room", uuid)
}
```
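As an aside, the `if let preferredVideoCodec` binding in the room setup above is never used: the builder always receives the hard-coded list `[TVIH264Codec(), TVIVp8Codec(), TVIVp9Codec()]`. If the intent is to honor the selected setting (mirroring the audio branch), a sketch:

```swift
// Sketch: honor the codec chosen in Settings instead of a hard-coded list.
if let preferredVideoCodec = Settings.shared.videoCodec {
    builder.preferredVideoCodecs = [preferredVideoCodec]
}
```

This would also make it easier to A/B the H264-vs-VP8 behavior from the settings screen while debugging.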
@cspecter Thanks for reaching out. Apologies for the delay; some of our team members are on vacation at present.
I will try to reproduce the problem tomorrow during PST daytime using our ARKitExample sample app. In the meantime, can you provide the following information to help debug the problem:
Sure thing. We are testing on an iPhone X and an XS Max, both on iOS 12.1.1. We are also on version 2.5.6 of TwilioVideo. We have not switched to 2.6, mostly because of a lack of examples. It looks a bit different, so maybe the behavior will differ?
@cspecter Thanks for the info. Can you share the Room SID from a session where you run into the problem?
We have added an escalation ticket to our current sprint for this issue. We are going to try to reproduce this problem and get back to you soon. I will keep you posted.
Hi,
We tried to reproduce this issue locally, but were unable to. Please reopen if you can provide a Room SID that we can diagnose.
Thanks, Chris
Description
Using the TVIH264Codec causes black frames on the receiver's end when screen capturing AR from SceneKit using the technique from the sample app. Without preferring the H264 codec, the frames are transferred correctly.
Steps to Reproduce
Code
Expected Behavior
Preferring the TVIH264Codec should send correct frames.
Actual Behavior
Receiver gets black frames.
Reproduces How Often
100%
Logs
I'm seeing a few of these in the debug log on the sender side.
And here's a sip message that includes the part specifying h264 (on the sender side).
Versions
Video iOS SDK
TwilioVideo 2.1.0 via CocoaPods
Xcode
9.4
iOS Version
Sender: 11.3.1 Receiver: 11.4
iOS Device
Sender: iPhone X Receiver: iPhone 6s