LaiFengiOS / LFLiveKit

LaiFeng iOS Live Kit: H.264 and AAC hardware encoding, GPUImage beauty filter support, RTMP transmission, frame dropping on weak networks, and dynamic bitrate switching.
MIT License

Audio is not audible on Facebook live stream after converting PCM buffer to Data and passing it to pushAudio(data) #338

Open hitesh3195 opened 4 years ago

hitesh3195 commented 4 years ago

I am trying to push an audio file from the project's bundle (raw or m4a format) to a Facebook live stream, but I am unable to get any audio on Facebook. I convert the file into a PCM buffer, then to Data, and pass that to pushAudio. Please tell me if I am missing something or doing anything wrong.
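For reference, a minimal sketch of the flow described above: bundled file -> AVAudioPCMBuffer -> Data -> pushAudio(_:). The helper name pushBundledAudio is hypothetical, `session` is assumed to be an LFLiveSession set up for external audio input, and the PCM format (sample rate, channel count) is assumed to match the session's LFLiveAudioConfiguration; a mismatch there is a common cause of silent streams.

    import AVFoundation

    // Hypothetical helper, not LFLiveKit API: decode a bundled file to
    // interleaved 16-bit PCM and push the raw bytes to the live session.
    func pushBundledAudio(fileURL: URL, session: LFLiveSession) throws {
        let file = try AVAudioFile(forReading: fileURL,
                                   commonFormat: .pcmFormatInt16,
                                   interleaved: true)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: AVAudioFrameCount(file.length)) else { return }
        try file.read(into: buffer)

        // For an interleaved int16 buffer, int16ChannelData[0] points at
        // the interleaved samples for all channels.
        guard let samples = buffer.int16ChannelData else { return }
        let channels = Int(file.processingFormat.channelCount)
        let byteCount = Int(buffer.frameLength) * channels * MemoryLayout<Int16>.size

        // Wrap the raw PCM bytes in Data and hand them to LFLiveKit.
        let data = Data(bytes: samples[0], count: byteCount)
        session.pushAudio(data)
    }

In practice the buffer would be pushed in small chunks paced to real time rather than in one call; the snippet further down this thread does exactly that with AVAssetReader.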

harsh12312 commented 4 years ago

Facing the same issue; I can't seem to stream an mp3 file from the device or from a URL. Kindly share your code if it works for you, thanks in advance.

Tried the code below so far; sometimes it plays the song and other times nothing happens. The stream also gets stuck for a bit. Hoping someone can help work it out.

EDIT: Got it working, code below.

import AVFoundation
import CoreMedia

let queue = DispatchQueue(label: "Timer DispatchQueue",
                          qos: .background,
                          attributes: .concurrent,
                          autoreleaseFrequency: .workItem,
                          target: nil)

// `session` is the LFLiveSession property used for streaming.
func loopAmplitudes(audioFileUrl: URL) {

    queue.async { [unowned self] in

        // Decode the file to uncompressed 16-bit mono linear PCM.
        let asset = AVAsset(url: audioFileUrl)
        guard let reader = try? AVAssetReader(asset: asset),
              let track = asset.tracks(withMediaType: .audio).first else { return }

        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMBitDepthKey: 16,
            // For audio tracks, naturalTimeScale is normally the sample rate.
            AVSampleRateKey: track.naturalTimeScale,
            AVLinearPCMIsNonInterleaved: false,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMIsBigEndianKey: false
        ]

        let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
        reader.add(readerOutput)
        reader.startReading()

        while let sampleBuffer = readerOutput.copyNextSampleBuffer() {

            // Expose the sample buffer's data as an AudioBufferList so the
            // raw PCM bytes can be copied out.
            var audioBufferList = AudioBufferList()
            var blockBuffer: CMBlockBuffer?

            CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
                sampleBuffer,
                bufferListSizeNeededOut: nil,
                bufferListOut: &audioBufferList,
                bufferListSize: MemoryLayout.size(ofValue: audioBufferList),
                blockBufferAllocator: nil,
                blockBufferMemoryAllocator: nil,
                flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                blockBufferOut: &blockBuffer)

            let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)

            for audioBuffer in buffers {
                guard let mData = audioBuffer.mData else { continue }

                // Wrap the raw PCM bytes in Data and hand them to LFLiveKit.
                let audio = mData.assumingMemoryBound(to: UInt8.self)
                let newdata = Data(bytes: audio, count: Int(audioBuffer.mDataByteSize))

                self.session.pushAudio(newdata)

                // Crude pacing so buffers are not pushed faster than real time.
                Thread.sleep(forTimeInterval: 0.1)
            }
        }
    }
}
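One note on the fixed Thread.sleep(forTimeInterval: 0.1): it only approximates real time, which would explain the intermittent stalls mentioned above. A sketch of pacing derived from the buffer itself, under the settings used in the snippet (16-bit mono, so 2 bytes per frame; the sample rate comes from track.naturalTimeScale):

    // Sketch: replace the fixed 0.1 s sleep inside the for-loop with the
    // buffer's actual duration (assumes the 16-bit mono settings above).
    let bytesPerFrame = 2.0                                   // 16-bit * 1 channel
    let sampleRate = Double(track.naturalTimeScale)           // from the reader settings
    let bufferSeconds = Double(audioBuffer.mDataByteSize) / (bytesPerFrame * sampleRate)
    Thread.sleep(forTimeInterval: bufferSeconds)

This keeps pushAudio roughly in step with the playback rate instead of over- or under-feeding the encoder.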