dmrschmidt / DSWaveformImage

Generate waveform images from audio files on iOS, macOS & visionOS in Swift. Native SwiftUI & UIKit views.
MIT License

Background thread rendering #5

Closed AliMunchkin closed 5 years ago

AliMunchkin commented 6 years ago

I wonder if this (awesome!) library could be modified so the waveform is generated on a background thread (then of course displayed on the main thread)? I've had a play dropping some of the processing onto a global queue but could not get it working.

I would love to use this in a DAW style app but require 'on the fly' updating of waveforms (recording functionality etc.). Doesn't matter if the waveforms take a little while to display, but cannot hold up the main thread to maintain UI responsiveness.

dmrschmidt commented 5 years ago

Hey, sorry for the late reply. I haven't been paying much attention to GitHub recently.

It's a very good point you're bringing up. I'll look into it more closely when I find some free time and integrate / document this better.

After having read up on some of the Apple docs, it looks like the drawing should actually be thread-safe, so you can call it from a background thread.

So slightly modifying the examples from DSWaveFormImageExample/ViewController.swift, you can do this:

let bounds = middleWaveformView.bounds
DispatchQueue.global(qos: .userInteractive).async {
    let bottomWaveformImage = waveformImageDrawer.waveformImage(fromAudioAt: audioURL,
                                                                size: bounds.size,
                                                                color: UIColor.blue,
                                                                backgroundColor: UIColor.lightGray,
                                                                style: .filled,
                                                                position: .custom(0.9),
                                                                paddingFactor: 5.0)
    DispatchQueue.main.async {
        self.bottomWaveformView.image = bottomWaveformImage
    }
}

or

let waveform = Waveform(audioAssetURL: audioURL)!
let configuration = WaveformConfiguration(size: lastWaveformView.bounds.size,
                                          color: UIColor.blue,
                                          style: .striped,
                                          position: .bottom)

DispatchQueue.global(qos: .userInteractive).async {
    let image = UIImage(waveform: waveform, configuration: configuration)
    DispatchQueue.main.async {
        self.lastWaveformView.image = image
    }
}

respectively. Using DSWaveformImage directly, however, would require actual code changes to the library itself. In your case, though, it sounds like you'd want to use the UIImage category or the WaveformImageDrawer directly anyway, given the increased amount of control they offer.
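For the DAW-style "on the fly" use case from the original question, the same background-queue pattern can be wrapped in a small controller that re-renders whenever the recording file grows, dropping requests while one render is in flight so the queue never backs up. This is only a sketch: the controller, its names, and the drop-while-busy throttling are hypothetical, and the `waveformImage(fromAudioAt:...)` parameters follow the examples above but may differ between library versions.

```swift
import UIKit

final class RecordingWaveformController {
    private let waveformImageDrawer = WaveformImageDrawer()
    private let renderQueue = DispatchQueue(label: "waveform.render", qos: .userInitiated)
    private var isRendering = false

    weak var waveformView: UIImageView?

    /// Re-renders the waveform for the (still growing) recording at `audioURL`.
    /// Call this periodically while recording, e.g. from a timer or an
    /// audio-buffer callback; redundant calls are dropped while a render runs.
    func refreshWaveform(for audioURL: URL) {
        guard let view = waveformView, !isRendering else { return }
        isRendering = true
        let size = view.bounds.size // read on the main thread, before hopping queues

        renderQueue.async { [weak self] in
            // Parameters as in the examples above; exact signature is version-dependent.
            let image = self?.waveformImageDrawer.waveformImage(fromAudioAt: audioURL,
                                                                size: size,
                                                                color: UIColor.blue,
                                                                style: .striped,
                                                                position: .middle)
            DispatchQueue.main.async {
                self?.waveformView?.image = image
                self?.isRendering = false
            }
        }
    }
}
```

The drop-while-busy flag trades update frequency for responsiveness: the UI always shows a slightly stale waveform rather than queueing up renders faster than they complete.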

AliMunchkin commented 5 years ago

Thanks so much for picking this up and providing some example code, I really appreciate it :)

I hope to use this in an upcoming project and will let you know how I get on.

dmrschmidt commented 5 years ago

Sure thing! If you end up using it - or simply exploring it a bit - I appreciate every bit of feedback :) As the core of your original issue can be addressed and the README is now explaining it too, I’ll consider this closed for now. Will still get back to supporting this more out of the box at some later point.